.Open Automata and System Theory
. Introduction
Rather than the traditional computer science approach to
automata theory (start with a tuple (Q, d, s0, F, ...))
I prefer to derive automata from simpler
and more general models. The key idea is that of $State.
To this we add the rules by which states change. The most general
one I've seen is the Labelled Transition System ($LTS).
A simple model is due to W Ross Ashby. I call it a
$Mechanism: it has states and transitions that may occur
and change the states.
Each state helps determine the
next state. There are a set of valid transitions. Each defines
a map from the current state to the next one.
A deterministic mechanism has a single transition function
determining a unique next state when given the current state.
The next complication (again following
W Ross Ashby) is the State+Input+Transition+Output system or $SITO.
Here the system receives a string of inputs and produces a string
of outputs. The next state is determined by the current input and
state. The output depends solely on the current state and input.
$Automata are rather like $SITO systems except that they have
no output. Instead they classify the input sequence as acceptable or
not. When the system enters one of a special set of states the input
is accepted. These are useful in the theory of languages. $Transducers
(again Ross Ashby's term) have input, state, and output, but the
output depends solely on the current state. When the transducer has
only a finite set of states, inputs, and outputs it is
equivalent to a Moore machine in computer science textbooks.
.See General Systems Theory
is another approach to dynamic modelling.
All these models so far allow the number of states to be infinite
and to have any structure.
Classical models: $FSA, $TM, $PDA etc limit these.
I plan some notes on a structured model of automata and
a section will be added on $Buchi Automata that model
properties that hold on infinite input strings.
In 1992, the use of message sequence charts to specify interactions
between parts of a system or protocols became popular. I'm including a section
($Protocols) on a possible syntax with a couple of alternative forms.
. Basis
Automata_and_System_Theory::=Net{
|-$States.
|-$Mechanism.
|-$SITO.
|-$Transducers.
|-
.See General Systems Theory.
|-$Automata.
}=::Automata_and_System_Theory.
. See Also
Unary::=http://www.csci.csusb.edu/dick/maths/math_15_Unary_Algebra.html#Unary.
FFD::=http://www.csci.csusb.edu/dick/maths/math_23_Flow_Diagrams.html#FFD.
digraph::=http://www.csci.csusb.edu/dick/maths/math_22_graphs.html#digraph.
monoid_generated_by::=http://www.csci.csusb.edu/dick/maths/math_33_Monoids.html#monoid_generated_by.
. State
A system, machine, automaton, device, component, or object is typically in
one of a number of states at any given time. The set of States determines
the power of the machine, system, automaton: more states give more power.
States::=following,
.Net
State::Sets=`given`, The traditional short hand is to use Q for State.
For q:State.
finite::@= finite(State).
.Close.Net States
Notice that we can construct a set of states by combining other sets: for
example we can imagine a finite program with 3 counters:
State = {q0,q1,q2,q3,q4} >< (Nat0^3).
We can add special states (an initial state for example) and special
subsets of a set of States (final states for example).
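For instance, such a compound state set is easy to build concretely. A minimal Python sketch (the names, and the truncation of Nat0 to 0..2, are my illustration, not part of MATHS):

```python
from itertools import product

# Hypothetical sketch: compound states as Cartesian products,
# e.g. State = {q0,...,q4} >< Nat0^3 (here Nat0 truncated to 0..2).
control = ["q0", "q1", "q2", "q3", "q4"]
counters = list(product(range(3), repeat=3))   # finite slice of Nat0^3
State = [(q, c) for q in control for c in counters]

initial = ("q0", (0, 0, 0))                    # a distinguished initial state
assert initial in State
print(len(State))                              # 5 * 27 = 135
```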
. Labeled Transition Systems
The most general model: a set of states and some labels. Each label
may change that state. Some changes happen with no visible cause or effect
and these are labelled \tau.
LTS::= following,
.Net
|- $States.
Label::Sets=`given`
invisible_action::$Label, also known as a silent action.
\tau::=invisible_action.
Action::=$Label ~{\tau}.
finite::@= finite(State) and finite(Label).
Transition_relations::Label->@(State, State)=`given`.
initial::$State.
T:=Transition_relations, local short hand.
For Label l, T(l) in @(State, State), is a relation,
so expressions like T(l1);T(l2);T(l3) express sequences of changes.
Possible_changes::@(State, State)= |Transition_relations.
For s:State, Fail_in_state(s)::={l:Label. no s':State(s T(l) s')}.
For s:State, Forbidden(s)::={l:Label. no s':State(s T(l) s')}.
For s:State, Undefined(s)::={l:Label. all s':State(s T(l) s')}.
Failures:: @(#Label >< @Label)= { l:#Label, f:@Label. for some s ( initial ;[l1:l]T(l1) s and f in Fail_in_state(s) )}.
.Hole
.Close.Net LTS
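The LTS net above has a direct computational reading. A hedged Python sketch (the particular states, labels, and relations are illustrative choices of mine):

```python
# Minimal sketch of a labelled transition system: each label indexes a
# relation on states, and sequences of labels compose as relations,
# T(l1);T(l2). The example data is invented.
State = {0, 1}
T = {
    "a":   {(0, 1)},          # 'a' moves 0 to 1
    "tau": {(0, 0), (1, 1)},  # the invisible action changes nothing here
}

def compose(r, s):
    """Relational composition r;s."""
    return {(x, z) for (x, y1) in r for (y2, z) in s if y1 == y2}

def fail_in_state(s):
    """Labels with no transition out of state s (Fail_in_state above)."""
    return {l for l, r in T.items() if not any(x == s for (x, _) in r)}

print(compose(T["a"], T["tau"]))   # {(0, 1)}
print(fail_in_state(1))            # {'a'}
```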
. Basic Mechanism
Mechanism::=Net{
|- $States.
Transitions::@(State^State)=`given`.
Starting with Next:State->@State is another possibility.
.Table Example Mechanism 1
.Row State 0 1
.Row Transition[1] 0+>1 1+>0
.Row Transition[2] 0+>0 1+>1
.Close.Table
.Table Example Mechanism 2
.Row State Nat0
.Row Transition[1] s+>2*s
.Row Transition[2] s+>2*s+1
.Close.Table
deterministic::@= |Transitions|=1.
()|-for all t:Transitions( (State, t) in $Unary).
can_change_to::@(State,State)=rel [a,b](for some t:Transitions(b=t(a))).
|~ ::= can_change_to.
Next::=map[s:State]{s'. s can_change_to s'}.
\delta:=Next.
()|-$FFD(Parts=>State, Flow=>(@,or,false), flows=>can_change_to).
()|-$digraph(Nodes=>State, Arcs=>can_change_to).
Monoid::=$monoid_generated_by(Transitions, (;)).
}=::Mechanism.
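The two example mechanisms above can be sketched directly; this Python fragment is an illustration only, not a fixed API:

```python
# Sketch of Ashby-style mechanisms: each transition is a map from the
# current state to a next state.
# Example Mechanism 1: two transitions on states {0, 1}.
flip = lambda s: 1 - s          # 0+>1, 1+>0
same = lambda s: s              # 0+>0, 1+>1
transitions1 = [flip, same]

# Example Mechanism 2: states Nat0, transitions s+>2*s and s+>2*s+1.
transitions2 = [lambda s: 2 * s, lambda s: 2 * s + 1]

def next_states(transitions, s):
    """Next(s): the set of states some transition can change s to."""
    return {t(s) for t in transitions}

print(next_states(transitions1, 0))   # {0, 1}
print(next_states(transitions2, 3))   # {6, 7}

# A mechanism is deterministic when it has exactly one transition.
deterministic = len(transitions1) == 1   # False for both examples
```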
Transition_System::= $Mechanism and Net{ initial : State ; }.
SITO::=Net{
State and input determine next state and output.
State::Sets=`given`,
S:=State.
Input::Sets=`given`,
I:=Input.
Transition::(S^S)^I=`given`,
d:=Transition,
Output::Sets=`given`,
O:=Output,
output_map::(O^S)^I=`given`,
g:=output_map.
There has been an extensive study of the properties of finite systems:
finite::@.
|-finite = finite(State) and finite(Input) and finite(Output).
We can display a finite SITO in a table with one column per state,
one row per input. The item in row `i` and column `s` shows d(i)(s)/g(i)(s).
.Table Example SITO 1
.Row Input\State 0 1
.Row 1 1/0 0/1
.Row 0 0/0 1/1
.Close.Table
()|-$Mechanism(State=>State, Transitions=>img(Transition)).
This implicitly defines $can_change_to, $Monoid, etc. in $Mechanism.
}=::SITO.
State_determined_system::=SITO.
. Complex and Compound State Determined Systems
Commonly small $SITO are expressed by tables. But there are two alternatives:
Dynamic predicates and composition. A dynamic predicate that contains an input
s' = Bit(s<>i),
(for example) can express a transition relation or function and the output
can be expressed as a simple equation:
o = s.
The above two equations describe the example above. This technique is very
useful when the states, inputs and outputs are numbers, vectors, or
other values in known algebraic systems.
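As a worked instance, Example SITO 1 above is exactly the machine given by the dynamic predicate s' = Bit(s<>i) and the output equation o = s. A Python sketch (function names are mine):

```python
# Hypothetical sketch of Example SITO 1: next state s' = s XOR i,
# output o = the old state s.
def step(s, i):
    """Return (next_state, output) for one input symbol."""
    return (s ^ i, s)

def run(s, inputs):
    """Feed a string of inputs; collect the string of outputs."""
    outputs = []
    for i in inputs:
        s, o = step(s, i)
        outputs.append(o)
    return s, outputs

final, outs = run(0, [1, 1, 0, 1])
print(final, outs)    # 1 [0, 1, 0, 0]: the state toggles on each 1
```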
Because the states, inputs, and outputs are defined to be sets we can use set expressions to describe complex systems. For example if there are two inputs we would define
the input set as the Cartesian product of two simpler sets. Similarly with multiple
outputs. When the state is expressed as a Cartesian product as well we are
expressing the system as a composition of simpler machines. We can then add
constraints to
.Key couple
the inputs and outputs...
.Hole Composition of Systems
. Transducers
Ross Ashby defines a transducer as a $SITO where the output function
depends on the state only:
TRANSDUCER::=$SITO with following
.Net
for all i0,i1:Input ( output_map(i0)=output_map(i1) ).
.Close.Net
. Mealy Machines
MEALY::=$SITO and finite.
.See http://en.wikipedia.org/wiki/Mealy_machine
. Moore Machines
MOORE::=$TRANSDUCER and finite.
.See http://en.wikipedia.org/wiki/Moore_machine
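The difference between the two classes can be stated as a checkable property of the output map. A small illustrative Python sketch (binary states and inputs are my assumption):

```python
# Sketch of the transducer restriction: a Moore-style output map ignores
# the input, while a general SITO/Mealy output map may use it.
def mealy_out(s, i):   # output may depend on both state and input
    return (s + i) % 2

def moore_out(s, i):   # transducer/Moore output: the input is ignored
    return s

# The defining property: for all i0, i1, output_map(i0) = output_map(i1).
is_transducer = all(moore_out(s, 0) == moore_out(s, 1) for s in (0, 1))
print(is_transducer)                                            # True
print(all(mealy_out(s, 0) == mealy_out(s, 1) for s in (0, 1)))  # False
```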
.Open General Systems Theory
A General dynamic system can be modelled using the calculus of relations:
.See http://www.csci.csusb.edu/dick/maths/logic_41_HomogenRelations.html#Paths and Trajectories
for example. Another approach is via a morphism from a group of Times
into changes of state, or via a one-many map indicating the possible next states.
General_dynamic_system::=following,
.Net
|- (st): $additive_group(Time).
State::Sets.
For t:Time, \delta^(t) ::@(State, State).
|- (s+): for all Time t1,t2, \delta^(t1+t2) = \delta^(t1); \delta^(t2).
|- (s0): \delta^(0) = Id.
|- (s-): for all Time t1, \delta^(-t1) = /(\delta^(t1)).
.Close.Net General_dynamic_system
additive_group::=http://www.csci.csusb.edu/dick/maths/math_34_Groups.html#Special groups.
Discrete_Dynamic_System::=following,
.Net
State::Sets=`given`.
\delta::State->@State=`given`.
finite::@,
|-finite = finite(State).
deterministic::@,
|-deterministic = for all s:State, one \delta(s).
nondeterministic::@=not deterministic.
infinite::=not finite.
Finite discrete dynamic systems have been used to model complex
software. There exist tools that will test whether or not a
given model has desirable properties and so fits project requirements.
These have been used with some success in industry. It is possible
to express the key ideas with the theory presented here:
Suppose we are interested in a property of states; we can
model it as the subset P of states where it holds:
For P:@State.
For all P, AX(P) ::@State={s. \delta(s) ==> P}, -- In the next step P must be true.
For all P, EX(P) ::@State={s. \delta(s) & P}, -- In the next step P can be true.
()|-EX(P) = / \delta(P).
Notice that AX(P) and EX(P) are defined to be sets of States and so
properties. So we can write AX AX(P) -- P must be true in two steps.
EX AX P -- the system can enter a state where P must become true.
From these we can define the six modalities used in Computation
Tree Logic:
AG(P)::= P & AX( AG(P)), states where P is always true.
EG(P)::= P & EX( EG(P)), states where there is a path where P is always true.
AF(P)::= P | AX( AF(P)), states where P is entered on every path.
EF(P)::= P | EX( EF(P)), states where some path enters P.
AU(p,q) ::= q | (p & AX(AU(p,q))), states where p is true until q becomes true on all paths.
EU(p,q) ::= q | (p & EX(EU(p,q))), states where, on some path, p is true until q is true.
AU and EU both imply that q is ultimately true.
()|- AG(P) = P & AX P & AX AX P &... = &[i:0..]AX^i(P)
()|- EF(P) = P | EX P | EX EX P |... = |[i:0..]EX^i(P)
If the system is finite then there are quite efficient algorithms and
tools for doing the above calculations.
.Close.Net Discrete_Dynamic_System
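When the system is finite, the fixed-point characterisations above can be computed by straightforward iteration. A minimal Python sketch of EX, AX, and the least fixed point for EF (the transition structure is an invented example, not from the text):

```python
# Illustrative sketch: computing EX, AX, EF over a finite discrete
# dynamic system, using EF(P) = P | EX P | EX EX P | ...
delta = {0: {1}, 1: {2}, 2: {2}, 3: {0}}   # \delta : State -> @State
states = set(delta)

def EX(P):
    """States with some successor in P."""
    return {s for s in states if delta[s] & P}

def AX(P):
    """States all of whose successors are in P."""
    return {s for s in states if delta[s] <= P}

def EF(P):
    """Least fixed point: states from which some path reaches P."""
    reach = set(P)
    while True:
        bigger = reach | EX(reach)
        if bigger == reach:
            return reach
        reach = bigger

print(EX({2}))   # {1, 2}
print(EF({2}))   # {0, 1, 2, 3}: every state can reach 2 here
```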
.Hole
.Close General Systems Theory
.Open Automata Theory
BASIC_AUTOMATA::=following,
.Net
State and input determine next state.
State::Sets=`given`,
S:=State.
finite_state::@.
|- finite_state iff finite(S).
Input::Sets=`given`,
I:=Input.
finite_input::@.
|- finite_input iff finite(I).
finite::@=finite_state and finite_input.
.Close.Net BASIC_AUTOMATA
Deterministic_Automata::=$BASIC_AUTOMATA with following,
.Net
Transition::(S^S)^I=`given`,
d:=Transition,
()|-$Mechanism(State=>State, Transitions=>img(Transition)).
This implicitly defines $can_change_to, $Monoid, etc. in $Mechanism.
For i:#Input, n=length(i), s0, s[n]: State, s0 |=[i] s[n] ::@= for some s1,s2,...,s[n-1], (s0 d(i[1]) s1 d(i[2]) s2 ... d(i[n]) s[n]).
.Close.Net Deterministic_Automata
Automata::=$BASIC_AUTOMATA with following,
.Net
\delta::@(S >< I >< S), transitions for each input.
next::I->@(S,S).
|- for all i, s, s', s next(i) s' iff (s, i, s') in \delta.
\delta::@(S >< #I >< S).
|- for all x,s,s', (s,x,s') in \delta iff s (;next(x)) s'.
()|- for s0,s1:S, (s0, (), s1) in \delta iff s0=s1.
()|- for s0,s':S, x,y:#I, (s0, x!y, s') in \delta iff for some s1( (s0,x,s1) and (s1, y, s') in \delta).
.Close.Net Automata
Acceptor::=$Automata with following,
.Net
Initial::@State=`given`,
Final::@State=`given`,
Accepting::@State=Final.
Language::@#Input= {i:#Input. for some s0:Initial, s1:Accepting(s0|=[i]s1) }.
.Close.Net Acceptor
FSA::=Acceptor & finite.
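An Acceptor with a finite state set can be simulated directly. This Python sketch (an invented even-parity FSA, not an example from the text) computes membership in Language by tracking the set of reachable states:

```python
# Hedged sketch of an Acceptor: nondeterministic transition triples,
# a set of initial states and a set of accepting states.
# This example accepts words over {0,1} with an even number of 1s.
delta = {("e", 0, "e"), ("e", 1, "o"), ("o", 0, "o"), ("o", 1, "e")}
initial, accepting = {"e"}, {"e"}

def accepts(word):
    current = set(initial)
    for symbol in word:
        current = {t for (s, i, t) in delta if s in current and i == symbol}
    return bool(current & accepting)

print(accepts([1, 0, 1]))   # True: two 1s
print(accepts([1, 1, 1]))   # False: three 1s
```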
.Hole
PDA::=following,
.Hole
TM::=Turing_Machine,
Turing_Machine::=following,
.Hole
Bounded::=following,
.Hole
.Close Automata Theory
. Eilenberg's ultimately abstract model
.Hole
.Open Automata on Infinite words
. Source
After
. Thomas90
.Set
Automata on Infinite Objects
in $Leeuwen90 pp133-187
.Set
Jan van Leeuwen(ed)
Handbook of Theoretical Computer Science -- Volume B. Formal models and Semantics
MIT Press
Cambridge MA/Elsevier Science Pub Amsterdam The Netherlands CR9209-0659
.List
Pfau Library(4th Floor) QA76.H279 1990
.Close.List
.Close.Set
.Close.Set
. Notation
Alphabet::Finite_Sets,
A::=$Alphabet.
#A::=`finite words over A`.
.See http://www/dick/maths/*strings*html
\omega::=Nat, -- common notation.
Aw::=Nat->A, -- infinite words over A.
Aoo::=A* | Aw.
empty::A*=().
\epsilon::=empty.
For u,v,w:A*, U,V,W:@#A, \alpha, \beta:Aw, L,L':@Aw.
. Subwords
\alpha(m..)::=map[i:Nat](\alpha(i+m-1)).
\alpha(m..n)::=map[i:1..n-m+1](\alpha(i-1+m)).
. Infinite quantifiers
for infinite n(W(n))::= `there exists an infinite number of n such that W(n)`.
for infinite n(W(n))::= for all n, some n'>n (W(n')).
for finite n(W(n))::=`there exists only a finite number of n satisfying W(n)`.
for finite n(W(n))::=for some n, no n'>n (W(n')).
()|- for no n(W(n)) implies for finite n(W(n)).
for some_finite n(W(n))::= for some n(W(n)) and for finite n(W(n)).
. Sets of words
pref(W)::={u:#A. for some v:#A( u v in W )}.
W^\omega::={\alpha:Aw. for some w:Nat0->W ( \alpha = ![i=0..](w(i)) )}.
arrow(W)::={\alpha. for infinite n( \alpha(0..n) in W)}.
lim(W)::=arrow(W), -- alternative notation.
W^\delta::=arrow(W), -- alternative notation.
For \sigma:Nat->S, In(\sigma)::={s:S. for infinite n(\sigma(n)=s)}, -- states that occur infinitely
often in a sequence.
infinity_set(\sigma)::=In(\sigma).
. Buchi Automata
Buchi automata inherit all the structure of finite nondeterministic automata: inputs, states,
initial_state, final_states, and transitions. However, the condition for accepting an infinite string
of symbols is defined differently and effectively overrides the definition for finite strings. A normal
automaton accepts a finite string by entering a final state at the end of the string. An infinite
string, by definition, has no end. So an infinite sequence of inputs is accepted when it allows
the automaton to enter the set of final states infinitely often.
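For a deterministic Buchi automaton this acceptance condition is decidable on ultimately periodic words u v^\omega, since such a run eventually cycles. A hedged Python sketch (using the transition table of the automaton A1 defined in the examples below; the lasso-search coding is mine):

```python
# Sketch (deterministic case only): deciding whether u v^\omega is
# accepted, i.e. whether a final state occurs on the cycle the run
# falls into. Transitions are those of A1 below.
d = {("q0","b"):"q0", ("q0","c"):"q0", ("q0","a"):"q1",
     ("q1","a"):"q1", ("q1","c"):"q1", ("q1","b"):"q0"}
initial, final = "q0", {"q0"}

def accepts(u, v):
    s = initial
    for i in u:                       # read the finite prefix u
        s = d[(s, i)]
    seen = []
    while s not in seen:              # pump copies of v until a state repeats
        seen.append(s)
        for i in v:
            s = d[(s, i)]
    cycle, t = {s}, s                 # states visited around the v-cycle
    for _ in seen:                    # at most len(seen) copies close the loop
        for i in v:
            t = d[(t, i)]
            cycle.add(t)
        if t == s:
            break
    return bool(cycle & final)

print(accepts("c", "ab"))   # True: every a is followed by a b
print(accepts("c", "a"))    # False: after the first a, no b ever occurs
```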
BUCHI::=following,
.Net
|-AUTOMATA(nondeterministic, finite) hide(accepts, language).
runs::@(Nat->states)={\sigma: Nat->states. \sigma(0)=initial and for all i(\sigma(i)
can_change_to \sigma(i+1))}.
successful::@runs={\sigma. some(In(\sigma) & final)}.
(successful, final in Finite)|- successful = { \sigma:runs. for some f:final, for infinite
n(\sigma(n)=f)}.
For \alpha:Nat->inputs, \sigma: Nat->states, \alpha produces \sigma ::= for all i:Nat(
(\sigma(i),\alpha(i), \sigma(i+1)) in transitions).
language::@(Nat->inputs)={\alpha. for some \sigma :successful( \alpha produces \sigma)}.
s |=[\alpha] s' ::= for some \sigma(\sigma(0)=s and \sigma(len(\alpha))=s' and for all i:Nat0(
(\sigma(i),\alpha(i), \sigma(i+1)) in transitions ) ).
W[s,s']::={w:A*. s|=[w]s'}.
|- For all s,s', W[s,s'] in regular_set.
L:=language.
|- L = |[s:final](W[initial,s].(W[s,s])^\omega).
.Close.Net
Buchi_recognisable::@@(Nat->A)={L. for some BUCHI A(L=A.language)}.
. Examples
A1:=the BUCHI(input:={a,b,c}, states:={q0,q1}, initial:=q0, final:={q0}, delta:={(q0,b,q0),
(q0,c,q0), (q0,a,q1), (q1,a,q1), (q1,c,q1), (q1, b, q0)} ).
A1.language=`after any occurrence of a there is an occurrence of b`.
A1_bar:=the BUCHI(input:={a,b,c}, states:={q0,q1}, initial:=q0, final:={q1}, delta:={(q0,b,q0),
(q0,c,q0), (q0,a,q0), (q0,a,q1), (q1,a,q1), (q1,c,q1)} ).
.Hole
|- A1_bar.language= ~A1.language.
A2:=the BUCHI(input:={a,b,c}, states:={q0,q1,q2}, initial:=q0, final:={q0,q1,q2},
delta:={(q0,b,q0), (q0,c,q0), (q0,a,q1), (q1,a,q1), (q1,c,q2), (q1, b, q2), (q2,c,q1), (q2,b,q1)} ).
|- A2.language=`between any two a's there are an even number of b's and c's`.
. Results
Buchi recognisable iff finite union of sets U.V^\omega where U and V are regular.
Buchi recognisable is closed under union, complement, intersection and regular prefix.
\omega_regular::=Buchi_recognisable.
.Close Automata on Infinite Words
.Open Structured Automata
Structured automata are something that I have developed. Their
precursors (I believe) were the flowchart-like definitions of Turing Machines
in Davis, a long while ago. Essentially the behavior of the machine
is defined as a regular relational expression. They can be written
like a program. Here are my notes (from the late 1980s) on how
acceptors, generators and filters can be modelled in MATHS. This also
argues that the structure of the languages accepted and generated can
determine the structure of the program.
. Data Flows
Model a data flow as a variable with values that are strings of items. Let
U=$ Net{..., x:#X, ..., y:#Y,...z:X,...,R} then MATHS has the following
relations defined for manipulating x and y as input and output streams
STREAMS::=http://www/dick/math_62_Strings.html#STREAMS.
|- $STREAMS.
(STREAMS)|- y:!e = (y'=y!e), -- Put e at end of y.
(STREAMS)|- /(y:!x) = (y = y' ! x'), -- Take some input from the end
of y (nondeterministic).
(STREAMS)|- x?:y = (x ! y' = y and x in Y), -- Take first x from
front of y.
(STREAMS)|- x' ?:y ==> /(x!:y), -- Get next item in y from y
and put it in x.
(STREAMS)|- S on x::= for some c:S, t:#X(x=(c!t)). -- Look ahead for string in
S in front of x
I will assume that each stream has a special item `end` that is used to
indicate the end of streams.
end(x)::= (x = end), -- An explicit end marker, end
not in X.
. Structured Acceptors
Given a
relation `R:@(U,U)` where `U=${a:%A, i:#I,...}` and `A,I :Sets`, then `x
->i(R a)` means that if `R` starts with arguments `a` then it will accept
`x` as an input on `i`,
->i(R a)::@#I,
->i(R a)::={ x:#I || for some c ( (a=>a,i=>x!(end))R(i=>(), a=>c)
)},
->(R a)::=->i(R a),
c?::=c:?i,
accepted_by(R)::=->i(R).
()|- For a pair of relations R, S:@(U,U), all following
.List
accepted_by(R) & accepted_by(S) = accepted_by(R & S )
accepted_by(R) | accepted_by(S) = accepted_by(R | S )
accepted_by(R) accepted_by(S) = accepted_by(R ; S )
# accepted_by(R) = accepted_by(do(R) ).
.Close.List
Example
even_parity::=#0 #( 1 #0 1 #0 ).
|-even_parity = accepted_by (do 0?; do( 1?; do 0?; 1?; do 0?) ),
Thus for any regular expression defining a language there is a similar
relation specifying how to recognize it. This extends to context free
grammars and MATHS dictionaries. By induction if a set `S` in `@#I` is
defined by a dictionary with definitions using the regular operations and
`&` then there is an isomorphic dictionary that defines a relation that
accepts `S`. Such dictionaries are fairly easy to program in most
languages using Jackson's rules for handling look ahead or backtracking.
Further if we label the input statements and replace `c?` by `(state'=n;
next=c)` we can derive a co-function that handles each character in turn,
one input at a time
.See [Jackson75].
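As a concrete illustration of the correspondence, here is the even_parity acceptor do 0?; do( 1?; do 0?; 1?; do 0? ) transcribed into Python (greedy consumption suffices for this particular language; the coding style is mine, not part of MATHS):

```python
# Sketch: a structured acceptor whose shape mirrors the regular
# expression #0 #( 1 #0 1 #0 ), i.e. bit strings of even parity.
def accept_even_parity(s):
    i, n = 0, len(s)
    def run_of(c):                 # do c? : consume a run of symbol c
        nonlocal i
        while i < n and s[i] == c:
            i += 1
    run_of("0")                    # do 0?
    while i < n:                   # do( 1?; do 0?; 1?; do 0? )
        if s[i] != "1":
            return False
        i += 1                     # 1?
        run_of("0")                # do 0?
        if i >= n or s[i] != "1":
            return False
        i += 1                     # 1?
        run_of("0")                # do 0?
    return True

print(accept_even_parity("010010"))   # True: two 1s
print(accept_even_parity("0100"))     # False: one 1
```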
. Structured Generators
Generators are input/output dual to acceptors. Given a relation `R:@(U,U)`
with `U=${a:%A o:#O,...}` with `A,O:Sets`, `x<-(R a)` means that if
`R` starts with arguments `a` and then it can generate `x`:
(R a)o->::@#O,
(R a)o->::={ x:#O || for some c' ( (a=>a,o=>())R(o=>x!(end), a=>c') ) },
<-(R a)::=(R a)o->,
(R a)->::=(R a)o->,
c!::=o:!c.
Again there is a natural structural relationship between the set generated
by a generator and the generator:
generated_by(R)::=<-(R) -- the language generated by R.
For a pair of relations R, S:
<-(R) & <-(S) = <-(R & S )
and <-(R) | <-(S) = <-(R | S )
and <-(R) ; <-(S) = <-(R ; S )
and # (<-(R)) = <-(do(R) )
SO, the structure of a generator can reflect the structure of a specification
of the language. If the specification uses recursion to define language
then the equations indicate a simple nondeterministic generator for the
language. By induction if a set `S` in @#O is defined by a dictionary with
definitions using the regular operations and `&` then there is an
isomorphic dictionary that defines a relation that generates S.
Further if we label the output statements and replace c! by (next=c;
state'=n) we can derive a co-function that produces each character
in turn, one at a time.
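A generator with the same structure as the even_parity acceptor can be sketched in Python (random choices stand in for nondeterminism; this is an illustration of the duality, not a fixed translation scheme):

```python
import random

# Sketch: a generator dual to the even_parity acceptor, emitting
# members of #0 #( 1 #0 1 #0 ). Every output has even parity because
# the pair of 1! actions is always performed atomically.
def gen_even_parity(rng, max_steps=20):
    out = []
    def emit_zeros():                       # do 0! : output some zeros
        while rng.random() < 0.5 and len(out) < max_steps:
            out.append("0")
    emit_zeros()
    while rng.random() < 0.5 and len(out) + 2 <= max_steps:
        out.append("1"); emit_zeros()       # 1! ; do 0!
        out.append("1"); emit_zeros()       # 1! ; do 0!
    return "".join(out)

w = gen_even_parity(random.Random(0))
print(w, w.count("1") % 2 == 0)             # always even parity
```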
Notice that if S is a generator and R an acceptor then
|- (S)-> ; ->(R) = for some x ((S)-> x ->(R)),
= for some x ((S)-> x and x ->(R)),
= some( (S)-> & ->(R) ).
. Structured Filters
A filter is a program with command line arguments A that accepts data on
the standard input I and produces data on the standard output O. Given a
relation `R:@(U,U)` with `U:=${a:%A i:#I, o:#O,...}` and `A,I,
O:Sets`,
For x:#I, y:#O, a:#A,
x ->i(R a)o->y::@(#I, #O)=for some c:#A ( (a=>a,o=>(),i=>x!(end))R(i=>(), o=>y!(end), a=>c) ).
->(R a)->::=->i(R a)o->.
Informally `x ->(R a)->y` means that if `R` starts with arguments `a`
and is supplied with input `x` then it can have produced `y` on its output
when it has stopped. As always a relation does not connect every possible
input with an output.
The ideas of acceptor and generator defined above can also be defined for
filters,
accepted_by(R)::= | [a:A] pre( ->(R a)-> ),
generated_by(R)::=| [a:A] post( ->(R a)-> ).
Here is a familiar example:
kitten::=(while(not end(i))( c'?:i; o:!c)).
|- for all x:#I, x ->(kitten)-> x.
|- ->(kitten)-> = Id(#I)
|- accepted_by(kitten) = #I,
|- generated_by(kitten) = #I.
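kitten is simple enough to transcribe directly; a Python sketch of the identity filter:

```python
# Sketch of the filter `kitten`: copy standard input to standard output
# unchanged, so ->(kitten)-> = Id(#I).
def kitten(stream):
    for c in stream:          # while not end(i): c'?:i ; o:!c
        yield c

print("".join(kitten("abc")))   # abc
```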
The following are approximations to commands in a well known operating
system.
a:%#char, fopen:#char->#char, I=O=char.
echo::=with{w:#char} ( do (w'?:a; o:!w); a=(); o:!EOLN),
cat::=with {f: #char; c: char; p: #char}
.(
while(not a<>())
.(
f'?:a;
.(
f="-"; kitten
| f<>"-"; p'=fopen(f);
while(not end(p))
.(
c'?:p; o:!c
.)
.)
.)
.)
There is a structural result for filters. It relates the input-output
relationship to the structure of the filter:
For R,S:@(U,U),
( ->(R)-> ) | ( ->(S)-> ) = ( ->(R | S)-> )
and ( ->(R)-> ) & ( ->(S)-> ) = ( ->(R & S)-> )
and ( ->(R)-> ) before ( ->(S)-> ) = ( ->(R ; S)-> ).
For X,Y:@(#I, #O), X before Y ::= { ( i1!i2, o1!o2 )
| (i1,o1) in X and (i2,o2) in Y }.
For example given
input = #( odd_item even_item),
output = #output_item,
then we can combine them to output the even items or the odd items:
input_output::=#( odd_item even_item_output_item),
input_output::=#( odd_item_output_item even_item).
For details see Jackson's work "JSP".
Floyd and Jackson both give examples where the correspondences can not form
a grammar. However, for any pair of processes(R and S say) with `i` and
`o` defined in the standard way,
->(R c)->;->(S d)->
specifies the behavior observed by two programs connected by McIlroy's
pipe [Bell Labs 83, RitchieThompson74, Ritchie80, any good UNIX text].
Floyd, Jackson, Hoare and many others (see Chapter 2) have argued that this
decomposition makes it possible to replace a complex sequential program
with two or more simple programs.
. A Powerful MATHS Filter
MATHS defines a powerful filter that applies a relation `R` to strings on
the input and output:
For R:@(#I, #O),
filter_wrt (R)::= do( | [ (x,y):R ] (x'!:i & o:!y ) | no x' in
pre(R) and x'!:i; |[x:I] x'?:i )).
This can be highly nondeterministic but simple cases do interesting things:
filter_wrt( 0+>1 | 1+>0 ) -- complement bits
filter_wrt( (_^2) ) -- square the input numbers
filter_wrt( "a".."z" | "A".."Z" ) -- delete everything but letters
filter_wrt( |[x:I] (#x x+>x) ) -- remove duplicates.
filter_wrt( #char "CSP" #char EOLN ) -- select lines containing CSP.
Thus `filter_wrt` seems to combine `tr`, `grep`, `awk`, and `uniq` from
the UNIX system.
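The simplest filter_wrt cases, where the relation maps single items to single items, can be sketched as follows (a Python illustration of mine; the fully nondeterministic string-to-string form is not attempted here):

```python
# Sketch of simple filter_wrt cases: apply a relation given as a
# symbol-to-symbol map (e.g. 0+>1 | 1+>0) to each input item,
# dropping items outside the relation's domain.
def filter_wrt(rel, stream):
    for x in stream:
        if x in rel:
            yield rel[x]

complement = {"0": "1", "1": "0"}            # complement bits
print("".join(filter_wrt(complement, "0110")))        # 1001
letters = {c: c for c in "abcXYZ"}           # keep only these letters
print("".join(filter_wrt(letters, "a1b2X!")))         # abX
```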
.Close Structured Automata
. Protocols
A protocol is a useful tool for describing inter-process communications.
A possible syntax
protocol::=#$message.
message::=participant "->" participant ":" data semicolon.
an alternative form of message syntax might be:
participant "sends" message O(|"(" data ")") "to" participant.
Note... it should be possible to generalize to structured protocols which use
regular sets with messages as elements.
Examples of defining the Language/action perspective protocol follow:
lap(Alice, Bob)::=# following
.(
Alice->Bob:request;
Bob->Alice:promise;
Bob->Alice:delivery;
Alice->Bob:acceptance
.).
lap(Alice, Bob)::=#(Alice sends request to Bob; Bob sends promise to Alice; Bob sends delivery to Alice; Alice sends acceptance to Bob).
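Either syntax yields a regular language of traces, so conformance of a finite trace can be checked mechanically. A hypothetical Python sketch for lap (the semicolon encoding of messages is my choice):

```python
import re

# Sketch: the lap protocol #(request; promise; delivery; acceptance)
# as a regular language over message names.
lap = re.compile(r"(request;promise;delivery;acceptance;)*$")

def conforms(trace):
    """Does a finite trace of messages follow the lap protocol?"""
    return lap.match(";".join(trace) + (";" if trace else "")) is not None

print(conforms(["request", "promise", "delivery", "acceptance"]))  # True
print(conforms(["request", "delivery"]))                           # False
```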
Any preferences or thoughts?
.Hole Protocols
.Close Automata and System Theory