A full theoretical treatment is elsewhere (DOMAINS, POSETS). The motivation for this theory is the need to ground denotational_semantics in a sound piece of mathematics. Denotational semantics uses the "FIX" operator to avoid the recursive definitions found in operational semantics. A cpo or domain is a collection of objects on which the FIX operator is guaranteed to produce a proper result.
A fixed point is a value that does not change under some transformation. One simple example is the value 0 under the operation of multiplying by 2: if 0 is multiplied by 2 the result is 0. We say that 0 is a fixed point of the operation, or function, of multiplying by 2. A more advanced example is the exponential function (exp) in calculus. The exponential function is a function that does not change when differentiated:
So exp is a fixed point under differentiation. Here are some physical examples. When you take a baseball and spin it round on some axis, the points on the axis are all fixed points. If you take a square tissue and lay it on a table, then crumple it up and put it back where it was, there will be at least one point in the tissue that is vertically above its old place. Crumpling is an operation that has fixed points!
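The arithmetic example can be checked mechanically. Here is a minimal sketch in Python (the function names are my own, not the notes' notation):

```python
def is_fixed_point(f, x):
    """x is a fixed point of f exactly when f leaves x unchanged."""
    return f(x) == x

double = lambda x: 2 * x
print(is_fixed_point(double, 0))   # True: 2 * 0 == 0
print(is_fixed_point(double, 3))   # False: 2 * 3 != 3
```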
Quick exercise: when you look up at night and watch the northern sky for several hours, you will see a star that doesn't appear to move. What is it called?
For any operator F:X->X we will define
If you are still not sure of what a fixed point is, see
[ Notes on familiar fixed points ]
Theory of Fixed Points
Most books present the Scott/Plotkin theory that starts from scratch
by defining axioms and notations for objects in a Domain or cpo (Complete Partial Order).
I prefer to assume the whole of Set Theory and Discrete Math and build on that.
We are interested in operations that, when they terminate, change the
state of our computer in a predictable way. So we are interested in a
set of partial functions on our state. I will write Fun for this set.
I use (==>) for the "subset or equal" relation between sets, and so for f, g : Fun, f==>g means that g is defined wherever f is and has the same value. So g is just like f but more defined: it carries more information than f.
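As a concrete sketch (my own illustration, not the notes' notation), model a partial function as a Python dict from states to states, and ==> as a test between dicts:

```python
def approximates(f, g):
    """f ==> g: g is defined wherever f is, with the same value."""
    return all(x in g and g[x] == f[x] for x in f)

f = {1: 10}             # defined only at state 1
g = {1: 10, 2: 20}      # agrees with f, and defined at more states
print(approximates(f, g))   # True: g is just like f but more defined
print(approximates(g, f))   # False: f carries less information
print(approximates({}, f))  # True: the nowhere-defined dict approximates everything
```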
Further the axioms for sets allow us to derive the properties of f==>g that most books assume:
Further we can assume that Fun contains the completely undefined function -- the worst possible computation that never produces any results:
The standard theory of partial orders (POSETS) defines the least upper bound (lub) operator
in terms of the ordering, as the least of all elements that are above a set:
We are interested in a general model of what happens when a definition is applied many times. So we plan to demonstrate that there are certain kinds of operations on functions that generate increasingly better approximations to a particular function. The operation, in a sense, converges to a unique function. We can also show that this unique function is always a fixed point of the operation. We will therefore be interested in the properties of operators on functions; for historical reasons these are called functionals. We will show that some of these functionals, when repeatedly applied to a function, generate a series that tends toward a definite limit, and that this limit is a particular fixed point of the functional. It turns out that the functionals that turn up in denotational_semantics have this property.
To be more precise, suppose we are interested
in an operation F:Fun->Fun. Let
In particular we will consider the Kleene series generated by F:Fun->Fun.
For some of these operations F, we will show that the limit is the least fixed point of F:
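As an illustration (my own, using dicts as partial functions), the functional arising from the recursive definition fact(n) = if n = 0 then 1 else n * fact(n-1) generates a Kleene series whose terms are ever better approximations to the factorial function:

```python
def F(f):
    """One unfolding of the recursive definition of factorial.
    Takes an approximation f (a dict) and returns a more defined one."""
    g = {0: 1}
    for n in f:
        g[n + 1] = (n + 1) * f[n]
    return g

# The Kleene series: undefined, F(undefined), F(F(undefined)), ...
f = {}
for _ in range(5):
    f = F(f)
print(f)   # {0: 1, 1: 1, 2: 2, 3: 6, 4: 24}
```

The limit of this series is the factorial function itself: the least fixed point of F.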
There are some special kinds of functionals (operations on functions). One of these is the strict functionals, which don't define the undefined:
A very important class of operation/functional never removes a defined value from a function to which it is applied:
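A sketch of such a functional in the dict model (my own illustration; the functional here is invented for the example):

```python
def approximates(f, g):
    """f ==> g: g is defined wherever f is, with the same value."""
    return all(x in g and g[x] == f[x] for x in f)

def F(f):
    """Keeps every defined value of f and possibly defines one more point."""
    g = dict(f)          # never remove a defined value
    g.setdefault(0, 1)   # add a definition only where f had none
    return g

f = {3: 7}
print(approximates(f, F(f)))   # True: f ==> F(f) for every f
```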
So when we have an operation on partial functions, what we will be most interested in are the least fixed points, and so we define
We will think of the above as being the limit of the series of functions generated by F.
We will need functionals that are analogous to the continuous functions of the calculus. As their argument tends towards a limit, so does their result:
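Continuity can be sketched in the dict model (my own illustration: the lub of a chain of dicts is their union, and F is taken to be the functional from the recursive definition of factorial):

```python
def lub(chain):
    """Least upper bound of a chain of partial functions: their union."""
    g = {}
    for f in chain:
        g.update(f)
    return g

def F(f):
    """One unfolding of fact(n) = if n = 0 then 1 else n * fact(n-1)."""
    g = {0: 1}
    for n in f:
        g[n + 1] = (n + 1) * f[n]
    return g

chain = [{}, {0: 1}, {0: 1, 1: 1}]
# Continuity: applying F to the limit gives the limit of applying F.
print(F(lub(chain)) == lub([F(f) for f in chain]))   # True
```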
Standard theory doesn't guarantee the uniqueness of the least upper bound and it does not guarantee that it is inside the original set!
The problem is that, by ??exercise 21??, Fun has subsets with no upper bounds and so no least upper bound inside Fun. We can show that upper bounds exist, but they may not be functions -- they may be many-to-many rather than many-to-one.
However we can show (??Exercises 18 and 19??) that the set of all relations on State><State is complete: every set of relations has an upper bound -- the union of the relations.
We need to show that, for the kind of operation F that we use in denotational_semantics, the Kleene series, Kleene(F), is a set that has an upper bound in Fun. For the functionals in denotational semantics it turns out that this set is always what is called a chain.
A chain is a set of functions S such that for any pair f, g: S, either f==>g or g==>f.
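The chain condition can be written out directly in the dict model of partial functions (my own illustration):

```python
def approximates(f, g):
    """f ==> g: g is defined wherever f is, with the same value."""
    return all(x in g and g[x] == f[x] for x in f)

def is_chain(S):
    """Every pair in S must be comparable under ==>."""
    return all(approximates(f, g) or approximates(g, f)
               for f in S for g in S)

print(is_chain([{}, {0: 1}, {0: 1, 1: 1}]))   # True: each extends the last
print(is_chain([{0: 1}, {1: 2}]))             # False: neither extends the other
```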
Note. Perhaps you could think of these as being like sequences in the theory of convergence in the real numbers.
Some authors use a more general kind of set: rather than chains they work with directed sets. These are sets in which every finite subset has an upper bound inside the set.
Further we can show that chains in State<>->State always have an upper bound, and so a unique
least upper bound. Indeed it is the relational upper bound anyway. So what we have done is
discover that chains maintain functionality, or many-oneness, in the limit.
Monotonic functionals therefore turn chains into chains.
??Several nice exercises.??
What we want to show is that the expressions used in the definition of while will always give us functions that are well defined. In other words we'd like to show that any expression made of cond, o, Id, etc. will have a least fixed point that is a partial function.
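A sketch of this in Python (my own; states are plain integers, the wholly undefined computation is modelled by raising an exception, and the least fixed point is approximated by a bounded number of unfoldings):

```python
def cond(b, f, g):
    """cond(b, f, g)(s) = f(s) if b(s), else g(s)."""
    return lambda s: f(s) if b(s) else g(s)

def o(f, g):
    """Composition: (f o g)(s) = f(g(s))."""
    return lambda s: f(g(s))

Id = lambda s: s

def while_semantics(b, body, unfoldings=100):
    """Approximate the least fixed point of F(f) = cond(b, f o body, Id)."""
    def undefined(s):
        raise RuntimeError("undefined")   # the worst computation: no result
    f = undefined
    for _ in range(unfoldings):
        f = cond(b, o(f, body), Id)
    return f

# while s > 0 do s := s - 1
loop = while_semantics(lambda s: s > 0, lambda s: s - 1)
print(loop(10))   # 0: the loop counts 10 down to 0
```

States needing more than 100 unfoldings raise the exception, just as a shorter Kleene approximation is undefined there.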
We can show them to be monotonic.
Bad news: a monotone function does not necessarily carry least upper bounds into least upper bounds.
All we know is that the least upper bound of the image of a set, under a monotone function, is
==> the image of the least upper bound of the set.
Just like in the calculus, a continuous functional turns limits into limits.
Key step - The Kleene Sequence of a continuous functional has a unique least upper bound...
The general result has some powerful consequences... like the existence of LISP-like data structures!
. . . . . . . . . ( end of section Theory of Fixed Points) <<Contents | Index>>
Existence of Direct Semantics for While
Now we have the theory needed to show that the direct semantics defines a function for each
valid program. This function, when applied to a statement, returns
a partial function - and so some (all? none?) of its values may not be defined.
The heart of the problem is the definition
Verify the above by algebra if it is not obvious to you.
We know that if F1 and F2 are continuous then so is F.
So we need two lemmas:
In fact we can go further and demonstrate that functions constructed by applying cond and composition are always continuous.
The denotational semantics is in terms of the fixed points of continuous functionals. These have
unique least fixed points -- precisely the functions that their Kleene sequences converge towards.
. . . . . . . . . ( end of section Existence of Direct Semantics for While) <<Contents | Index>>
To make the presentation simpler, if f:S<>->S we write
Normally we say that
I will write
However we do not have any justification (yet) for introducing a pseudo-element undefined into the set of States. This justification is developed by using the theory of partially ordered sets and relations. [ ORDER in math_21_Order ]
The three ways of operating on partial functions are composition,
conditional and fixed point functions:
For more on least elements, uniqueness of least elements, upper bounds, ... see
For chains, chain completeness, monotonic functions, continuous functions, ... see
. . . . . . . . . ( end of section Proof of 43.4) <<Contents | Index>>
Proof of 43.5
You are already familiar with the idea of a fixed point from normal arithmetic and algebra. A
fixed point of a function f is any value
of x such that f(x) = x. For example if
You will also notice that in many cases, when you repeat the function to generate a sequence and the sequence converges, it converges towards a fixed point.
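A familiar instance (my example): repeatedly applying cos from any starting value converges to the unique solution of cos(x) = x:

```python
import math

x = 1.0
for _ in range(100):
    x = math.cos(x)   # repeat the function to generate a sequence

print(abs(math.cos(x) - x) < 1e-9)   # True: the limit is a fixed point of cos
```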
. . . . . . . . . ( end of section Footnotes) <<Contents | Index>>
. . . . . . . . . ( end of section Notes on the Fixed Point Theory of Denotations) <<Contents | Index>>