The previous iteration is in [ design99.html ] (1999).
This is short for Purposes, Qualities, Realities, Technical things, and Systems.
Here is a formal description:
Normally we have an existing system to start from:
Notice that (in theory) P can be expressed as the intersection of a number of sets of systems meeting partial requirements --
Normally the current system s0 satisfies some of the purposes of the system, but not all of them:
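The two observations above can be sketched concretely. In this hypothetical sketch (all system names and requirements are made-up illustrations, not part of the model), each partial requirement picks out the set of systems satisfying it, P is the intersection of those sets, and the current system s0 lands in some of them but not all:

```python
# A small finite universe of candidate systems (names are illustrative).
universe = {"s0", "s1", "s2", "s3"}

# Each partial requirement is the set of systems that meet it.
handles_payroll = {"s0", "s1", "s3"}
prints_reports = {"s1", "s2", "s3"}
audits_itself = {"s3"}

# P: the systems meeting every purpose -- the intersection of the partial sets.
P = handles_payroll & prints_reports & audits_itself

print(P)                                    # only s3 meets all the purposes
print("s0" in handles_payroll, "s0" in P)   # s0 meets some purposes, not all
```

Here s0 satisfies the payroll purpose but falls outside P, which is exactly the situation that motivates changing the system.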
The Qualities (Q) express the non-functional requirements on the system. Typically they are expressed in terms of "ilities" and then need refinement into metrics and tests that measure the quality of the system. Also note that a system may have many significant qualities -- speed, cost, reliability, and security, for example.
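The refinement from "ility" to metric to test might be sketched as follows; the qualities, metric names, measured values, and thresholds here are all invented for illustration:

```python
# Each quality is refined into (metric name, measured value, pass/fail test).
qualities = {
    "speed":       ("mean response (ms)", 120.0, lambda v: v <= 200.0),
    "reliability": ("uptime fraction",    0.999, lambda v: v >= 0.995),
    "cost":        ("monthly cost ($)",   900.0, lambda v: v <= 500.0),
}

def assess(qs):
    """Apply each quality's refined test to its measured metric value."""
    return {name: test(value) for name, (_metric, value, test) in qs.items()}

print(assess(qualities))   # this system passes speed and reliability, fails cost
```

The point of the refinement is that a vague "ility" becomes something a test can actually decide, one metric and threshold per quality.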
Notice that since this is a model of systems rather than a model of designs, specifications, and implementations, it is not natural to make a model of the relation (1st) implements (2nd). Instead the set of systems that are in P and/or R may be described by a separate universe of documentation, and the (1st) implements (2nd) relation connects the systems with their documentation. Thus a hierarchy of refinements may exist between pieces of documentation, but not between systems. Loe Freijs has presented an interesting theory of this relation and the structure of designs [Freijs93].
Notice that much of the published information on software development is about the Technical things (T). These define the limitations on what the software development team can do to produce a better system. Here are some of the limitations that are subsumed under the T factors:
My focus here is on the essence of the engineering ethic: making something more effective.
So if t:T, then t(s0) describes the set of possible outcomes of applying technique t to the initial system s0.
If t1 and t2 are two techniques in T, then applying t1 followed by t2 is again a technique. Composition of techniques is associative, and the do-nothing technique acts as an identity, so we can assume that we have a Monoid (definition linked below) of T's.
By the Knaster-Tarski theorem, such a process of repeated improvement has fixed points (assuming the systems form a complete lattice and the techniques are monotone).
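A minimal sketch of these two ideas, with made-up systems and techniques (modeling a system as a frozenset of features is an assumption for illustration only):

```python
def compose(t1, t2):
    """The technique 'apply t1, then t2' -- the monoid operation."""
    return lambda s: t2(t1(s))

identity = lambda s: s          # the do-nothing technique: the monoid unit

def add_logging(s):
    """An illustrative technique: add a feature to the system."""
    return s | {"logging"}

def fixed_point(t, s):
    """Apply technique t repeatedly until the system stops changing."""
    while t(s) != s:
        s = t(s)
    return s

s0 = frozenset({"core"})
improve = compose(add_logging, identity)   # composing with identity changes nothing
print(fixed_point(improve, s0))            # a system that improve leaves unchanged
```

Composition with `identity` returns an equivalent technique, and iterating `improve` stops once it reaches a system it can no longer change -- a fixed point of the improvement process.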
Assume that costs and qualities accumulate as successive changes are made.
A triple in S×T×S can model a particular change to the system, and can have a value associated with it by:
Now, for example, if cost is the critical value then we might have s1 better than s2 if its cost is less (<). So, for any relation on the real numbers, we have an induced relation between systems. For example, '<' is defined on the reals, so '< mod f' holds between systems s1 and s2 whenever f(v(s1)) < f(v(s2)):
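The induced relation can be sketched directly; the value map v, the selector f, and the sample systems below are all illustrative assumptions:

```python
def v(s):
    """Assumed value map: associate cost and speed figures with a system."""
    return {"cost": s["cost"], "speed": s["speed"]}

def induced(rel, f):
    """Lift a relation on the reals to one on systems: 'rel mod f'."""
    return lambda s1, s2: rel(f(v(s1)), f(v(s2)))

less = lambda x, y: x < y
f = lambda val: val["cost"]     # cost is the critical value here

better = induced(less, f)       # s1 better than s2 when f(v(s1)) < f(v(s2))

s1 = {"cost": 100.0, "speed": 3.0}
s2 = {"cost": 250.0, "speed": 9.0}
print(better(s1, s2))           # s1 costs less, so s1 is better under this order
```

Swapping in a different f (say, one that selects speed) induces a different ordering on the same systems, which is the point of parameterizing the relation.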
The Traveling Salesperson Problem demonstrates that finding an optimal design can be an NP-complete problem. The Busy Beaver problem for Turing machines is an optimization problem as well (maximize running time), so it is clear that some optimization problems are unsolvable.
If we can treat S as a topological space with
A solution is stable with respect to a given quality if every change makes that quality worse.
By analogy with Pareto's economic analysis there is probably a system from which all changes decrease some quality:
However some equilibria are not optimal.
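A greedy search makes this concrete: it halts at a point no single change can improve, yet a strictly better point exists elsewhere. The quality function and the single-step moves below are invented for illustration:

```python
def quality(x):
    """An illustrative quality landscape: a low peak at x=2, a higher one at x=5."""
    return {0: 1, 1: 2, 2: 3, 3: 1, 4: 4, 5: 6}.get(x, 0)

def hill_climb(x, moves=(-1, 1)):
    """Greedily apply single-step changes while any of them improves quality."""
    while True:
        best = max((x + m for m in moves), key=quality)
        if quality(best) <= quality(x):
            return x            # an equilibrium: every change makes it worse
        x = best

print(hill_climb(0))    # settles on the local peak at x=2
print(quality(5))       # yet x=5 is strictly better: the equilibrium is not optimal
```

Starting from 0 the climb stops at 2, because reaching the higher peak at 5 requires first passing through the worse point 3.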
Another attempt to optimize complex systems leads to the paradoxical trap of sub-optimization (from Russell Ackoff's work). In sub-optimization the problem is decomposed into parts and each part is optimized separately. Although this is often successful, there is no reason to expect the whole to be optimal as well. ...
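A toy instance of the trap, with made-up profit and penalty figures: each of two parts, optimized alone, picks its largest capacity, but a shared congestion penalty makes that joint choice far from the best for the whole.

```python
def part_profit(c):
    """Each part, taken alone, profits most from the largest capacity."""
    return 10 * c

def whole_profit(c1, c2):
    """The whole system pays a shared penalty when combined capacity exceeds 3."""
    return part_profit(c1) + part_profit(c2) - 15 * max(0, c1 + c2 - 3)

choices = range(4)

# Sub-optimization: each part maximizes its own profit in isolation.
sub = (max(choices, key=part_profit), max(choices, key=part_profit))

# Global optimization: search the joint space of both parts together.
best = max(((c1, c2) for c1 in choices for c2 in choices),
           key=lambda p: whole_profit(*p))

print(sub, whole_profit(*sub))     # (3, 3): each part optimal, whole profit only 15
print(best, whole_profit(*best))   # a joint choice doubles the whole's profit
```

Each part's choice is locally unimprovable, yet the decomposed answer (3, 3) yields less than half the profit of the jointly optimized one.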
A design is structurally stable with respect to a factor if small changes in that factor leave the structure unchanged. Values of factors where this is not so are said to be on a 'catastrophe' [Thom, ...].