>> [CNS] >> [Comp Sci Dept] >> [R J Botting] >> [CS546/646] >> 12 [Text Version]
Thu Jun 8 13:02:57 PDT 2006

12

Schedule

 Date        Meet'g  Study(2 pts)                        Bring(5 pts)  Topic(5 pts)         Notes
 Mon May 8   11      Project2 (20 pts)                   -             Turing Machines
 Wed May 10  12      Chapter 9 section 9.3               Ex            Undecidability & TM
 Mon May 15  13      Chapter 9 sections 9.4, 9.5.1, 9.6  Ex

9.3 Undecidable Problems About Turing Machines 383

A poem by Piet Hein

1. A problem worthy of attack
2. Proves its worth
3. By hitting back.
Recall:
1. An algorithm is a TM that always halts.
2. A recursive language has an algorithm that determines if a string is in the language or not.
3. A recursively enumerable (RE) language has a TM that accepts every string in the language; but on a string that is not in the language the TM may halt without accepting, or it may run forever.
4. If a language is non-RE then there is no TM that accepts exactly the strings in the language... every candidate TM fails on some acceptable string. Failing includes not halting.

9.3.1 Reductions 383

Here is an analogy:
We can reduce the problem "run a Java program" to the problem "run the Java Virtual Machine (JVM)" because we can write a compiler from Java into bytecode, and the JVM runs bytecode.

We can take this reduction a little further because Sun provides a Java compiler that is itself available as a bytecode application... so if we can run the JVM then we can compile any Java program and run the result.

So we can conclude: if there exists a JVM then we can run Java.

We can also conclude: if we can't run Java then we can't have a JVM.

Some of the reductions in this chapter are clever. Previously we reduced problem P1 to problem P2 by showing how to construct a solver for P1 from a solver for P2, adding extra algorithms or machines A and B around the P2 solver.

The idea was that if there is a solution to P2 then there must be a solution to P1 -- but we knew that P1 could not be solved.

In this chapter we do something more subtle. We design an extra machine or algorithm A that takes in the description of a machine plus its data and produces a new machine description (perhaps with its own data). This is then fed -- as data -- to a hypothetical machine that solves P2, and as a result the P2 solver answers the P1 question about the original machine.

Again we argue that if we had an algorithm for P2 then we would have an algorithm for P1. Or: IF we had a TM that solves P2 (even one that sometimes fails to halt), THEN we would have a similar TM for P1. In either case we have already shown that algorithms/machines for P1 don't exist... so they don't exist for P2.

This is subtle. The simpler reduction was a hardware technique: wire up a hypothetical solution to P2 with other algorithms/machines and solve P1. Now we have a software technique. We construct a compiler that lets us convert P1 problems into P2 problems. So if we can solve P2, we must have a solver for P1 too... So if P1 can not be solved, neither can P2.
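The software technique just described can be sketched as ordinary code: a reduction is function composition. (A hedged sketch, not the book's notation -- `Solver`, `translate`, and the toy predicates below are invented for illustration.)

```java
// Sketch: a reduction from P1 to P2 turns any P2 solver into a P1 solver.
interface Solver { boolean solve(String instance); }

public class Reduction {
    // translate: hypothetical "compiler" mapping a P1 instance
    // to an equivalent P2 instance.
    static String translate(String p1Instance) { return "encoded:" + p1Instance; }

    // Composing translate with any P2 solver yields a P1 solver.
    static Solver p1SolverFrom(Solver p2Solver) {
        return instance -> p2Solver.solve(translate(instance));
    }

    public static void main(String[] args) {
        Solver p2 = s -> s.startsWith("encoded:"); // toy stand-in for a P2 solver
        Solver p1 = p1SolverFrom(p2);
        System.out.println(p1.solve("anything")); // true
    }
}
```

Read contrapositively, the sketch says the same thing as the text: if no `Solver` for P1 can exist, then no `Solver` for P2 can exist either, because one would manufacture the other.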

Theorem 9.7 Reductions

If there is a reduction from P1 to P2 then
1. If P1 is undecidable then so is P2,
2. If P1 is non-RE then so is P2

The structure of the proof: prove each part in turn. Each part has the form "if Hypothesis then Conclusion" and is proved by assuming the hypothesis and using it to prove the conclusion.

Structure
Let

1. There be a reduction from P1 to P2
Let
1. P1 be undecidable,
2. ...
3. P2 is undecidable

(Close Let )

Let
1. P1 be non-RE,
2. ...
3. P2 is non-RE

(Close Let )

(Close Let )

Ch 9.3.1 pp 364 -- Theorem 9.7

Can you give an example of this theorem, so it will make more sense?

Example 1. If you have a program that could be used to solve the Halting Problem -- then you know that your program doesn't always work.

Example 2.

Theorem 9.8 L[ne] is RE

Use an NTM to guess a string accepted by the input machine and then use the universal TM U to verify the guess.
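One way to see why the nondeterministic guess is harmless: a deterministic machine can dovetail over all strings and step budgets. The sketch below is a toy model, not a real TM simulator -- `acceptsSomething` and the predicate modelling "M accepts w within a step budget" are assumptions made up for illustration.

```java
import java.util.function.BiPredicate;

// Toy semi-decider for L[ne]: does machine M accept at least one string?
// M is modelled as a hypothetical predicate test(w, steps):
// "does M accept w within `steps` moves?"
public class NonEmpty {
    static boolean acceptsSomething(BiPredicate<String, Integer> M, int budget) {
        // Dovetail: for each budget b, try every string of length <= b for b steps.
        for (int b = 1; b <= budget; b++)
            for (String w : stringsUpTo(b))
                if (M.test(w, b)) return true;   // a guess was verified
        return false; // inconclusive in this budget (a real TM would keep going)
    }

    // All binary strings of length <= n.
    static java.util.List<String> stringsUpTo(int n) {
        java.util.List<String> out = new java.util.ArrayList<>();
        out.add("");
        java.util.List<String> frontier = new java.util.ArrayList<>(out);
        for (int i = 0; i < n; i++) {
            java.util.List<String> next = new java.util.ArrayList<>();
            for (String s : frontier) { next.add(s + "0"); next.add(s + "1"); }
            out.addAll(next);
            frontier = next;
        }
        return out;
    }

    public static void main(String[] args) {
        // Toy M: accepts exactly the strings containing "11", in any budget.
        BiPredicate<String, Integer> M = (w, steps) -> w.contains("11");
        System.out.println(acceptsSomething(M, 3)); // true: "11" is found
    }
}
```

If M accepts some string, the dovetailing loop eventually finds it and halts accepting; if M accepts nothing, the loop never succeeds -- which is exactly RE-but-not-necessarily-recursive behaviour.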

Theorem 9.9 L[ne] is not Recursive

If we had a machine that always halts having decided whether machine M accepts something... then we could construct a decider for the universal language Lu from it. But Lu is not recursive. So our L[ne] machine is also not recursive.

Ch 9.3 pp 386 -- Theorem 9.9

Please explain Theorem 9.9. Is w always constant?

The idea is to construct an algorithm that will take any (M,w) and make a new machine M' from them.

So w is not a constant but a parameter or argument of the algorithm. It could be any string/word, and each time our algorithm will produce a different M'. In C/C++/Java it would be like this:

`		Machine algorithm(Machine M, Input w){ ....; return Mprime; } // Mprime is the constructed M'`
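That one-liner can be fleshed out a little (a hedged sketch: `Machine`, `buildMPrime`, and the toy machines are invented names, and lambdas stand in for TM descriptions). The constructed M' ignores its own input and simulates M on the fixed w, so L(M') is nonempty if and only if M accepts w.

```java
// Sketch of the Theorem 9.9 construction:
// from (M, w) build M' whose language is nonempty iff M accepts w.
interface Machine { boolean accepts(String input); }

public class Construct {
    static Machine buildMPrime(Machine M, String w) {
        // M' ignores its own input x and just simulates M on the fixed w.
        return x -> M.accepts(w);
    }

    public static void main(String[] args) {
        Machine M = s -> s.length() % 2 == 0;   // toy M: accepts even-length strings
        Machine mPrime = buildMPrime(M, "ab");  // M accepts "ab"
        System.out.println(mPrime.accepts("anything")); // true: L(M') is nonempty
        Machine mPrime2 = buildMPrime(M, "abc"); // M rejects "abc"
        System.out.println(mPrime2.accepts("anything")); // false: L(M') is empty
    }
}
```

Note how w is a parameter of `buildMPrime`: each (M,w) pair yields a different M', which is the point of the answer above.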

9.3.3 Rice's Theorem and Properties of the RE Languages 387

[ Rice's_theorem ]

Ch 9.3 pp 388 -- Could you please explain theorem 9.11

Theorem 9.11 means that just about any question you ask about the outputs of a computer can not be answered by a computer, OR it is a question where the answer is always the same -- a trivial question.

Ch 9.3 pp 383 -- can you give an example of a nontrivial property ?

• Does this TM accept anything?
• Does this TM accept 17 strings, no more and no less?
• L_e
• L_ne
• Does this TM print a rude word on its tape?
With a trivial property the answer does not depend on the TM at all. The answer is always the same.

Notice

When we talk about a property we are not talking about one language but a collection of languages. For example: not the set of strings with an even number of 1's, but the languages that have an even number of strings!

We decide a property P if we can construct a TM that reads in the description of a TM and decides if the input TM meets the property.

When we say P is an undecidable property we mean the language of codes for TM M that satisfy P is undecidable.

Also note: the theorem involves the RE languages AND undecidability.

This is a powerful result and so needs a complex and subtle proof.

Ch 9 pp 388 -- Undecidability

Can you explain Theorem 9.11. I do not understand why Every nontrivial property of the RE languages is undecidable.

This is not an obvious result hence the complex proof.

Ch 9.3.3 pp 388-389 -- Undecidability

Could you go over Theorem 9.11, I am having trouble understanding the given proof?

Ch 9.3 pp 388-389 -- Proof of Theorem 9.11

The book says the proof of Theorem 9.11 is a proof by construction, but it looks (from #3 on page 389) like it is a proof by contradiction. Are they constructing a proof by construction out of a proof by contradiction, or is there something I am not seeing?

Ch 9.3 pp 389 -- Rice's Theorem

What is the M-sub-L mentioned in the reduction proof for Rice's theorem?

I found it difficult to extract the structure of this proof. Here is my third attempt. I'm using the notation from my [ ../maths/logic_25_Proofs.html ] page.

Outline of proof
Let

1. (let)|- (1): P be a property of RE languages and non-trivial.
2. Lp::=language of TM codes M such that M decides P.
3. (1)|-P is a set of RE languages that is neither empty nor all of them,
4. (set theory)|-{} is in P or {} is not in P.
Case

1. (let)|- (1.1): {} is not in P.
2. (1.1)|-some L!={} is in P.
3. (1)|-L is RE,
4. (above)|-some TM accepts L,
5. (ei)|-Ml is a TM that accepts L.
6. Reduce Lu to Lp -- showing an algorithm that given (M,w), uses Ml and M, to construct a machine M' that accepts (M,w) if and only if M accepts w.
7. If Lp were decidable then, applied to M', it would decide Lu.
8. but Lu is undecidable,
9. (above)|-Lp is undecidable,
10. (above)|-P is an undecidable property

(End of Case)
Case

1. (let)|- (1.2): {} is in P
2. Consider the complement of P and use previous case
3. ...
4. (above)|-P is an undecidable property

(Close Case )

5. (above)|-in either case P is an undecidable property.

(Close Let )

1. (above)|-If P is a non-trivial property of RE languages then P is an undecidable property.
2. (above)|-nontrivial==>undecidable!

Notes:

3. Lu::=The universal Language of all (M,w) such that M accepts w, [ 10.html#9.2.3 The Universal Language 377 ]
(ei): Existential Instantiation lets us deduce from "some x(F(x))" that "F(v)" as long as "v" is a new variable. [ ../maths/logic_25_Proofs.html#ei_explained ]

Alternate Proofs of Rice's Theorem

[ Rice.html ]

Ch 7, ch9 pp 390-390 -- Example 9.12

This section re-iterates that Rice's theorem shows the undecidability of any problems where acceptance of a language for a particular TM is our goal. But questions about the states of a Turing machine might be decidable. Are any questions that involve a numeric component, like in Example 9.12, decidable?

Yes. Example 9.12 (does this machine have 5 states?) is easy to decide. It is like the question: does this C++ program "ttt.cpp" have 200 lines?

` 		wc -l ttt.cpp`

Contrast the above question with the question: Does ttt.cpp have any bugs?

Ch 9.3.4 pp 390 -- Example 9.12

"If a TM makes five moves, then it does so looking only at the nine cells of its tape surrounding its initial head position." Why nine?

Because in 5 moves the head reads at most 5 cells, each at most 4 steps from the starting point: it reads the starting cell, moves, reads a cell at distance at most 1, and so on, reading its 5th and last cell at distance at most 4. So the cells it can look at span 4 to the left + 4 to the right + the starting cell = 9 cells.
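The arithmetic above can be packaged as a throwaway check (the `cellsRead` helper is made up for illustration):

```java
public class Cells {
    // A head making k moves reads k cells; the i-th cell read (i = 1..k)
    // is at most i-1 positions from the start. So the readable cells span
    // positions -(k-1) .. +(k-1): 2*(k-1)+1 cells in all.
    static int cellsRead(int k) {
        return 2 * (k - 1) + 1;
    }

    public static void main(String[] args) {
        System.out.println(cellsRead(5)); // 9 cells for five moves
    }
}
```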

Exercise 9.3.4 (c) -- Context free languages

Not part of this course.

Exercise 9.3.7 (b) and (c) -- Hint

The inputs have either 2 or 3 machines. If you choose one or two of these cleverly then the resulting machine does something impossible.

Exercise 9.3.8 -- Not in this course

. . . . . . . . . ( end of section 9.3 Undecidable Problems About Turing Machines 383) <<Contents | End>>

. . . . . . . . . ( end of section 12) <<Contents | End>>

Next

Do not study sections 9.5.2 or 9.5.3 unless you really want to. We are not getting into grammar theory in this class. See [ 13.html ] for details.

Notes

(Ex): Do as many of the relevant exercises as you have time for. You may work in a team of up to 4 students and hand in one joint solution. Bring to class one written solution to an exercise. This must not be a solution to an exercise marked with an asterisk(*) to earn full credit. One of the authors will be invited to present the solution to the class -- failure will lose points. Students taking CS646 must hand in the solution to an exercise marked with an exclamation mark(!) to earn full credit.
(Study): Read & think about the assigned items. Submit one question by selecting the submit button at the top of the web page. To earn complete credit you need to do this at least 90 minutes before the start of class. Hints. Read each section twice -- once the easy bits and then the tough bits. Use a scratch pad and pencil as you read.
(Topic): To earn all the possible points you must: turn up on time, not leave early, present work when called upon, and take part in discussions.

Standard Definitions

1. FSA::="Finite State Acceptor/Automata", a theoretical machine with a finite set of states.
2. ND::="Non-deterministic", a machine that can try out many possibilities and always guesses the right choice. Note: For a given problem a ND solution is simpler and easier to understand than a deterministic one. We study ND machines to discover how to use them as high-level designs for more complicated deterministic machines. Plus it's fun.
3. PDA::="Push down Automata", a machine with a finite control and a single stack for remembering intermediate data. Kind of like half a TM.
4. RJB::=The author of this document, [ ../index.html ]
5. |-RJB="Richard J Botting, Comp Sci Dept, CSUSB".
6. Schedule::= See http://www.csci.csusb.edu/dick/cs546/schedule.html.
7. Search::= See http://www.csci.csusb.edu/dick/cs546/lookup.php.
8. Syllabus::= See http://www.csci.csusb.edu/dick/cs546/syllabus.html, and also see [ syllabus.html ] (my generic syllabus).
9. TBA::="To Be Announced".
10. TM::="Turing Machine".
11. NTM::="Nondeterministic Turing Machine".
12. nondeterministic::="A machine that can make choices as it runs and accepts an input iff any sequence of choices leads to acceptance."
13. DTM::="Deterministic Turing Machine".
14. runtime::=run_time,
15. run_time::="The number of steps a TM takes before it halts when started on a given input".
16. complexity::="The largest run_time for any word with a given length", usually expressed as an asymptotic (Big-O) formula.
17. P::@problem=class of problems that a DTM can compute in polynomial_time.
18. NP::@problem=class of problems that a NTM can compute in polynomial_time.
19. polynomial_time::=An algorithm/TM is said to halt in polynomial time if the number of steps given n items of data is O(n^c) for some constant c.

From Logic

20. LOGIC::= See http://www.csci.csusb.edu/dick/maths/logic_25_Proofs.html

21. (LOGIC)|- (ei): Existential instantiation -- if P is true of something then we can substitute a new variable for it.
22. (LOGIC)|- (eg): existential generalization -- if P is true of this thing then it is true of something!
23. (LOGIC)|- (given): a proposition that is assumed as part of a Let...End Let argument. Typically if A and B and C then D starts with assuming A,B, and C are given. They are called the hypotheses.
24. (LOGIC)|- (goal): a proposition that we wish to prove. Typically the part after the then in an if...then... result. This is called the conclusion.
25. (LOGIC)|- (def): By definition of a term defined above.
26. (LOGIC)|- (algebra): Either by using well known algebraic results, or by a series of algebraic steps.
27. (LOGIC)|- (RAA): Reduce to absurdity. The end of a Let/Po/Case/Net that establishes the negation of the last set of assumptions/hypotheses.