>> [CNS] >> [Comp Sci Dept] >> [R J Botting] >> [CS546/646] >> 07 [Text Version]
Session: [01] [02] [03] [04] [05] [06] (07) [08] [09] [10] [11] [12] [13] [14] [15] [16] [17] [18] [19] [20]
Thu Jun 8 13:02:52 PDT 2006

# Class Meeting 07 on Restricted Turing Machines

## Context

| Date | Meet'g | Study(2 pts) | Bring(5 pts) | Topic(5 pts) | Notes |
|------|--------|--------------|--------------|--------------|-------|
| Previous | 6 | Chapter 8 sections 8.3, 8.4 | Ex | Programming TM | |
| Fri Apr 21 | - | - | - | - | Last day to drop |
| Today | 7 | Chapter 8 sections 8.5, 8.6 | Ex | Restricted TM | |
| Next | 8 | Chapter 8 section 8.7+Minsky | Ex | The Halting Problem | Start Project |

(Minsky): Study my six page handout from Minsky's 1964 "Computation: Finite and Infinite Machines".

(Project): Working in a team of 3 or 4 people, design, code, and test a simple Turing Machine simulator.

• Notes
• You may use any language that can be demonstrated in class.
• You may choose any kind of user interface you like.
• Your TM simulator does not have to have infinite memory capacity like a real TM.
• Do the simplest thing that can possibly work.
• Consult with me in my office hours.
1. Process
1. Start by thinking about the design and developing tests for your code...
2. First deadline: bring a progress report to class and present it. Grading: pass/fail. Any running set of tests will pass.
3. Second deadline: bring a report on the final status, present it, and hand in a hard copy for grading.

[ projects.html ]

## 8.5 Restricted Turing Machines 345

### Ch 8.5 pp 352-353 -- Recursively Enumerable Languages

What is a recursively enumerable language?

The term dates back to the older theory of Recursive Functions, when a recursively enumerable language was a set of strings that could be generated by a recursive procedure.

This name is now given to languages that are accepted by some Turing Machine -- even if rejection sometimes happens by the machine never coming to a conclusion -- never terminating.

A recursive language was a language that could be recognized by a recursive function. These days it means a language recognized by a TM that either halts and rejects the string or accepts it (and halts).

One of the key results of the old recursive function theory (we'll research this later...) was that recursive definitions that did not have to terminate on all inputs were more powerful than those that always halted.

### Ch 8 pp 346 How do we simulate a two-way infinite tape on a semi-infinite tape?

I call this trick/technique folding the tape. There are two tracks, Upper and Lower, and we have a stored value in {U,L} indicating which track the head is scanning. The lower track is backwards!

If we had a two-way tape like this, with the head scanning cell a:

tape ... x y z a b c ...

then an equivalent one-sided TM looks like this:

upper a b c ...
lower * z y x
store U

If instead the head were scanning a cell left of the fold, say z, the tracks would be the same but the stored value would be L:

upper a b c ...
lower * z y x
store L

Also note the trick of pseudo-blanks.

### 8.5.2 Multistack Machines 348

Two stacks make a TM.

### Ch 8.5.2 pp 349 -- Multistack Machines

What's the difference between B the blank symbol and \$ the endmarker?

A TM doesn't need an "end of input" symbol because its input has no end... (but is mostly blank). The \$ end marker is only found after the end of the "real" data input to the acceptor. A blank might appear in the middle of TM data.

### Ch 8.5 pp 349 -- Multistack machines

How exactly does a multistack machine work? I understand it is like a PDA, but where does it exactly differ?

It can have more than one stack to store intermediate results. Plus it has operations that can pop data off one stack and push it onto another stack.

### Ch 8.5.2 pp 348 - 349 -- Multistack Machines Theorem 8.13

Could you explain Theorem 8.13 in more detail, and give an example using the theorem.

In class I ran some steps in which two stacks simulated a TM shown earlier in the book.

### Ch 8.5.3 pp 351 -- Counter Machines

Can you demonstrate counter machines using pictures, to give a better understanding?

In class I ran one of the exercise solutions, tracing the two counters:

| Input | i | j |
|-------|---|---|
| abc\$ | 0 | 0 |
| bc\$ | 1 | 1 |
| ... | | |

Think of counter machines as programs with unsigned long int variables. Such a program only uses statements like this:

` 		i++;`
` 		if(i!=0)i--; else exit(EXIT_FAILURE);`
Plus the ability to test two conditions for each variable:
` 		i==0`
` 		i!=0`

But remember these variables can get as big as we need.

### 8.5.3 And C/C++

3 or 4 counters make a TM.
• The machine's control unit is a program.
• It can be coded using labels for states and gotos.
• We only need 3 very long ints... plus
• operator ++
• operator --
• operator ==0

So here is a universal C program:-(

` int main(....){`
` VeryLongInt i,j,k,l;`
` q0://initial state`
` ...`

### Ch 8.5.3 pp 351 -- Counters

What happens if a program does try to subtract from a counter that is at 0?

I think that the acceptor halts and rejects the input string.

### 8.5.4 The Power of Counter Machines 352

You don't need to learn about CFL's for this course. They will not be mentioned in the final.

The trick of using the prime numbers to encode n-tuples into a single number is called Gödel numbering.

### 8.5.5 Exercises for Section 8.5 354

There is a misprint in part b that I found on the web site for the book. The correct language is:
1. { 0^n 1^m | m>=n>=1 }

Hint.

1. The solution for a is not on the web -- you can get credit for trying it.
2. Don't attempt the !! exercises.
3. Look at the solution to * c before you attempt b.

### Ch 8.5.1 c pp 354 -- Counter Machines

Would you please explain how to use two counters to describe the language represented by problem 8.5.1 c?

I used my solution to this (based on the web solution) as my simulation of a counter machine, above.

## 8.6 Turing Machines and Computers 355

### Ch 8.7 pp 362 -- Words of Fixed Maximum Length

Reading the book's explanation on why it allows words to grow by 1 bit at a time, I'm a little confused by why the "more liberal" approach simplifies their proof. Does forcing words to be a fixed maximum length actually reduce the asymptotic complexity of the simulation of a computer by a Turing machine, or is the complexity the same as their "more liberal" approach?

If the words don't grow then you have a finite state machine. Then all you have to worry about is the secondary storage: disks and tapes.

But the time the simulator requires depends on how much data is stored: TMs don't have random access. So the book works out a way that allows storage to grow -- but not too fast. In effect they are allowing for the fact that real computers take longer to multiply and divide numbers than to add, subtract, or copy them.

### Ch 8 pp 359 -- TM and simulation of a computer

Can you go over figure 8.22 and explain how a TM simulates a computer?

I did a lot of hand waving...

### Ch 8.6.2 pp 356-361 -- Are there things a computer can do that a Turing machine can't?

We had a fun debate: My summary: a TM is not a real machine so has none of the defects of real computers: breakdowns, abuse, pornography, upgrades, ... even tho' we can simulate a lot of these by adding a state. For example [blue screen of death].

### Ch 5, ch8 pp 361-362 -- Running times of computers and TM

I got a bit confused with this section. Is it possible for you to go over the running time differences?

Again: hand waving.

Fewer points if delivered after this message

. . . . . . . . . ( end of section Class Meeting 07 on Restricted Turing Machines) <<Contents | End>>

[ 08.html ]

# Standard Definitions

1. FSA::="Finite State Acceptor/Automata", a theoretical machine with a finite set of states.
2. ND::="Non-deterministic", a machine that can try out many possibilities and always guesses the right choice. Note: For a given problem a ND solution is simpler and easier to understand than a deterministic one. We study ND machines to discover how to use them as high-level designs for more complicated deterministic machines. Plus it's fun.
3. PDA::="Push down Automata", a machine with a finite control and a single stack for remembering intermediate data. Kind of like half a TM.
4. RJB::=The author of this document, [ ../index.html ]
5. |-RJB="Richard J Botting, Comp Sci Dept, CSUSB".
6. Schedule::= See http://www.csci.csusb.edu/dick/cs546/schedule.html.
7. Search::= See http://www.csci.csusb.edu/dick/cs546/lookup.php.
8. Syllabus::= See http://www.csci.csusb.edu/dick/cs546/syllabus.html, and also see [ syllabus.html ] (my generic syllabus).
9. TBA::="To Be Announced".
10. TM::="Turing Machine".
11. NTM::="Nondeterministic Turing Machine".
12. nondeterministic::="A machine that can make choices as it runs and accepts an input iff any sequence of choices leads to acceptance."
13. DTM::="Deterministic Turing Machine".
14. runtime::=run_time.
15. run_time::="The number of steps a TM takes before it halts when started on a given input".
16. complexity::="The largest run_time for any word with a given length", usually expressed as an asymptotic (Big-O) formula.
17. P::@problem=class of problems that a DTM can compute in polynomial_time.
18. NP::@problem=class of problems that a NTM can compute in polynomial_time.
19. polynomial_time::=An algorithm/TM is said to halt in polynomial time if the number of steps given n items of data is O(n^c) for some constant c.

# From Logic

20. LOGIC::= See http://www.csci.csusb.edu/dick/maths/logic_25_Proofs.html

21. (LOGIC)|- (ei): Existential instantiation -- if P is true of something then we can substitute a new variable for it.
22. (LOGIC)|- (eg): existential generalization -- if P is true of this thing then it is true of something!
23. (LOGIC)|- (given): a proposition that is assumed as part of a Let...End Let argument. Typically if A and B and C then D starts with assuming A,B, and C are given. They are called the hypotheses.
24. (LOGIC)|- (goal): a proposition that we wish to prove. Typically the part after the then in an if...then... result. This is called the conclusion.
25. (LOGIC)|- (def): By definition of a term defined above.
26. (LOGIC)|- (algebra): Either by using well known algebraic results, or by use a series of algebraic steps.
27. (LOGIC)|- (RAA): Reduce to absurdity. The end of a Let/Po/Case/Net that establishes the negation of the last set of assumptions/hypotheses.