
Contents


    Programming and Extending Turing Machines

      Context

      Date        Meet'g  Study (2 pts)                Bring (5 pts)  Topic (5 pts)       Notes
      Previous    5       Chapter 8 sections 8.1, 8.2  Ex             Turing Machines TM
      Today       6       Chapter 8 sections 8.3, 8.4  Ex             Programming TM
      Fri Apr 21  -       -                            -              -                   Last day to drop
      Next        7       Chapter 8 sections 8.5, 8.6  Ex             Restricted TM

      8.3 Programming Techniques for Turing Machines 329

        8.3.1 Storage in the State 330

        Ch 8.3.1 pp 330-331 -- Storage in the State

        I'm terribly confused about how this process works. Can you explain in detail why the transition function is defined as it is?

        A Turing machine has a finite number of states in its CPU. However, the number of states need not be small (like 6). Like a Pentium chip, it can store values internally, as long as there is only a finite number of them. For example, all real computers have registers, but there is only a fixed number of them, AND each register can only hold a fixed (and finite) number of bits.

        Example: a simple computer with a control unit holding a PC (program counter) with a 16-bit address, plus an arithmetic unit holding one double-length 64-bit Arithmetic Register. This stores 80 bits of information, plus a small internal control state: {fetch, decode, compute, store, next} perhaps. So we have a finite state machine with 5*2^80 states!

        It is sad that we don't have a high-level language for TMs; if we did, we could write statements like

      1. x = tape.head(); perhaps.

        Suppose that we wanted to code (in C++) a TM with states 'q0', 'q1', ... plus storage 'x'. We might write code like this:

         q0: x = tape.head(); tape.move_right(); goto q1;
         q1: if (tape.head() == B) return FAILS;
             if (tape.head() == x) { tape.move_right(); goto q1; }
         ...
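
        To make this concrete, here is a minimal runnable sketch along the same lines -- my own code, not the book's -- using a symbol stored "in the state" (the variable x) to accept the language of Example 8.6, 01* + 10*: remember the first symbol and reject if it ever reappears. The tape is just a string plus an index, and the input is assumed nonempty.

          #include <iostream>
          #include <string>

          const char B = ' ';                     // blank

          // Accept 01* + 10*: store the first symbol "in the state" (x)
          // and reject if it is ever seen again.
          bool accepts(std::string tape) {
              tape += B;                          // blank after the input
              std::size_t head = 0;
              char x;                             // storage in the CPU
          q0: x = tape[head]; ++head; goto q1;    // remember first symbol, move right
          q1: if (tape[head] == B) return true;   // reached the blank: accept
              if (tape[head] == x) return false;  // stored symbol seen again: reject
              ++head; goto q1;                    // keep scanning right
          }

          int main() {
              std::cout << accepts("0111") << accepts("0110") << "\n";  // prints 10
          }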

        Ch 8.3 pp 330-331 -- Example 8.6

        I found the idea of storage in states very confusing. Could you go over this kind of problem? Also, is it helpful to present the transition function in table format, as was done in sections 8.1 and 8.2, for this example? What would the table be in this case?

        A typical transition table when data is being stored has entries like this:

        State    Read  Next    Write  Move
        (q0,B)   1     (q1,1)  1      R
        (q0,B)   0     (q1,0)  0      R

        We might interpret this as part of a step that stores the symbol read from the tape in the CPU. Note: the book uses "[]" where I use "()".

        Tabulating the 6 states >< 3 symbols on page 331 is not difficult and may help.

        But suppose you have a CPU that has 6 binary flags stored in it plus 3 control states. Now you have 3*64 = 192 states to write down...

        Tabulating all possible states is not very helpful for machines with structured or compound states.... unless you allow variables and conditions to appear in the state descriptions AND expressions in the body of the table.

        The other thing missing from the table is the documentation that explains what each transition means.

        8.3.2 Multiple Tracks 331

        Ch 8 pp 331 -- Multitrack Turing Machine

        Can you explain the multiple-track idea and its relation to the Turing Machine? And maybe show an example in class?

        Key idea: \Gamma is the Cartesian product of a finite number of finite sets. (Cartesian product works like a struct in C/C++)

        For example: computer tape storage is multi-track tape with something like 8 or 9 bits in each cell.
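
        To make the struct analogy concrete, here is a minimal C++ sketch; the type and field names are my own and purely illustrative:

          #include <iostream>

          // A two-track tape symbol as a struct: track 1 carries a mark,
          // track 2 a bit. \Gamma is the Cartesian product
          // {*, B} >< {0, 1} -- four symbols in all.
          struct Symbol2 {
              char mark;   // '*' (marked) or 'B' (blank) on track 1
              char bit;    // '0' or '1' on track 2
          };

          int main() {
              Symbol2 cell{'*', '1'};             // the tuple (*,1)
              std::cout << "(" << cell.mark << "," << cell.bit << ")\n";
          }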

        You can recognise a multi-track machine by looking at the transitions because each will have tuples as the read and write symbols.

         		(*,1)/(B,0)->
         		(*,0)/(B,1)->
         		(B,0)/(B,0)->
         		(B,1)/(B,1)->
        Suppose that all of the above are on a loop from q to q in a diagram; then we have these entries in the table:

        State  Read   Next  Write  Move
        q      (*,1)  q     (B,0)  R
        q      (*,0)  q     (B,1)  R
        q      (B,1)  q     (B,1)  R
        q      (B,0)  q     (B,0)  R

        Both diagram and table describe the same behaviour: each marked (*) bit is flipped and its mark erased, while unmarked bits pass through unchanged.

        Exercise: complete the \delta descriptions that fit with the above:

        • ...
        • \delta(q, (*,i)) = (___________,_________, R),
        • \delta(q, (_____,i)) = (B,i, R),
        • ...

        Note the book is using "[]" in place of '()' for tuples. Other books use '<>'!

        Error on page 332

        The description of \Sigma does not mention the encoding of "c" as [B,c].

        Ch 8.3.2 pp 331-333 -- Multiple Tracks

        In Example 8.7, I had trouble understanding the transition functions and the role of each part of the function.

        Can you explain, what each part of the transition function is supposed to mean or do?

        8.3.3 Subroutines 333

        Ch 8 pp 335 -- Subroutines

        Would you please explain a little bit more on figure 8.14 and figure 8.15, on how the subroutine Copy works?
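
        Here is a rough C++ sketch of the general idea behind a TM subroutine -- my own code, not the book's Figure 8.14: a subroutine is a TM fragment with its own start state and a "return" state, entered by a jump and left by another. This Copy-like fragment copies a block of 0s past the following 1, marking each copied 0 with 'x' so none is copied twice; restoring the x's to 0s is left to the caller as a simplification.

          #include <iostream>
          #include <string>

          struct Tape {
              std::string cells;
              std::size_t head = 0;
              char read() const { return cells[head]; }
              void write(char c) { cells[head] = c; }
              void left()  { --head; }
              void right() { ++head; if (head == cells.size()) cells += 'B'; }
          };

          // Copy subroutine: on entry the head is on the leftmost uncopied 0
          // of a block "0...01...". Each 0 is marked 'x' and a copy of it is
          // deposited in the first blank after the block.
          void copy_zeros(Tape& t) {
          copy1: if (t.read() == '0') { t.write('x'); t.right(); goto copy2; }
                 return;                      // no uncopied 0 left: "return" state
          copy2: if (t.read() != 'B') { t.right(); goto copy2; }
                 t.write('0');                // deposit one copied 0
          copy3: t.left();                    // run back to the marked cell
                 if (t.read() != 'x') goto copy3;
                 t.right();                   // head on the next uncopied 0
                 goto copy1;
          }

          int main() {
              Tape t{"0001"};
              copy_zeros(t);
              std::cout << t.cells << "\n";   // prints xxx1000
          }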

        8.3.4 Exercises for Section 8.3 334

      8.4 Extensions to the Basic Turing Machine 336

        8.4.1 Multitape Turing Machines 336

        Note: try not to confuse a multi-tape machine with a multi-track single tape machine!

        You can recognize a multitape machine by its table of transitions. Here is a piece of a 2-tape machine
        State  Symbol1  Symbol2  New1  New2  Move1  Move2
        q0     a        b        c     d     L      R
        ...
        This means that in state q0, if the first head sees an "a" and the second head sees a "b", then the machine writes "c" on tape 1 and moves that head left, and writes "d" on tape 2 and moves that head right. Notice that two symbols are read, two are written, and there are two moves.

        The transition would be something like:

         		ab/cd<-->
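
        In code, one row of such a table might be represented like this -- a sketch in my own notation. Note that I have added a hypothetical next state "q1": the table fragment above omits that column, but every transition also names the state to enter.

          #include <iostream>
          #include <string>

          enum Move { L, R };

          // One row of the 2-tape transition table above, plus a next state.
          struct Transition2 {
              std::string state, next;
              char read1, read2;        // symbols under heads 1 and 2
              char write1, write2;      // symbols written back
              Move move1, move2;        // each head moves independently
          };

          int main() {
              Transition2 row{"q0", "q1", 'a', 'b', 'c', 'd', L, R};
              std::cout << row.state << ": " << row.read1 << row.read2 << " / "
                        << row.write1 << row.write2 << " -> " << row.next << "\n";
          }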

        Ch 8 pp 335 -- Figures 8.15 8.16 8.17

        Could you please explain the diagrams step by step: Figure 8.15, and Figure 8.16 or 8.17?

        Ch 8.4.1 pp 336 -- Multitape Turing Machines

        Why use multi-tape Turing Machines if they are equivalent to single tape Turing Machines?

        They are easier to program.... and they run faster. (The one-tape simulation in 8.4.3 below can square the number of steps.)

        8.4.2 Equivalence of One-Tape and Multitape TM's 337

        8.4.3 Running Time and the Many-Tapes-to-One Construction 339

        Ch 8.4.3 pp 339-340 -- Introduction to Turing Machines

        Could you go over how Many-Tapes-to-One Construction is used in the time complexity of a TM? This wasn't clear to me in the chapter.
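
        Here is the counting argument in outline -- my own summary, so check it against the book's treatment in 8.4.3 (Theorem 8.10). After N moves, no head of the multitape machine is more than N cells from where it started, so the one-tape simulator only ever works in a region of at most 2N+1 cells. Simulating move i takes a sweep up and down that region, costing at most c i steps for some constant c that depends on the machine. So simulating N moves costs at most

         		\sum_{i=1}^{N} c i = c N(N+1)/2 = O(N^2)

        steps: one tape costs at most the square of the multitape running time.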

        8.4.4 Nondeterministic Turing Machines 340

        Ch 8 pp 340 -- Nondeterministic Turing Machine

        Can you clarify the difference between a nondeterministic Turing Machine and a deterministic one?

        Ch 8.4 -- NTM

        How does a nondeterministic Turing machine differ from a deterministic Turing machine?

        Group discussion: what makes a TM non-deterministic?

          The machine has choices in some states with some symbols: there is at least one pair in state><head (a state plus scanned symbol) with two (or more) possible transitions. This spawns extra (hypothetical) threads. In the diagram, several arrows leave the same state labeled with the same read symbol.

        As a result we can't trace a sequence of steps like this

         		[q0]000 |- 1[q0]00 |- 11[q0]0 |-
        because the TM can branch into several possible paths "at once".

        One way to handle this is to draw a tree. This can take a lot of paper but works well for humans. For example, here is a tree of IDs generated by the NTM in the exercises for this section:

        [Tree of IDs from NTM in exercises]

        Another approach is embedded in the proof of Theorem 8.11.

        Ch 8.4 pp 341 -- Nondeterministic Turing Machines

        Please explain the proof of Theorem 8.11 on Nondeterministic Turing Machines?

        As an introduction to the proof it is worth doing Exercise 8.4.2 using a list of the IDs reachable from the initial ID. For each ID in turn, add its set of next IDs at the end of the list and cross off the ID itself at the front of the list.
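
        That list is just a queue, so the search is breadth-first -- exactly what the modified project below asks for. Here is a minimal runnable C++ sketch of the idea -- my own code, with a made-up toy NTM wired in, not the book's construction for Theorem 8.11:

          #include <iostream>
          #include <map>
          #include <queue>
          #include <set>
          #include <string>
          #include <tuple>
          #include <vector>

          const char BLANK = 'B';

          struct ID {                          // instantaneous description
              std::string tape;
              int head;                        // index of the scanned cell
              std::string state;
              bool operator<(const ID& o) const {
                  return std::tie(tape, head, state)
                       < std::tie(o.tape, o.head, o.state);
              }
          };

          // delta: (state, symbol) -> choices of (next state, written symbol, move)
          using Rule = std::tuple<std::string, char, char>;
          std::map<std::pair<std::string, char>, std::vector<Rule>> delta;

          std::vector<ID> next_ids(const ID& id) {
              std::vector<ID> out;
              auto it = delta.find({id.state, id.tape[id.head]});
              if (it == delta.end()) return out;      // this branch dies
              for (const auto& [q, w, m] : it->second) {
                  ID n = id;
                  n.tape[n.head] = w;
                  n.state = q;
                  n.head += (m == 'R') ? 1 : -1;
                  if (n.head < 0) { n.tape.insert(0, 1, BLANK); n.head = 0; }
                  if (n.head == (int)n.tape.size()) n.tape += BLANK;
                  out.push_back(n);
              }
              return out;
          }

          bool accepts(const ID& start, const std::string& accept_state) {
              std::queue<ID> work;             // the list of IDs, used as a queue
              std::set<ID> seen;               // never expand the same ID twice
              work.push(start);
              seen.insert(start);
              while (!work.empty()) {          // WARNING: loops forever if infinitely
                  ID id = work.front();        // many IDs are reachable and none accepts
                  work.pop();                  // "cross off" the ID at the front
                  if (id.state == accept_state) return true;
                  for (const ID& n : next_ids(id))   // append successors at the back
                      if (seen.insert(n).second)
                          work.push(n);
              }
              return false;
          }

          int main() {
              // A made-up toy NTM: in q0 on '0' it may move right staying in q0,
              // OR guess that this is the last 0 and accept by entering qf.
              delta[{"q0", '0'}] = {{"q0", '0', 'R'}, {"qf", '0', 'R'}};
              ID start{std::string("000") + BLANK, 0, "q0"};
              std::cout << (accepts(start, "qf") ? "accept" : "reject") << "\n";
          }

        Note the warning in the loop: if infinitely many IDs are reachable and none of them accepts, the search runs forever. In general, NTM acceptance is only semi-decidable.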

        In class we traced several of the IDs that could come from the NTM in the dreaded Exercise 8.4.4.

        Ch 8.4.4 -- Theorem 8.11

        This states that we can always construct a deterministic Turing machine equivalent to an NTM. The example shown also states that the deterministic TM constructed from an NTM may take exponentially more time. Is that always the case?

        Excellent question. We will spend several weeks looking at that question and what the answer might be.

        8.4.5 Exercises for Section 8.4 342

        Ch 8.4 pp 342 -- Exercise 8.4.2

        In Exercise 8.4.2, the indicated accepting state is "q-2". Do they just mean state q2 and the dash was a typo or is there some extra meaning to "q-2"?

        The dash is missing in my copy. I think it is a typo.

        Ch 8.4.5 Exercise 8.4.4

        Did anyone get anywhere with this?

        2006-04-20 Thu Apr 20 13:04 Latest on Exercise 8.4.4

        Having traced a dozen cases and drawn some trees, and with your help and advice.... my best guess for the language accepted is something like 0 0* ( 1 0 0* )* -- all strings that start with a zero and never have two consecutive 1's.

        Ch 8.4.10 pp 345 -- Finiteness

        Based on the boxed note on p339, an infinite number of accepted inputs can exist for a given Turing machine, and therefore the set of possible positions for the head is infinite. But for this problem, the only approach I can see involves mapping the positions of the 2-dimensional tape onto the positions of the 1-dimensional tape (exploiting set cardinality), and the boxed note invalidates that approach. So I have no idea how to approach this problem. Could you go over how to do it?

        This is a "!!" problem and so beyond the scope of this class. If we have time, I can sketch out my way to tackle it in class or if no time in my office hour. I recall Michael Arbib proving this in the late 1960's.

        Ch 8.3+4 -- Class 06 deadline

        Fewer points if delivered after this message.

      Modification to Project Specification

      I've got a deterministic TM emulator. There are several on the web. What I'd like is one that handles nondeterministic TMs. So, to earn full credit for the project, the final version should handle nondeterministic TMs by using a queue and breadth-first search.

    . . . . . . . . . ( end of section Programming and Extending Turing Machines) <<Contents | End>>

    Next -- Restricting Turing Machines OR what really counts

    See [ 07.html ] and look for the hint on the exercises.

    Standard Definitions

  1. FSA::="Finite State Acceptor/Automata", a theoretical machine with a finite set of states.
  2. ND::="Non-deterministic", a machine that can try out many possibilities and always guesses the right choice. Note: For a given problem a ND solution is simpler and easier to understand than a deterministic one. We study ND machines to discover how to use them as high-level designs for more complicated deterministic machines. Plus it's fun.
  3. PDA::="Push down Automata", a machine with a finite control and a single stack for remembering intermediate data. Kind of like half a TM.
  4. RJB::=The author of this document, [ ../index.html ]
  5. |-RJB="Richard J Botting, Comp Sci Dept, CSUSB".
  6. Schedule::= See http://www.csci.csusb.edu/dick/cs546/schedule.html.
  7. Search::= See http://www.csci.csusb.edu/dick/cs546/lookup.php.
  8. Syllabus::= See http://www.csci.csusb.edu/dick/cs546/syllabus.html, and also see [ syllabus.html ] (my generic syllabus).
  9. TBA::="To Be Announced".
  10. TM::="Turing Machine".
  11. NTM::="Nondeterministic Turing Machine".
  12. nondeterministic::="A machine that can make choices as it runs and accepts an input iff any sequence of choices leads to acceptance."
  13. DTM::="Deterministic Turing Machine".
  14. runtime::=run_time,
  15. run_time::="The number of steps a TM takes before it halts when started on a given input".
  16. complexity::="The largest run_time for any word with a given length", usually expressed as an asymptotic (Big-O) formula.
  17. P::@problem=class of problems that a DTM can compute in polynomial_time.
  18. NP::@problem=class of problems that an NTM can compute in polynomial_time.
  19. polynomial_time::=An algorithm/TM is said to halt in polynomial time if the number of steps given n items of data is O(n^c) for some constant c.


End