Why are there so many complexity classes?
Some seem very similar to others. Could you please explain again why there are so many complexity classes?
Which of the problem classes are the most important to remember?
I think the probabilistic ones RP and ZPP are probably the least valuable and
R, RE, P, NP, NP-complete the most critical.
What might we expect for final questions on chapters 1 and 2?
Theory based, calculation based, both?
Not much.... Chapter 1 is used throughout the rest of the course and final...
Chapter 2: you need to know the definitions of the various kinds of
automata and their languages -- and how they differ from TM's and their
languages. Also you may have to use the definitions in an informal proof.
But the real focus in this class is on TMs.
In Q.1 of the Final Exam Review Sheet, TM and Languages, what is "Languages" referring to ?
In complexity theory the main purpose of machines is to recognise languages.
So for each type of machine a language is defined -- typically by the
machine entering a particular set of states.
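For example, here is a minimal sketch (in Python, with made-up names) of a finite automaton whose language is defined by which state it ends in:

```python
# Hypothetical sketch: a DFA whose language is the set of strings over
# {0,1} containing an even number of 1s. The machine's language is
# defined by its set of accepting states -- here, just {"even"}.
def accepts(s):
    state = "even"                     # start state
    for ch in s:
        if ch == "1":                  # each 1 flips the parity
            state = "odd" if state == "even" else "even"
    return state == "even"             # accept iff we end in an accepting state

print(accepts("1010"))  # True: two 1s
print(accepts("111"))   # False: three 1s
```

The language of this machine is exactly the set of strings it drives into the accepting state -- the same idea, with more machinery, defines the language of a TM.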
Could you give a couple simple examples of formally written 'Induction' and 'Reduction' proofs?
First a distinction: the two have similar names but differ in all other respects.
Induction is used to get a handle on infinite but countable situations: numbers,
strings, trees, .... Things that grow one piece at a time. The plan is
to establish the truth of a few simple cases and then demonstrate that as we
grow the complex objects the property remains true.
Example of Induction
All my Lego blocks are Green. So all I can construct are green objects.
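Written out formally, the Lego claim looks like this (a sketch, where "object" means anything buildable from the blocks):

```latex
\textbf{Claim.} Every object built from my Lego blocks is green.

\textbf{Proof} (by induction on the number of blocks $n$).
\emph{Base case} ($n = 1$): an object made of one block is a single Lego
block, and all my blocks are green.
\emph{Inductive step}: assume every object of $n$ blocks is green. An
object of $n+1$ blocks is an object of $n$ blocks (green, by the
inductive hypothesis) with one more block (green, by assumption)
attached. Attaching a green block to a green object yields a green
object. $\qed$
```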
Reduction is only used in the theory of computation. You already know that
some problem P1 is difficult/impossible/hard and want to show that a new
problem P2 is at least as bad. The plan of attack is to show
that if we can solve P2 then it is easy to solve P1.... but P1 is hard/impossible,
so P2 must be too.
Example of reduction -- this comes from the paper I'm reviewing above:
reducing the existence problem for an agent to
the construction problem for an agent.
The problem of constructing an agent must be at least as hard as checking
for the existence of an agent, because construction proves existence.
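The shape of that argument can be sketched in Python (all names here are hypothetical, and the "agent" is a toy stand-in problem; the point is only the pattern "solve P2, get P1 for free"):

```python
# Hypothetical sketch of a reduction: if we could solve the (hard)
# construction problem P2, the existence problem P1 comes for free.

def construct_agent(spec):
    # P2: stand-in for a solver that builds an agent meeting `spec`,
    # or returns None if none exists. Toy version: an "agent" is an
    # integer whose square equals spec.
    for candidate in range(spec + 1):
        if candidate * candidate == spec:
            return candidate
    return None

def agent_exists(spec):
    # P1 reduces to P2: existence follows directly from construction,
    # so P2 must be at least as hard as P1.
    return construct_agent(spec) is not None

print(agent_exists(16))  # True  (4 * 4 == 16)
print(agent_exists(10))  # False (no integer square root)
```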
Can you go over an example of a NP-Complete problem?
TSP? Vertex Cover?
Can you go over the reduction of NP-Complete problems?
- Minesweeper is NP
- Any instance of SAT can be expressed as a Minesweeper game!
- Encode boolean variables as a pair of unknown squares surrounded by 1s.
- Encode 'wires' connecting different occurrences of variables.
- Encode a NOT gate...
- Encode an AND gate
- Therefore can encode any boolean expression.
- Choosing the correct squares in each pair expresses SAT.
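The reason SAT (and hence Minesweeper, via the encoding above) sits in NP is that a proposed solution is easy to check. A minimal sketch of such a polynomial-time verifier (representation and names are my own, not from the text):

```python
# Hypothetical sketch: verify a SAT certificate in polynomial time.
# A formula is a list of clauses; a clause is a list of literals;
# literal +i means variable i, literal -i means its negation.
def verify(formula, assignment):
    # assignment maps each variable number to True/False
    for clause in formula:
        if not any(assignment[abs(lit)] == (lit > 0) for lit in clause):
            return False        # some clause has no true literal
    return True                 # every clause is satisfied

# (x1 or not x2) and (x2 or x3)
formula = [[1, -2], [2, 3]]
print(verify(formula, {1: True, 2: False, 3: True}))    # True
print(verify(formula, {1: False, 2: False, 3: False}))  # False
```

Checking a certificate takes one pass over the formula; the hard part, of course, is finding the certificate.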
Could you please go over the language L[d] again? From my understanding it is the set of strings that won't make valid TMs. However, there are several proofs that use a TM for L[d]?
L[d] is the "Diagonalization Language" defined in section 9.1.3 (pp370-372)
which plays a critical part in section 9.1.4 and the proof of 9.2.
It is not the set of strings that don't make TMs.... but something more
subtle. It is the language made up of strings that describe TMs with
a special (weird) property: they refuse to accept their own description
as an input string.
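Here is the diagonalization in miniature (Python, with a made-up finite table standing in for the infinite one in section 9.1.3): row i records whether machine M_i accepts string w_j, and L_d flips the diagonal.

```python
# Hypothetical finite snapshot of the diagonal table from section 9.1.3:
# accepts[i][j] == True means machine M_i accepts string w_j.
accepts = [
    [True,  False, True ],   # M_0 accepts its own description w_0
    [True,  True,  False],   # M_1 accepts its own description w_1
    [False, False, False],   # M_2 rejects its own description w_2
]

# L_d contains w_i exactly when M_i does NOT accept its own description,
# i.e. we flip the diagonal of the table.
L_d = [i for i in range(len(accepts)) if not accepts[i][i]]
print(L_d)  # [2]: only M_2 refuses its own description
```

Because L_d disagrees with every row of the table on the diagonal, no machine in the list can recognise it -- that is the heart of the proof.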
Perhaps L_d should be called "the Knockturn Alley" language?
Here is an analogy: suppose I founded a new library that contains
nothing but library catalogs -- it might well include on its shelves
its own catalog. So of course we would have to include the library's
catalog in its own catalog. In fact we could imagine collecting a list
of libraries that have put their own catalog on their shelves. And perhaps,
more interestingly, put on an exhibit of catalogs of libraries
that don't have their own catalog listed in their catalog.
The problem starts if we decide we want a library of catalogs that
don't list their own catalogs.... do we list this library's catalog