>> [CNS] >> [Comp Sci Dept] >> [R J Botting] >> [CS546/646] >> 02 [Text Version]
Thu Jun 8 13:02:47 PDT 2006

# Session 02 Math, Method, & Madness

## Section 1.2 Proof

### Ch 1 pp 1-35 : Question for Ch. 1

Ch. 1 mentions many methods for constructing proofs. Are there any other methods of proof construction that are not covered in the text?

The book doesn't discuss the use of algebra. This is because it is not that useful in most of automata and computability theory.

Most disciplines have their own special techniques and short cuts. For example: "Proof by Intimidation" is used in many human endeavors. "The Proof of the Pudding" may be the most popular method outside of academe.

Here is a list of bad proofs that all odd numbers are prime.

Mathematician: 1 is prime, 3 is prime, 5 is prime, 7 is prime, therefore, by induction, all odd numbers are prime.

Physicist: 1 is prime, 3 is prime, 5 is prime, 7 is prime, 9 is a bad data point, 11 is prime, 13 is prime...

Engineer: 1 is prime, 3 is prime, 5 is prime, 7 is prime, 9 is approximately prime, 11 is prime, 13 is prime...

Lawyer: 1 is prime, 3 is prime, 5 is prime, 7 is prime, 9 is lying, 11 is prime, 13 is prime...

Computer Scientist: 1 is prime, 1 is prime, 1 is prime, 1 is prime, 1 is prime, ...

With thanks to [ Smart.html ] (for more math humor try [ jokes.html ] ).

This book has enough methods for this class -- I hope.

### Ch 1 pp 1-35 : Proof Methods

What is the best way to choose a proof method?

Look for patterns in the statement of the problem.

Experience helps.

### Ch 1 pp 1-35 : Proof Methods

Is the shortest possible proof always the best?

Usually -- if you can find it.

There is one catch: the level of detail, rigor, and formality depends very much on your audience. In this class your audience is me and your colleagues. The book gives a good picture of the level of rigor, detail, and formality I expect in this class (in some courses I am much more informal, and in others more formal).

### Ch 1 pp 11 : Statements With Qualifiers

Surely you mean
` 			QuaNTifiers`

### Ch 1 pp 11 : Statements With Quantifiers

How are "for all" and "there exists" quantifiers used in proofs? How and when are they used in each statement of a proof?

If you have a statement that "something exists with property P" there is only one deduction you can validly make. Since you can not be sure which thing is the "something" you must give it a name by using an unused (free) variable. So for example, given

` 		for some x, x is a distributed widget.`
you can validly deduce
` 		v is a distributed widget`
as long as "v" is a free variable at that point in the argument.

But you can not argue

` 		3.14159 is a distributed widget.`
or use any other constant, formula, name, or bound variable in place of x.

Universal quantifiers let you substitute any expression, variable, or value (of the right type) for the unknown. So if we have deduced

` 		for all widgets x, if x is round then x is distributed.`
and you know that fubar is a widget then you can deduce
` 		if fubar is round then fubar is distributed.`

In my experience, the freedom to substitute anything for a universally quantified variable means you often need inspiration or luck to substitute the value/expression that proves the result you want.
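Over a finite domain these substitution rules can be spot-checked directly. Here is a minimal Python sketch; the widget universe and the `is_round`/`is_distributed` predicates are invented for illustration:

```python
# A tiny finite "universe" of widgets, made up for this example.
widgets = {"fubar": {"round": True,  "distributed": True},
           "baz":   {"round": False, "distributed": False}}

def is_round(w):       return widgets[w]["round"]
def is_distributed(w): return widgets[w]["distributed"]

# Universal statement: for all widgets x, if x is round then x is distributed.
forall_holds = all((not is_round(x)) or is_distributed(x) for x in widgets)
assert forall_holds

# Universal instantiation: since the statement holds, we may substitute
# any particular widget (here "fubar") for x.
assert (not is_round("fubar")) or is_distributed("fubar")
```

The `all(...)` call plays the role of the universal quantifier; substituting `"fubar"` for `x` is exactly the instantiation step described above.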

Proving "for all": I use a block structure that starts by declaring a new variable, then deduce the result I want and then close the proof with the claim of success:

To prove that "all round widgets are distributed":
Let

1. x be a round widget
2. ...
3. ...
4. (above)|-x is distributed.

(Close Let )

1. (above)|-all round widgets are distributed.

Sometimes an expanded form is used:
Let

1. x be a widget
2. (let)|-x is round.
3. ...
4. ...
5. (above)|-x is distributed.

(Close Let )

Proving a "for some" -- you have two approaches. (1) find an example that fits. Or (2) prove by contradiction:
Let

1. not for some x, foo(x),
2. (above)|-for all x, not foo(x),
3. ...
4. (above)|-a contradiction.

(Close Let )

5. (RAA)|-for some x, foo(x).

For more on this see [ ../maths/logic_2_Proofs.html#For all and some ]
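Over a finite domain the two approaches correspond to Python's `any` and `all`: "for some x, foo(x)" is exactly the negation of "for all x, not foo(x)". A sketch, where `foo` and the domain are made up for illustration:

```python
domain = range(10)

def foo(x):
    return x % 7 == 3  # an arbitrary property, invented for illustration

# Approach (1): exhibit an example that fits.
witness = next(x for x in domain if foo(x))
assert foo(witness)

# Approach (2), by contradiction: assuming "not (for some x, foo(x))"
# is the same as asserting "for all x, not foo(x)"; showing that this
# fails establishes the existential statement.
assert not all(not foo(x) for x in domain)
assert any(foo(x) for x in domain)
```

The equivalence `any(P) == not all(not P)` is the finite analogue of the quantifier duality used in the contradiction proof above.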

### Ch 1 pp 14-15 : if-then statements

There are four combinations of "If H then C":
(if_then):
1. H=True, C=True
2. H=True, C=False
3. H=False, C=True
4. H=False, C=False

According to the book, #1 evaluates to true and #2 evaluates to false, which both make sense to me. Can you explain why #3 and #4 evaluate to true as well?
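One standard way to see it: "if H then C" is defined to be false only in case #2, so it is equivalent to "(not H) or C". When H is false the statement promises nothing, so it cannot be broken and counts as (vacuously) true. A quick Python check of all four rows:

```python
def implies(h, c):
    # "if h then c" is false only when h is true and c is false
    return (not h) or c

rows = [(True, True), (True, False), (False, True), (False, False)]
for h, c in rows:
    print(h, c, implies(h, c))

# Only row #2 (True, False) is False; rows #3 and #4 are vacuously True.
assert [implies(h, c) for h, c in rows] == [True, False, True, True]
```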

### Ch 1 pp 6-8 : Deductive Proofs

What is a deductive proof? I do not understand the difference between a deductive proof and a proof by induction.

A deductive proof is a series of steps where each proposition is derived by more or less rigorous rules from one or more previous steps. I have tabulated a set of such steps [ ../maths/logic_25_Proofs.html ] as part of my research on expressing rigorous math in ASCII and HTML.

I document steps like this

3. (evidence)|- (label): statement.

The evidence is a list of previous labels.

Proof by induction is a special form of proof that works with integers and strings. It always has two parts: a basis and an inductive step:

1. Prove the basis for n = 1.
2. Prove the step: if it is true of n then it is true of n + 1.
Let
1. (induction_hypothesis): it is true of n.
2. derive the truth of n + 1.

(Close Let )

See [ Section 1.4 Induction ] below.
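The basis/step structure can also be spot-checked mechanically: verify the basis directly, then test the step for a range of n. A sketch using the familiar claim 1 + 2 + ... + n = n(n+1)/2 (this example is mine, not the book's):

```python
def S(n):
    # The claim being proved: 1 + 2 + ... + n == n*(n+1)/2
    return sum(range(1, n + 1)) == n * (n + 1) // 2

# Basis: the claim holds for n = 1.
assert S(1)

# Step (spot-checked here, proved algebraically on paper):
# sum(1..n+1) = sum(1..n) + (n+1) = n(n+1)/2 + (n+1) = (n+1)(n+2)/2,
# which is the formula with n replaced by n + 1.
for n in range(1, 100):
    assert not S(n) or S(n + 1)   # "if S(n) then S(n+1)"
```

Of course the loop only checks finitely many cases; the induction principle is what lets the algebraic step cover all n at once.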

### 1.2.3 If-then and iff

There is a variation on deductive proof that is used to prove statements of this form
4. if P then Q

You prove it by assuming P as a temporary hypothesis and deducing Q from it. I document this kind of argument using ".Let"... and in HTML you see
Let

1. P1.
2. (let)|-P2.
3. ...
4. ...derivation
5. (above)|-Q.

(Close Let )
The Ps are the hypotheses (hyp) and Q is the conclusion. Together they prove
5. if P1 and P2 and ... then Q.

## Section 1.4 Induction

How does one prove a statement using induction?

What are the general principles behind proof by induction? In my previous classes, CS 431 and CS 500, we focused more or less on proof by construction or contradiction.

I am a bit rusty on proof by induction. I know it and understand it, but I have trouble applying it to more complicated problems.

So my question is: could you give us a hands-on explanation of proof by induction on a more complicated example?

### Ch 1 pp : Proof and Homework

Can you also explain Question 2 of the exercise sheet, and if possible go over all the homework questions?

### Ch 1 pp 22-23 : example 1.18 (Induction)

I don't understand why the strategy for the induction step was to subtract 3 from n+1.

I think this is because the basis step established 3 cases: 8,9, and 10.

If we could subtract 3, could we subtract 5 as well to prove the induction step?

You would then have to base the induction on 5 cases: 8,9,10,11,and 12.

At the end of the induction step we have n + 1 = 3 + 3a + 5b, so n + 1 can be written as (a + 1) 3's and b 5's. The equation on its own does not prove S(n + 1); it has to be rewritten in that form first.
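The induction in example 1.18 can be traced in code: the basis covers 8, 9, and 10, and the step builds a representation of n from the one for n - 3 by adding a single 3. A Python sketch of that argument (the function `rep` and the basis table are my own reconstruction):

```python
def rep(n):
    """Return (a, b) with n == 3*a + 5*b, for n >= 8, following the induction."""
    basis = {8: (1, 1), 9: (3, 0), 10: (0, 2)}   # 8 = 3+5, 9 = 3+3+3, 10 = 5+5
    if n in basis:
        return basis[n]
    # Induction step: n = (n - 3) + 3, and n - 3 >= 8 when n >= 11,
    # so add one more 3 to the representation of n - 3.
    a, b = rep(n - 3)
    return (a + 1, b)

for n in range(8, 200):
    a, b = rep(n)
    assert 3 * a + 5 * b == n
```

Basing the step on subtracting 5 instead would work the same way, but, as noted above, the basis would then need the five cases 8 through 12.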

### Ch 1 pp 023-024 : Induction, Structural Induction

While doing finite automata or Turing machines, perhaps, would you please go over structural induction in some more detail? We use it to prove several theorems, but I'm not as confident with structural induction as I am with regular induction.

I think we are going to do lots of these during the rest of the class.

### Ch 1 pp 27-27 : Mutual Inductions

Why do parts three and four of the basis step prove anything?

Looks like those cases of if_then are causing a problem here.

## Section 1.5 Strings

### Ch 1 pp 29-29 : Powers of an Alphabet

I think I understand how to calculate certain powers of an alphabet, but in what context is this useful? Could you give an example where such a calculation is used?

The powers of an alphabet generate strings of different lengths. Thus they let us think and talk about strings -- which is what we need to discuss languages, input to a machine, output from a machine, ...

Here is another example: we can take all the words in the English language to be an alphabet -- call it W. Then:

` 		The cat sat on the mat`
is a member of W^6.

Notice that we can use powers of strings to express repetitions:

1. "banana" = "ba" "na"^2

We can also apply powers to sets of strings (languages, problems...):

2. {"ba", "na"}^2 = {"bana", "baba", "naba", "nana"}.

Exercise: find k such that "banana" is in {"ba", "na"}^k.
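Powers of a finite language can be computed directly by repeated concatenation. A minimal Python sketch (the helper name `lang_power` is my own):

```python
def lang_power(L, k):
    """All concatenations of exactly k strings drawn from the set L."""
    result = {""}                  # L^0 contains only the empty string
    for _ in range(k):
        result = {x + y for x in result for y in L}
    return result

# Checks statement 2 above.
assert lang_power({"ba", "na"}, 2) == {"baba", "bana", "naba", "nana"}

# For the exercise, try lang_power({"ba", "na"}, k) for small k
# and test whether "banana" is in the result.
```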

## "Big O notation" @ Wikipedia

What is the difference between the "linear" O(n) and "sub-linear" o(n) orders? I'm tempted to say: we won't use o(.) in this class...

If I recall correctly, a function that is o(f(n)) is eventually smaller than c·f(n) for every constant c > 0. So o(n) doesn't include functions like 3n + 1, which is O(n) but not o(n).
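The distinction shows up numerically: f is o(g) when the ratio f(n)/g(n) goes to 0, while f is O(g) only requires the ratio to stay bounded. A quick check, using 3n + 1 (O(n) but not o(n)) and n·log n (o(n²)) as the examples:

```python
import math

def ratio(f, g, n):
    return f(n) / g(n)

f = lambda n: 3 * n + 1          # O(n): ratio against n settles near 3
g = lambda n: n * math.log(n)    # o(n^2): ratio against n^2 goes to 0

for n in (10, 1000, 100000):
    print(n, ratio(f, lambda m: m, n), ratio(g, lambda m: m * m, n))

# (3n+1)/n tends to 3: bounded but nonzero, so 3n+1 is O(n) but not o(n).
assert abs(ratio(f, lambda m: m, 10**6) - 3) < 0.01
# (n log n)/n^2 tends to 0, so n log n is o(n^2).
assert ratio(g, lambda m: m * m, 10**6) < 0.001
```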

## Big O Notation

If Big O notation is the "asymptotic upper bound," why is 4x^4 in O(x^4) when x^4 is not an upper bound of 4x^4?

I think the words "upper bound" are misleading -- a metaphor. Big-O absorbs constant factors: 4x^4 <= c x^4 for the constant c = 4 and all x.

So 4x^4 is O(x^4) even tho' 4x^4 > x^4 for all x except 0!

## Ch 1 pp : BigO Notation

The question is: what types of problems are related to the different notations Ω(n), ω(n), and Θ(n)? Is one better than another?

(The symbols didn't transfer very well in the original posting; Ω, ω, and Θ are the likely intended notations.)

## Graph Theory and Induction

I tried to do question 5(a) as an exercise in probability theory; for some reason I could not "port" it to induction. Could you please show me where I'm missing the link?

We will avoid probability until the end of the course!

## wikipedia : Graph Theory

When I was browsing Wikipedia's graph theory pages, I ran across the Knight's Tour problem, a special case of the TSP-style tour problems that can be solved in linear time: [ KnightsTour.html ]

I imagine there are many other categories of graphs which also have special properties that allow them to have their Hamiltonian paths found in polynomial time.

So, even though we haven't been able to solve the TSP problem in the general case in polynomial time (yet), is it considered practical to attempt graph classification before applying heuristics to discover if a graph can be solved in polynomial time? Or is graph classification an intractable problem as well?

Probably.... by the end of the class I think you will know the answer to this.

# Ch 1+math pp : Deadline for class 02

Fewer points if your message is delivered after this one.

# Standard Definitions

1. FSA::="Finite State Acceptor/Automata", a theoretical machine with a finite set of states.
2. ND::="Non-deterministic", a machine that can try out many possibilities and always guesses the right choice. Note: For a given problem a ND solution is simpler and easier to understand than a deterministic one. We study ND machines to discover how to use them as high-level designs for more complicated deterministic machines. Plus it's fun.
3. PDA::="Push down Automata", a machine with a finite control and a single stack for remembering intermediate data. Kind of like half a TM.
4. RJB::=The author of this document, [ ../index.html ]
5. |-RJB="Richard J Botting, Comp Sci Dept, CSUSB".
6. Schedule::= See http://www.csci.csusb.edu/dick/cs546/schedule.html.
7. Search::= See http://www.csci.csusb.edu/dick/cs546/lookup.php.
8. Syllabus::= See http://www.csci.csusb.edu/dick/cs546/syllabus.html, and also see [ syllabus.html ] (my generic syllabus).
9. TBA::="To Be Announced".
10. TM::="Turing Machine".
11. NTM::="Nondeterministic Turing Machine".
12. nondeterministic::="A machine that can make choices as it runs and accepts an input iff any sequence of choices leads to acceptance."
13. DTM::="Deterministic Turing Machine".
14. runtime::=run_time,
15. run_time::="The number of steps a TM takes before it halts when started on a given input".
16. complexity::="The largest run_time for any word with a given length", usually expressed as an asymptotic (Big-O) formula.
17. P::@problem=class of problems that a DTM can compute in polynomial_time.
18. NP::@problem=class of problems that a NTM can compute in polynomial_time.
19. polynomial_time::=An algorithm/TM is said to halt in polynomial time if the number of steps given n items of data is O(n^c) for some constant c.

# From Logic

20. LOGIC::= See http://www.csci.csusb.edu/dick/maths/logic_25_Proofs.html

21. (LOGIC)|- (ei): Existential instantiation -- if P is true of something then we can substitute a new variable for it.
22. (LOGIC)|- (eg): existential generalization -- if P is true of this thing then it is true of something!
23. (LOGIC)|- (given): a proposition that is assumed as part of a Let...End Let argument. Typically if A and B and C then D starts with assuming A,B, and C are given. They are called the hypotheses.
24. (LOGIC)|- (goal): a proposition that we wish to prove. Typically the part after the then in an if...then... result. This is called the conclusion.
25. (LOGIC)|- (def): By definition of a term defined above.
26. (LOGIC)|- (algebra): Either by using well known algebraic results, or by use a series of algebraic steps.
27. (LOGIC)|- (RAA): Reduce to absurdity. The end of a Let/Po/Case/Net that establishes the negation of the last set of assumptions/hypotheses.