Lecture 7¶
Variants of Turing Machines¶
There are several variants of the original Turing machine (TM), but all of them can be proved equivalent to the original TM.
Multitape Turing Machine¶
Definition
A \(k\)-tape Turing machine is a 5-tuple \(M = (K, \Sigma, \delta, s, H)\), where the only difference from the standard TM is that \(\delta\) reads and acts on all \(k\) tapes at once:
\[ \delta: (K - H) \times \Sigma^k \rightarrow K \times (\{\leftarrow, \rightarrow\} \cup (\Sigma - \{\triangleright\}))^k. \]
Tip
The idea behind the equivalence proof between a multitape TM and a single-tape TM is to treat each cell of the single tape as the tuple of symbols at the corresponding positions on the \(k\) tapes.
Two-way Infinite Tape Turing Machine¶
Definition
A two-way infinite tape Turing machine is a 5-tuple \(M = (K, \Sigma, \delta, s, H)\), where the only difference from TM is
- \(\Sigma\) no longer contains the left end symbol \(\triangleright\).
Tip
The idea of the equivalence proof is to treat the two-way infinite tape TM as a 2-tape Turing machine: one tape holds the positions \(\ge 0\) and the other holds the positions \(< 0\).
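The same splitting can also be expressed as an explicit index bijection that interleaves the nonnegative and negative positions onto a single one-way tape; a quick sketch (the function names are my own, not from the lecture):

```python
def fold(i):
    """Fold a two-way tape index (any integer) onto a one-way index >= 0.
    Positions 0, -1, 1, -2, 2, ... map to 0, 1, 2, 3, 4, ..."""
    return 2 * i if i >= 0 else -2 * i - 1

def unfold(j):
    """Inverse of fold: recover the original two-way index."""
    return j // 2 if j % 2 == 0 else -(j + 1) // 2

# Every two-way position gets a unique one-way address and back.
assert all(unfold(fold(i)) == i for i in range(-100, 100))
```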
Multi-head Turing Machine¶
The machine has multiple heads moving independently on the same tape.
Two-dimensional Tape Turing Machine¶
Build a bijection between the integers and pairs of integers, so that the two-dimensional tape can be laid out on a one-dimensional tape.
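Such a bijection can be made explicit with the Cantor pairing function on \(\mathbb{N}^2\) (an illustrative sketch; the lecture does not fix a particular bijection):

```python
import math

def pair(x, y):
    """Cantor pairing: a bijection from N x N to N."""
    return (x + y) * (x + y + 1) // 2 + y

def unpair(z):
    """Inverse of the Cantor pairing function."""
    # Largest w with w*(w+1)//2 <= z; isqrt keeps this exact.
    w = (math.isqrt(8 * z + 1) - 1) // 2
    t = w * (w + 1) // 2
    y = z - t
    x = w - y
    return x, y

# Every 2-D cell gets a unique 1-D address, and the address decodes back.
assert all(unpair(pair(x, y)) == (x, y) for x in range(50) for y in range(50))
```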
Random Access Turing Machine¶
The head can move multiple cells in a single step.
Well, all of the variants above are straightforward. The most important variant is the following one: the non-deterministic TM.
Non-deterministic Turing Machine¶
Definition
A non-deterministic Turing machine (NTM) is a 5-tuple \(M = (K, \Sigma, \Delta, s, H)\), where
- \(K\) is a finite set of states.
- \(\Sigma\) is the tape alphabet, containing the left end symbol \(\triangleright\) and the blank \(\sqcup\).
- \(s \in K\) is the initial state.
- \(H \subseteq K\) is a set of halting states.
- \(\Delta\) is a transition relation. It describes, in the current state, when a symbol is read, what the next state and the head action (move or write) may be; unlike the deterministic \(\delta\), several choices may be available for the same state-symbol pair.
\[ \Delta \subseteq ((K - H) \times \Sigma) \times (K \times (\{\leftarrow, \rightarrow\} \cup (\Sigma - \{\triangleright\}))). \]
- Also, there is a restriction on \(\Delta\): on reading \(\triangleright\), the head can only move right.
\[ \forall q \in K - H,\ \text{if } ((q, \triangleright), (p, a)) \in \Delta,\ \text{then } a = \rightarrow. \]
Definition
The definitions of configuration and yields are the same as for the deterministic TM.
Definition
An NTM \(M\) with input alphabet \(\Sigma_0\) semidecides \(L \subseteq \Sigma_0^*\) if \(\forall w \in \Sigma_0^*\), \(w \in L\) iff \((s, \triangleright \underline{\sqcup} w) \vdash_M^* (h, \dots)\) for some \(h \in H\).
Definition
An NTM \(M\) with input alphabet \(\Sigma_0\) decides \(L \subseteq \Sigma_0^*\) if
- \(\exists N \in \mathbb{N},\ \forall w \in \Sigma_0^*, \text{ there is no configuration } C,\ \ s.t.\ (s, \triangleright \underline{\sqcup} w) \vdash_M^N C\), i.e., every computation path halts within \(N\) steps.
- \(w \in L\) iff \((s, \triangleright \underline{\sqcup} w) \vdash_M^* (y, \dots)\).
Tip
You can similarly picture the yields of an NTM as a tree; these conditions say that the height of the tree is finite and that \(w \in L\) iff some branch accepts \(w\).
Theorem
Every NTM can be simulated by a DTM.
Proof (sketch)
If an NTM semidecides \(L\), then some DTM semidecides \(L\).
Since the computation of an NTM forms a tree, a DTM can simulate it by BFS. Specifically, we use a 3-tape DTM.
- The first one stores the input \(w\) and is never changed.
- The second one simulates the NTM.
- The third one enumerates the finite sequences of nondeterministic choices, in BFS order.
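As a concrete illustration of the search (not the 3-tape construction itself), here is a minimal Python sketch that deterministically explores an NTM's configuration tree in BFS order; the dict encoding of \(\Delta\) and all names are my own choices:

```python
from collections import deque

def ntm_semidecides(delta, start, halting, w, max_steps=10_000):
    """BFS over an NTM's configuration tree.

    delta: dict mapping (state, symbol) -> list of (new_state, action),
           where action is 'L', 'R', or a symbol to write.
    Returns True if some branch reaches a halting state, False if every
    branch dies, None if the step budget runs out (a semidecision
    procedure need not terminate on inputs outside the language).
    """
    tape = ('>', '_') + tuple(w)               # '>' = left end, '_' = blank
    frontier = deque([(start, tape, 1)])       # configuration: (state, tape, head)
    seen = set()
    for _ in range(max_steps):
        if not frontier:
            return False
        state, tape, pos = frontier.popleft()
        if state in halting:
            return True
        if (state, tape, pos) in seen:
            continue
        seen.add((state, tape, pos))
        for new_state, action in delta.get((state, tape[pos]), []):
            if action == 'L':
                frontier.append((new_state, tape, pos - 1))
            elif action == 'R':
                t = tape + ('_',) if pos + 1 == len(tape) else tape
                frontier.append((new_state, t, pos + 1))
            else:                              # write a symbol, head stays
                t = tape[:pos] + (action,) + tape[pos + 1:]
                frontier.append((new_state, t, pos))
    return None

# A toy NTM over {a, b} that semidecides strings containing an 'a':
# on 'a' it may halt or keep scanning (two choices for one pair).
delta = {('s', '>'): [('s', 'R')], ('s', '_'): [('s', 'R')],
         ('s', 'b'): [('s', 'R')],
         ('s', 'a'): [('h', 'R'), ('s', 'R')]}
assert ntm_semidecides(delta, 's', {'h'}, 'ba') is True
```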
Church-Turing Thesis¶
Church-Turing Thesis
Algorithms are equivalent to Turing machines. Since algorithms solve decision problems and Turing machines decide languages, we can also say that decision problems are equivalent to languages.
Descriptions of TM¶
- Formal definition \(M = (K, \Sigma, \delta, s, H)\).
- Implementation-level description: a state diagram.
- High-level description: pseudocode.
Pseudo Code¶
The input of pseudocode is actually the encoding of a computational object, so we need to first discuss encoding. Here are some facts.
- Any finite set can be encoded.
- Any finite collection of finite sets can be encoded.
For any computational object \(O\), we use \(\text{``}O\text{''}\) or \(\langle O \rangle\) to denote its encoding.
Thus FAs, PDAs and TMs can be encoded, since each is a finite collection of finite sets.
Example
Make a TM that decides \(L = \{\langle G \rangle : G \text{ is a connected graph}\}\).
\(M\) = on input \(\langle G \rangle\).
- (Default, can be omitted) if the input is illegal, then reject; else decode \(\langle G \rangle\).
- select a node of \(G\) and mark it.
- repeat the following until no new node is marked.
- for each marked node
- mark all of its neighbors.
- if all the nodes are marked,
- accept,
- else,
- reject.
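The marking procedure above can be sketched directly in Python; the adjacency-list representation and the choice of node 0 as the starting node are my own (the marking loop is exactly BFS):

```python
from collections import deque

def is_connected(n, edges):
    """Decide L = {<G> : G is connected} for an undirected graph
    on vertices 0..n-1, mirroring the marking procedure above."""
    if n == 0:
        return True
    adj = [[] for _ in range(n)]
    for u, v in edges:
        adj[u].append(v)
        adj[v].append(u)
    marked = [False] * n
    marked[0] = True                 # select a node and mark it
    queue = deque([0])
    while queue:                     # repeat until no new node is marked
        u = queue.popleft()
        for v in adj[u]:             # mark all neighbors of a marked node
            if not marked[v]:
                marked[v] = True
                queue.append(v)
    return all(marked)               # accept iff all nodes are marked

assert is_connected(4, [(0, 1), (1, 2), (2, 3)])
assert not is_connected(4, [(0, 1), (2, 3)])
```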
Decidable Problems (Recursive Languages)¶
Example
R1
\(M_{R1}\) = on input \(\langle D, w \rangle\)
- run \(D\) on \(w\).
- if \(D\) accepts \(w\).
- accept \(\langle D, w \rangle\),
- else,
- reject \(\langle D, w \rangle\).
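Step 1, "run \(D\) on \(w\)", is just iterating the transition function; a minimal sketch, assuming a DFA encoded as a tuple (this encoding is mine, not from the lecture):

```python
def dfa_accepts(dfa, w):
    """Decide R1: given <D, w>, does the DFA D accept w?
    dfa = (states, delta, start, finals), delta: (state, symbol) -> state."""
    states, delta, start, finals = dfa
    q = start
    for c in w:                 # run D on w, one symbol at a time
        q = delta[(q, c)]
    return q in finals          # accept iff D ends in a final state

# Example DFA: strings over {0, 1} with an even number of 1s.
even_ones = ({'e', 'o'},
             {('e', '0'): 'e', ('e', '1'): 'o',
              ('o', '0'): 'o', ('o', '1'): 'e'},
             'e', {'e'})
assert dfa_accepts(even_ones, '1010')
assert not dfa_accepts(even_ones, '10')
```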
R2
\(M_{R2}\) = on input \(\langle N, w \rangle\)
- build a DFA \(D\) equivalent to \(N\).
- run \(M_{R1}\) on \(\langle D, w \rangle\).
- output the result of \(M_{R1}\).
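The construction of an equivalent DFA in step 1 is the subset construction; a sketch for NFAs without \(\varepsilon\)-moves (the tuple encoding is my own choice):

```python
def nfa_to_dfa(nfa):
    """Subset construction: build a DFA equivalent to an NFA (no
    epsilon moves, for brevity). nfa = (delta, start, finals) with
    delta mapping (state, symbol) -> set of states."""
    delta, start, finals = nfa
    alphabet = {c for (_, c) in delta}
    start_set = frozenset([start])
    dfa_delta, todo, states = {}, [start_set], {start_set}
    while todo:                     # explore reachable subsets only
        S = todo.pop()
        for c in alphabet:
            T = frozenset(q2 for q in S for q2 in delta.get((q, c), ()))
            dfa_delta[(S, c)] = T
            if T not in states:
                states.add(T)
                todo.append(T)
    # A subset is final iff it contains some NFA final state.
    dfa_finals = {S for S in states if S & finals}
    return states, dfa_delta, start_set, dfa_finals

# NFA accepting strings over {a, b} that end in 'ab'.
nfa = ({('0', 'a'): {'0', '1'}, ('0', 'b'): {'0'}, ('1', 'b'): {'2'}},
       '0', {'2'})
states, d, q0, F = nfa_to_dfa(nfa)

def run(w):
    q = q0
    for c in w:
        q = d[(q, c)]
    return q in F

assert run('aab') and not run('aba')
```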
R3
\(M_{R3}\) = on input \(\langle R, w \rangle\)
- build an NFA \(N\) equivalent to \(R\).
- run \(M_{R2}\) on \(\langle N, w \rangle\).
- output the result of \(M_{R2}\).
R4
\(M_{R4}\) = on input \(\langle D \rangle\)
- if \(D\) has no final state,
- accept,
- else,
- run BFS on the diagram, starting from the initial state.
- if there exists a path from the initial state to a final state,
- reject,
- else,
- accept.
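The BFS over the diagram in R4 can be sketched as follows (the dict encoding of the transition function is an assumption, matching the earlier sketches):

```python
from collections import deque

def dfa_language_empty(delta, start, finals):
    """Decide R4: is L(D) empty? Reject iff some final state is
    reachable from the start state in the transition diagram."""
    if not finals:
        return True                       # no final state: accept
    seen, queue = {start}, deque([start])
    while queue:                          # BFS from the initial state
        q = queue.popleft()
        if q in finals:
            return False                  # path to a final state: reject
        for (p, _c), r in delta.items():
            if p == q and r not in seen:
                seen.add(r)
                queue.append(r)
    return True                           # no final state reachable: accept

delta_r = {('s', 'a'): 't', ('t', 'a'): 't'}
assert not dfa_language_empty(delta_r, 's', {'t'})
assert dfa_language_empty(delta_r, 's', set())
```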
R5
\(M_{R5}\) = on input \(\langle D_1, D_2 \rangle\)
- construct \(D_3\) such that \(L(D_3) = L(D_1) \oplus L(D_2)\) (symmetric difference).
- run \(M_{R4}\) on \(\langle D_3 \rangle\).
- output the result of \(M_{R4}\).
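Step 1 of R5 is the standard product construction, where a product state is accepting exactly when one of the two component DFAs accepts (XOR); a sketch under the same tuple encoding as before (mine, not the lecture's):

```python
def product_xor_dfa(d1, d2):
    """Build a DFA D3 with L(D3) = L(D1) symmetric-difference L(D2).
    Each di = (delta, start, finals); both share one alphabet."""
    (t1, s1, f1), (t2, s2, f2) = d1, d2
    alphabet = {c for (_, c) in t1}
    delta, start = {}, (s1, s2)
    todo, states = [start], {start}
    while todo:                       # run both DFAs in lockstep
        p, q = todo.pop()
        for c in alphabet:
            nxt = (t1[(p, c)], t2[(q, c)])
            delta[((p, q), c)] = nxt
            if nxt not in states:
                states.add(nxt)
                todo.append(nxt)
    # Accept exactly when one, but not both, of the DFAs accepts.
    finals = {(p, q) for (p, q) in states if (p in f1) != (q in f2)}
    return delta, start, finals

# D1 accepts an even number of a's; D2 accepts everything.
d1 = ({('e', 'a'): 'o', ('o', 'a'): 'e'}, 'e', {'e'})
d2 = ({('z', 'a'): 'z'}, 'z', {'z'})
delta, q0, F = product_xor_dfa(d1, d2)

def run(w):
    q = q0
    for c in w:
        q = delta[(q, c)]
    return q in F

# The symmetric difference is the strings with an odd number of a's.
assert run('a') and not run('aa')
```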
Reduction
In the examples above, we used the idea of reduction. Take R1 and R2 as an example: the equivalence of DFAs and NFAs guarantees that \(\langle N, w \rangle\) is a yes-instance of R2 iff \(\langle D, w \rangle\) is a yes-instance of R1, so a decider for R1 yields a decider for R2.
We will further discuss reduction in the next lecture.
Created: 2023.11.23 10:40:27 CST