
Preliminaries

The Big-O Notation#

Given two functions \(f,g: \mathbb{N}\to\mathbb{N}\), we say that

  • \(f=O(g)\) and \(g=\Omega(f)\) if there exists a constant \(c\) s.t. \(f(n)\leq c\cdot g(n)\) for every sufficiently large \(n\).
    • \(f=\Theta(g)\) if \(f=O(g)\) and \(g=O(f)\).
  • \(f=o(g)\) and \(g=\omega(f)\) if for every \(\epsilon>0\), \(f(n)\leq \epsilon\cdot g(n)\) for every sufficiently large \(n\).

Hence,

  • \(f=O(g)\) means that \(f\) is asymptotically bounded above by \(g\).
  • \(f=\Omega(g)\) means that \(f\) is asymptotically bounded below by \(g\).
  • \(f=o(g)\) means that \(f\) is asymptotically dominated by \(g\).
  • \(f=\omega(g)\) means that \(f\) asymptotically dominates \(g\).
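
For instance (standard examples, added here only as illustrations of the definitions above):

\[3n^2+5n+7=\Theta(n^2),\qquad n\log n=o(n^2),\qquad 2^n=\omega(n^k)\ \text{for every fixed } k.\]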

Running Time#

A Turing machine \(M\) is said to compute \(f\) in \(T(n)\)-time if it computes \(f\) and its computation on every input \(x\in\{0,1\}^\ast\) with \(|x|\) sufficiently large takes at most \(T(|x|)\) steps.

Hence a function \(f\) is computable in \(T(n)\)-time if there exists a Turing machine that computes \(f\) in \(T(n)\)-time. We denote by \(\text{TIME}_{\mathsf{TM}}(T(n))\) the set of Boolean functions that are computable in \(T(n)\) time.
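
For example (a standard illustration, not from the original notes): the Boolean function \(\mathrm{PAL}\), with \(\mathrm{PAL}(x)=1\) iff \(x\) is a palindrome, can be computed by a one-tape Turing machine that repeatedly compares and erases the first and last remaining symbols; each sweep takes \(O(n)\) steps and there are \(O(n)\) sweeps, so \(\mathrm{PAL}\in\text{TIME}_{\mathsf{TM}}(n^2)\) for the one-tape model.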

Efficient universal Turing machine theorem#

Along with the classical statement of the universal Turing machine theorem:

There exists a Turing machine \(\mathcal{U}\) such that for every \(x,\alpha \in \{0,1\}^\ast\) we have \(\mathcal{U}(x,\alpha) = M_\alpha(x)\) where \(M_\alpha\) is the Turing machine represented by \(\alpha\).

we can add a statement about running time:

If \(M_\alpha\) halts on input \(x\) within \(T\) steps, then \(\mathcal{U}(x,\alpha)\) halts within \(cT\log T\) steps, where \(c\) is a constant depending only on \(M_\alpha\)'s alphabet size, number of tapes, and number of states.
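
As a rough illustration (a toy sketch, not the construction behind the theorem: the names, the encoding of \(\alpha\) as a Python dict, and the single-tape restriction are all simplifications of mine, and this naive interpreter does not achieve the \(cT\log T\) overhead, which requires a careful multi-tape simulation), the following shows the shape of a universal machine: one fixed program that executes any machine description within a given step budget.

```python
# Toy universal simulator for single-tape machines (illustrative sketch only).
# A machine description is a dict delta: (state, symbol) -> (new_state, new_symbol, move),
# with move in {-1, 0, +1}; reaching the state 'halt' stops the run.

def simulate(delta, x, budget):
    """Run the machine described by delta on input x for at most `budget` steps.
    Returns (halted_within_budget, tape_contents)."""
    tape = dict(enumerate(x))                      # sparse tape; blank cells read as '_'
    head, state, steps = 0, 'start', 0
    while state != 'halt' and steps < budget:
        symbol = tape.get(head, '_')
        state, tape[head], move = delta[(state, symbol)]
        head += move
        steps += 1
    return state == 'halt', ''.join(tape.get(i, '_') for i in range(min(tape), max(tape) + 1))

# Example: a machine that flips every bit of its input and halts at the first blank.
flip = {
    ('start', '0'): ('start', '1', +1),
    ('start', '1'): ('start', '0', +1),
    ('start', '_'): ('halt', '_', 0),
}
print(simulate(flip, '0110', budget=100))          # (True, '1001_')
```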


Computational Problems#

In a computational problem, we are given an input that can, without loss of generality, be assumed to be encoded over the alphabet \(\{0,1\}\), and we want to return as output a solution satisfying a certain property. A computational problem is thus described by the property that the output has to satisfy given the input.

A somewhat arbitrary but natural and useful classification of computational problems (arbitrary because, in fact, all of them can be regarded as search problems):

  • Search problem: A search problem is specified by a relation \(R\subseteq \{0,1\}^\ast\times\{0,1\}^\ast\), where \((x,y)\in R\) iff \(y\) is an admissible answer to the input \(x\); the task is, given \(x\), to find such a \(y\) if one exists.
  • Decision problem: Given a set (called a language) \(L\subseteq \{0,1\}^\ast\), we specify a decision problem by requiring that for input \(x\in L\) the output should be \(1\), and \(0\) otherwise. A machine decides a language \(L\) if it computes the function \(f_L:\{0,1\}^\ast \to \{0,1\}\) where \(f_L(x) = 1\) iff \(x\in L\). (A small illustration of the search and decision forms is sketched after this list.)
  • Optimization problem.
  • Counting problem.
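
To make the first two categories concrete, here is a minimal sketch (an example of my own, using SUBSET-SUM and working with Python values directly rather than encodings over \(\{0,1\}\)): the relation \(R\) specifies the search problem, and \(f_L\) is the Boolean function of the induced decision problem, decided here by brute force.

```python
from itertools import combinations

# Illustration only: SUBSET-SUM, phrased as a search problem and as the
# decision problem it induces. An instance is (numbers, target); a candidate
# answer y is a tuple of indices into `numbers`.

def R(instance, y):
    """Search relation: (instance, y) in R iff the entries selected by y sum to the target."""
    numbers, target = instance
    return sum(numbers[i] for i in y) == target

def f_L(instance):
    """Decision problem: 1 iff some admissible y exists (brute force, exponential time)."""
    numbers, target = instance
    indices = range(len(numbers))
    return int(any(R(instance, y)
                   for r in range(len(numbers) + 1)
                   for y in combinations(indices, r)))

print(f_L(([3, 5, 2], 7)))   # 1, witnessed by y = (1, 2) since 5 + 2 == 7
print(f_L(([3, 5, 2], 4)))   # 0, no subset sums to 4
```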

Given the notion of decision, we may define the class \(\text{DTIME}\) (deterministic time).

Let \(T:\mathbb{N}\to\mathbb{N}\) be some function. A language \(L\) is in \(\text{DTIME}(T(n))\) iff there is a deterministic Turing machine that decides \(L\) and runs in time \(c\cdot T(n)\) on inputs of length \(n\), for some constant \(c>0\).

The class \(\text{NDTIME}\) (more precisely the classes, since it is each \(\text{NDTIME}(T(n))\) that is a class, not \(\text{NDTIME}\) itself) is simply the above notion with deterministic replaced by nondeterministic.


Polynomial Time (Karp) Reduction#

Given two Boolean functions \(f,g:\{0,1\}^\ast\to\{0,1\}\), we say that \(f\) reduces to \(g\), denoted by \(f\leq_p g\), if there is a polynomial time computable function \(R:\{0,1\}^\ast\to\{0,1\}^\ast\) such that for every \(x\in\{0,1\}^\ast\),

\[f(x) = (g\circ R)(x),\]

and we say that \(f,g\) have equivalent complexity if \(f\leq_p g\) and \(g\leq_p f\).

Some texts use the name “(polynomial time) many-one reducibility” for polynomial time reducibility.

Replacing the polynomial time computable function with an arbitrary computable function, the notion becomes many-one reducibility as in recursion theory, and we write \(\leq_m\) instead of \(\leq_p\). Essentially, notions in computational complexity theory are specialized/restricted versions of those in recursion theory.

Extension to languages#

It is evident that, by taking \(L\) to be \(f^{-1}(1)\) and \(L'\) to be \(g^{-1}(1)\), the notion of polynomial time reduction can be extended from Boolean functions to languages. A language \(L\) is polynomial time Karp reducible to a language \(L'\), denoted \(L\leq_p L'\), if there is a polynomial time computable function \(R:\{0,1\}^\ast\to\{0,1\}^\ast\) s.t. for every \(x\in \{0,1\}^\ast\), \(x\in L\) iff \(R(x)\in L'\).
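
As a concrete example (the standard textbook reduction from INDEPENDENT-SET to VERTEX-COVER, added here for illustration; the graph encoding is my own): a graph \(G\) on \(n\) vertices has an independent set of size at least \(k\) iff it has a vertex cover of size at most \(n-k\), since \(S\) is independent exactly when \(V\setminus S\) covers every edge. The map below is therefore a polynomial time computable \(R\) witnessing \(\textsf{INDEPENDENT-SET}\leq_p \textsf{VERTEX-COVER}\).

```python
# Standard Karp reduction INDEPENDENT-SET <=_p VERTEX-COVER (illustration only).
# An instance is (vertices, edges, k): "is there an independent set of size >= k?"
# on the left, "is there a vertex cover of size <= k?" on the right.

def reduce_is_to_vc(instance):
    vertices, edges, k = instance
    # S is an independent set of size >= k  iff  vertices \ S is a vertex cover
    # of size <= n - k, so the graph is unchanged and only the threshold moves.
    return vertices, edges, len(vertices) - k

# Hence x is in INDEPENDENT-SET  iff  reduce_is_to_vc(x) is in VERTEX-COVER.
```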

Extended Church-Turing Thesis#

The extended Church-Turing thesis says that:

Every physically realizable computation model can be simulated by a Turing machine with polynomial overhead.

Similar to the Church-Turing thesis, this is a belief. To date, many physically realizable computational models have been proposed, and they have been shown to be simulatable by a Turing machine with at most polynomial slowdown (\(t\) steps on the model can be simulated in \(t^k\) steps on the Turing machine, where \(k\) is a constant depending on the model). (Note that we cannot say there is a polynomial time reduction here, since polynomial time reductions are between functions or languages, not between models.)

A possible violation of the thesis may be quantum computers: they do not appear to be efficiently simulatable on Turing machines, but whether they can be physically realized is still unclear.