Published: January 15, 2012

**Keywords:** quantum computing, satisfiability, simulations, Solovay-Kitaev, time-space lower bounds

**Categories:** quantum computing, complexity theory, lower bounds, time-space tradeoff, simulation, SAT

**ACM Classification:** F.1.1, F.1.2, F.1.3, F.2.1, F.2.3

**AMS Classification:** 68Q05, 68Q10, 68Q15, 68Q17, 81P68

**Abstract:**

We give two time- and space-efficient simulations of quantum computations with intermediate measurements, one by classical randomized computations with unbounded error and the other by quantum computations that use an arbitrary fixed universal set of gates. Specifically, our simulations show that every language solvable by a bounded-error quantum algorithm running in time $t$ and space $s$ is also solvable by an unbounded-error randomized algorithm running in time $O(t\cdot\log{t})$ and space $O(s+\log{t})$, as well as by a bounded-error quantum algorithm restricted to use an arbitrary universal set and running in time $O(t\cdot{\rm polylog}{t})$ and space $O(s+\log{t})$, provided the universal set is closed under adjoint. We also develop a quantum model that is particularly suitable for the study of general computations with simultaneous time and space bounds.

As an application of our randomized simulation, we obtain the first nontrivial lower bound for general quantum algorithms solving problems related to satisfiability. Our bound applies to $\rm MajSAT$ and $\rm MajMajSAT$, which are the problems of determining the truth value of a given Boolean formula whose variables are fully quantified by one or two majority quantifiers, respectively. We prove that for every real $d$ and every positive real $\delta$ there exists a real $c>1$ such that either $\rm MajMajSAT$ does not have a bounded-error quantum algorithm running in time $O(n^c)$, or $\rm MajSAT$ does not have a bounded-error quantum algorithm running in time $O(n^d)$ and space $O(n^{1-\delta})$. In particular, $\rm MajMajSAT$ does not have a bounded-error quantum algorithm running in time $O(n^{1+o(1)})$ and space $O(n^{1-\delta})$ for any $\delta>0$. Our lower bounds hold for any reasonable uniform model of quantum computation, in particular for the model we develop.
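To make the problem statements concrete, here is a brute-force sketch of the $\rm MajSAT$ and $\rm MajMajSAT$ decision problems. The function names are hypothetical, and we adopt the convention that the majority quantifier means "strictly more than half of the assignments"; the exponential-time enumeration is for illustration only and has nothing to do with the efficient algorithms the lower bounds rule out.

```python
from itertools import product

def maj_sat(formula, n):
    """MajSAT by exhaustive enumeration: True iff `formula` holds for
    strictly more than half of the 2**n Boolean assignments.
    (Convention assumed here; some definitions use 'at least half'.)"""
    satisfying = sum(1 for bits in product([False, True], repeat=n)
                     if formula(*bits))
    return 2 * satisfying > 2 ** n

def maj_maj_sat(formula, n1, n2):
    """MajMajSAT: the variables are split into two blocks; the outer
    majority quantifier ranges over the first n1 variables, and for
    each such assignment we ask the inner MajSAT question over the
    remaining n2 variables."""
    hits = sum(
        1 for xs in product([False, True], repeat=n1)
        if maj_sat(lambda *ys: formula(*xs, *ys), n2)
    )
    return 2 * hits > 2 ** n1
```

For example, `maj_sat(lambda a, b: a or b, 2)` returns `True` (three of the four assignments satisfy the formula), while `maj_sat(lambda a, b: a and b, 2)` returns `False`.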