Hardness.
Made clear.
A field guide to the resources that bound computation: time, memory, randomness, proofs, communication, and quantum coherence.
It is not about what computers can do. It is about what they can do efficiently.
Computability asks whether an algorithm exists. Complexity asks how expensive that algorithm must be.
A complexity class fixes a computational model and a resource bound, then collects exactly the languages solvable within it. Change the model from deterministic to randomized or quantum, and the universe moves.
time, space, randomness, advice, queries, communication, qubits
decide, output, approximate, sample, verify, interact
Turing machines, circuits, proof systems, randomized algorithms, quantum circuits
Can the resource be removed, simulated, amplified, or proven necessary?
A map of the computational universe.
The famous inclusions are known. The famous separations are mostly not.
How hardness gets proven.
Diagonalization
Construct a machine that disagrees, on at least one input, with every machine obeying the smaller resource bound. This yields the hierarchy theorems, but hits the relativization barrier for $P$ vs. $NP$.
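The flip at the heart of diagonalization fits in a few lines. A toy sketch, assuming a finite enumeration of 0/1-valued functions standing in for an enumeration of machines (all names here are illustrative):

```python
# Toy diagonalization: given an enumeration of 0/1-valued functions,
# build one that disagrees with the i-th function on input i.
# The same flip drives the time and space hierarchy theorems.

def diagonalize(functions):
    """Return a function that differs from functions[i] at input i."""
    return lambda i: 1 - functions[i](i)

fs = [lambda n: 0, lambda n: n % 2, lambda n: 1]
d = diagonalize(fs)
for i, f in enumerate(fs):
    assert d(i) != f(i)  # d escapes every function in the list
```

No function in the enumeration can equal `d`, because each disagrees with it somewhere; the real theorems replace the finite list with all machines running within the smaller bound.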
Reductions
Translate instances of problem $A$ into instances of problem $B$ in polynomial time. If $B$ is easy, so is $A$; if $A$ is hard, $B$ inherits the burden.
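One of the smallest reductions there is: a graph has an independent set of size at least $k$ exactly when its complement vertices form a vertex cover of size at most $n - k$. A minimal sketch, with illustrative function names and a brute-force check on a triangle:

```python
# Independent Set reduces to Vertex Cover: S is independent exactly when
# V \ S touches every edge, so the reduction keeps the graph and
# complements the budget. (Names and encoding are illustrative.)
from itertools import combinations

def independent_set_to_vertex_cover(n, edges, k):
    """Map an Independent Set instance (>= k) to Vertex Cover (<= n - k)."""
    return n, edges, n - k

def has_vertex_cover(n, edges, budget):
    """Brute-force check, for verifying the reduction on tiny graphs."""
    return any(all(u in c or v in c for u, v in edges)
               for size in range(budget + 1)
               for c in map(set, combinations(range(n), size)))

# Triangle: max independent set has size 1, min vertex cover has size 2.
n, edges = 3, [(0, 1), (1, 2), (0, 2)]
print(has_vertex_cover(*independent_set_to_vertex_cover(n, edges, 1)))  # True
print(has_vertex_cover(*independent_set_to_vertex_cover(n, edges, 2)))  # False
```

The mapping itself takes constant time; all the hardness is preserved, untouched, inside the graph.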
Circuit bounds
A super-polynomial circuit lower bound would separate a problem from $P/\mathrm{poly}$, and therefore from $P$.
Probabilistic proofs
A verifier can be convinced by reading a constant number of proof bits. This became the engine for hardness of approximation.
One more bit can double the work.
The dividing line in complexity is the gulf between polynomial and exponential growth. At realistic input sizes, the difference is not academic; it is the difference between computation and impossibility.
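The gulf is easy to make concrete. A back-of-the-envelope sketch, assuming an illustrative machine doing $10^9$ operations per second:

```python
# Polynomial vs. exponential at a realistic input size: an n^3 algorithm
# vs. a 2^n one on a 10^9-ops-per-second machine (illustrative numbers).

n = 100
poly_seconds = n**3 / 1e9   # ~0.001 seconds
exp_seconds = 2**n / 1e9    # ~4 * 10^13 years

print(f"n^3 at n={n}: {poly_seconds:.3f} s")
print(f"2^n at n={n}: {exp_seconds / (3600 * 24 * 365):.2e} years")
```

At $n = 100$, the cubic algorithm finishes in a millisecond; the exponential one outlasts the age of the universe by orders of magnitude.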
The currency of hardness.
A reduction turns one problem into another. This is how local difficulty becomes a global map.
SAT is NP-complete.
Every $NP$ computation can be encoded as a Boolean formula whose satisfying assignments are accepting executions. SAT became the first complete problem for $NP$.
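The asymmetry that defines $NP$ is visible in code: checking a claimed assignment is polynomial, while the obvious search is exponential. A minimal sketch, with an illustrative encoding (a clause is a list of signed integers, $+v$ for a variable and $-v$ for its negation):

```python
# SAT in miniature: a polynomial-time verifier and a 2^n brute-force
# search. The encoding (signed-integer literals) is illustrative.
from itertools import product

def satisfies(clauses, assignment):
    """Verifier: does the assignment satisfy every clause? Polynomial time."""
    return all(any(assignment[abs(l)] == (l > 0) for l in clause)
               for clause in clauses)

def brute_force_sat(clauses, n_vars):
    """Search: try all 2^n assignments. Exponential in the worst case."""
    for bits in product([False, True], repeat=n_vars):
        assignment = dict(enumerate(bits, start=1))
        if satisfies(clauses, assignment):
            return assignment
    return None

# (x1 or not x2) and (x2 or x3) and (not x1 or not x3)
clauses = [[1, -2], [2, 3], [-1, -3]]
print(brute_force_sat(clauses, 3))
```

Cook and Levin showed the reverse direction: any polynomial-time verifier, for any $NP$ problem, can itself be compiled into clauses like these.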
Karp's 21 problems.
3-Coloring, Hamiltonian Cycle, Subset Sum, Clique, Vertex Cover: reductions made hardness portable.
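Subset Sum shows how subtle the hardness can be: it is $NP$-complete, yet a classic dynamic program solves it in time polynomial in the numeric value of the target rather than in its bit length. A minimal sketch:

```python
# Subset Sum by dynamic programming: O(n * target) time, which is
# pseudo-polynomial -- polynomial in the target's value, exponential
# in the number of bits needed to write it down.

def subset_sum(values, target):
    """Return True if some subset of values sums exactly to target."""
    reachable = {0}
    for v in values:
        reachable |= {s + v for s in reachable if s + v <= target}
    return target in reachable

print(subset_sum([3, 34, 4, 12, 5, 2], 9))   # True: 4 + 5
print(subset_sum([3, 34, 4, 12, 5, 2], 30))  # False
```

Instances with small numbers are easy in practice; the hard instances hide the difficulty in exponentially large targets.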
NP-intermediate exists.
If $P \ne NP$, not every problem in $NP$ is easy or complete. Factoring and graph isomorphism are conjectured to live in this middle terrain.
Hardness, amplified.
The PCP theorem shows that even approximating certain problems can be hard. Dinur's proof made the amplification beautifully combinatorial.
A second kind of computer.
A quantum computer is not faster because it has a higher clock speed. It is different because probabilities are replaced by amplitudes: complex numbers that can cancel and reinforce.
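Cancellation is the whole point, and it fits in a few lines of arithmetic. A toy sketch, not a simulator: one Hadamard gate sends $|0\rangle$ to a uniform superposition, and a second makes the $|1\rangle$ amplitudes cancel while the $|0\rangle$ amplitudes reinforce.

```python
# Interference with bare amplitudes: probabilities can never cancel,
# but amplitudes carry signs, so they can.
import math

def hadamard(state):
    a, b = state  # amplitudes of |0> and |1>
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

state = (1.0, 0.0)        # start in |0>
state = hadamard(state)   # (0.707..., 0.707...): uniform superposition
state = hadamard(state)   # (1.0, 0.0): the |1> amplitudes cancelled
print(state)
```

A classical random process applying "fair coin" twice stays uniform; the quantum walk returns to $|0\rangle$ with certainty, because the two paths to $|1\rangle$ arrive with opposite signs.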
BQP
Decision problems solved by polynomial-size quantum circuits with bounded error. Contains $P$, $BPP$, factoring, and discrete log.
QMA
The quantum analogue of $NP$: a polynomial-size quantum proof checked by a quantum verifier. Local Hamiltonian is complete.
QCMA
A classical proof checked by a quantum machine. Separating it from $QMA$ would show that genuinely quantum proofs carry more power.
QIP
Quantum interactive proofs. Surprisingly, $QIP = PSPACE$: quantum messages give no more power than classical interaction at this scale.
PostBQP
Allow postselection on exponentially unlikely measurement outcomes, and quantum computation jumps exactly to $PP$.
MIP*
Entangled multi-prover proofs can recognize every computably enumerable language. Complexity touches logic.
Period finding turns factoring and discrete logarithms from sub-exponential classical problems into polynomial-time quantum ones.
Amplitude amplification searches an unstructured space of size $N$ in $O(\sqrt{N})$ queries, and that gain is optimal.
Hamiltonian simulation treats molecules and materials as native quantum data, making chemistry a natural target.
Random circuits and boson sampling seek advantage in distributions rather than decision problems.
Fault tolerance converts fragile physical qubits into logical qubits. Without it, deep BQP algorithms remain out of reach.
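The amplitude-amplification gain can be watched directly in a toy state-vector simulation (plain Python, illustrative sizes): each iteration phase-flips the marked amplitude, then inverts every amplitude about the mean, and after about $(\pi/4)\sqrt{N}$ rounds the marked item dominates.

```python
# Grover in miniature on an unstructured space of N = 16 items.
# The marked index is arbitrary; everything here is illustrative.
import math

N, marked = 16, 11
amps = [1 / math.sqrt(N)] * N  # uniform superposition

for _ in range(int(math.pi / 4 * math.sqrt(N))):  # 3 iterations
    amps[marked] = -amps[marked]           # oracle: phase-flip the target
    mean = sum(amps) / N
    amps = [2 * mean - a for a in amps]    # diffusion: invert about the mean

print(max(range(N), key=lambda i: amps[i] ** 2))  # 11
print(amps[marked] ** 2)                          # success probability ~0.96
```

Sixteen items need only three queries instead of an expected eight; the quadratic gap widens with $N$, and the BBBV lower bound says no quantum algorithm can do better on unstructured search.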
Where the theory touches the world.
Cryptography
Public-key security rests on conjectured hardness: factoring, discrete log, lattices, codes, and hash functions.
Post-quantum security
Shor breaks RSA and elliptic-curve cryptography on large fault-tolerant machines. Lattice and code systems are built around problems with no known efficient quantum attack.
Approximation
If exact optimization is hard, ask how close efficient algorithms can get. PCPs and Unique Games draw the thresholds.
Machine learning
PAC learning, SQ lower bounds, and cryptographic assumptions explain which learning tasks resist efficient algorithms.
Operations research
Linear programming is in $P$; integer programming is $NP$-hard. Real solvers exploit structure that worst-case theory cannot assume.
Databases
Join optimization, treewidth, and fine-grained complexity explain why some queries scale and others hit conditional lower bounds.
Zero knowledge
Interactive proofs became deployed infrastructure through zk-SNARKs, STARKs, arithmetization, and polynomial commitments.
Quantum simulation
Quantum chemistry and materials science are natural workloads because the state being simulated is already quantum.
Questions no one has answered.
Is $P = NP$?
The central open problem in theoretical computer science. Most believe $P \ne NP$; no proof exists.
Can randomness be removed?
The community expects $P = BPP$, but the proof likely requires circuit lower bounds that remain elusive.
Is $BQP$ inside $PH$?
Raz-Tal gave an oracle separation. The unrelativized question remains open and central to quantum advantage.
Quantum PCP.
Can approximating local Hamiltonian ground-state energy remain hard at constant precision?
Matrix multiplication.
The exponent $\omega$ is near $2.371$. Proving $\omega = 2$ would reshape numerical algorithms and graph theory.
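The descent from $n^3$ began with Strassen: seven recursive products instead of eight gives $O(n^{\log_2 7}) \approx O(n^{2.807})$. A minimal plain-list sketch, assuming the matrix size is a power of two:

```python
# Strassen's algorithm: 7 multiplications per 2x2 block split instead
# of 8, so T(n) = 7 * T(n/2) + O(n^2) = O(n^log2(7)). Illustrative
# plain-list implementation for power-of-two sizes.

def add(A, B):
    return [[a + b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]

def sub(A, B):
    return [[a - b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]

def strassen(A, B):
    n = len(A)
    if n == 1:
        return [[A[0][0] * B[0][0]]]
    h = n // 2
    quad = lambda M, r, c: [row[c:c + h] for row in M[r:r + h]]
    A11, A12, A21, A22 = (quad(A, 0, 0), quad(A, 0, h),
                          quad(A, h, 0), quad(A, h, h))
    B11, B12, B21, B22 = (quad(B, 0, 0), quad(B, 0, h),
                          quad(B, h, 0), quad(B, h, h))
    M1 = strassen(add(A11, A22), add(B11, B22))  # the 7 products
    M2 = strassen(add(A21, A22), B11)
    M3 = strassen(A11, sub(B12, B22))
    M4 = strassen(A22, sub(B21, B11))
    M5 = strassen(add(A11, A12), B22)
    M6 = strassen(sub(A21, A11), add(B11, B12))
    M7 = strassen(sub(A12, A22), add(B21, B22))
    C11 = add(sub(add(M1, M4), M5), M7)          # recombine
    C12 = add(M3, M5)
    C21 = add(M2, M4)
    C22 = add(sub(add(M1, M3), M2), M6)
    return ([r1 + r2 for r1, r2 in zip(C11, C12)] +
            [r1 + r2 for r1, r2 in zip(C21, C22)])

print(strassen([[1, 2], [3, 4]], [[5, 6], [7, 8]]))  # [[19, 22], [43, 50]]
```

Every improvement since, from Coppersmith-Winograd to the current record, refines this same idea: trade multiplications for additions inside a recursive identity.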
NISQ advantage.
Can useful quantum advantage survive realistic noise before full fault tolerance arrives?
Six decades of hardness.
Computational complexity becomes a field.
Time complexity and the first hierarchy theorem formalize the resource view of computation.
SAT is NP-complete.
Theorem proving procedures reveal a complete problem for $NP$.
Hardness becomes portable.
Twenty-one natural problems are shown $NP$-complete by reductions.
Quantum simulation enters the story.
Quantum systems may need quantum computers to simulate efficiently.
The universal quantum computer.
Complexity theory gains a second physical model of computation.
Shor's algorithm.
Factoring is polynomial-time on a quantum computer. Cryptography changes direction.
Grover search.
Unstructured search receives a provably optimal quadratic speedup.
$MIP^* = RE$.
Entangled multi-prover proofs reach the computably enumerable languages.
Matrix multiplication inches downward.
The record reaches $\omega \le 2.371339$, while the lower bound remains stubbornly at $2$.