
Decoherence

The loss of quantum behaviour in a qubit due to unwanted interaction with its environment, causing superposition and entanglement to break down.

Decoherence explains why quantum computing is so difficult. A qubit in superposition is not just fragile in an engineering sense; it is fragile at a fundamental physics level. Any interaction with the environment, including a stray photon, a fluctuating magnetic field, a vibration in the cryostat, or even a cosmic ray, disturbs the qubit’s delicate phase relationship. Once that phase information leaks into the environment, it is effectively lost. The qubit then behaves classically, and the computation fails.

This is not a design flaw that better engineering will eventually eliminate. It is a consequence of quantum mechanics itself: a quantum system coupled to any external degree of freedom will lose its coherence over time.

The details

Decoherence acts through two distinct mechanisms, each with its own timescale:

T₁ (energy relaxation): The qubit spontaneously emits energy to its environment and transitions from |1⟩ to |0⟩. This is a bit-flip error. Typical values: superconducting qubits achieve T₁ of 100–500 μs; trapped ions can exceed 10 s.

T₂ (dephasing): The relative phase between |0⟩ and |1⟩ in a superposition α|0⟩ + e^{iφ}β|1⟩ is randomized by low-energy environmental fluctuations. The qubit does not flip; it loses the phase structure needed for interference. Dephasing is often the dominant error channel, and T₂ ≤ 2T₁ always holds.
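The two timescales can be compared numerically with the standard exponential decay laws. A minimal sketch, using illustrative values (not any specific device's numbers):

```python
import math

# Assumed illustrative values for a superconducting qubit
T1 = 200e-6   # energy relaxation time, seconds
T2 = 100e-6   # dephasing time, seconds (T2 <= 2*T1 must hold)

def excited_population(t, T1):
    """Probability the qubit is still in |1> after time t (T1 decay)."""
    return math.exp(-t / T1)

def coherence(t, T2):
    """Magnitude of the off-diagonal density-matrix term, relative to t=0."""
    return math.exp(-t / T2)

t = 50e-6  # after 50 microseconds of idling
print(excited_population(t, T1))  # ~0.779: population mostly intact
print(coherence(t, T2))           # ~0.607: phase information decays faster
```

With T₂ < 2T₁, the phase decays faster than the population, which is why dephasing usually dominates the error budget.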

The density matrix formalism describes decoherence precisely. A pure superposition state has off-diagonal terms:

\rho = \begin{pmatrix} |\alpha|^2 & \alpha\beta^* \\ \alpha^*\beta & |\beta|^2 \end{pmatrix}

As decoherence acts, the off-diagonal terms (coherences) decay exponentially with time constant T₂, leaving a mixed state indistinguishable from a classical probability distribution:

\rho_{\text{decohered}} = \begin{pmatrix} |\alpha|^2 & 0 \\ 0 & |\beta|^2 \end{pmatrix}
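A minimal numpy sketch of pure dephasing (T₂ value assumed for illustration): multiply the coherences by e^{-t/T₂} and the density matrix approaches the diagonal mixture while the populations stay fixed:

```python
import numpy as np

alpha, beta = 1 / np.sqrt(2), 1 / np.sqrt(2)  # equal superposition
rho = np.array([[abs(alpha)**2, alpha * np.conj(beta)],
                [np.conj(alpha) * beta, abs(beta)**2]], dtype=complex)

T2 = 100e-6  # assumed dephasing time, seconds

def dephase(rho, t, T2):
    """Decay the off-diagonal coherences by exp(-t/T2); populations untouched."""
    decay = np.exp(-t / T2)
    out = rho.copy()
    out[0, 1] *= decay
    out[1, 0] *= decay
    return out

print(dephase(rho, 0, T2))     # pure state: off-diagonal terms are 0.5
print(dephase(rho, 1e-3, T2))  # t >> T2: off-diagonals ~0, a classical mixture
```

This is a sketch of the dephasing channel only; a full model would also shrink the |1⟩ population on the T₁ timescale.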

Gate budget: If gates take 50 ns and T₂ = 100 μs, you have roughly 2,000 gate operations before the state degrades. Real fault-tolerant algorithms require millions of operations, which is the central motivation for quantum error correction.
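The gate-budget arithmetic above is just a ratio of timescales:

```python
gate_time = 50e-9   # 50 ns per gate (assumed)
T2 = 100e-6         # 100 microsecond dephasing time (assumed)

# Rough number of gates that fit inside one coherence time
budget = T2 / gate_time
print(round(budget))  # 2000
```

This back-of-the-envelope estimate ignores that errors accumulate gradually rather than failing abruptly at t = T₂, but it conveys the scale of the problem.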

Why it matters for learners

Every hardware decision in quantum computing is shaped by decoherence. Superconducting qubits operate at 15 mK (colder than outer space) to suppress thermal noise. Trapped ions are suspended in ultrahigh vacuum and manipulated with lasers because they couple weakly to their environment. Photonic qubits travel through fiber at room temperature but are difficult to entangle.

Decoherence is also why quantum error correction is necessary and difficult. You need to detect and fix errors faster than they accumulate, using additional qubits and gates that themselves introduce errors. The fault-tolerance threshold exists precisely because, below a critical error rate, adding more redundancy suppresses the logical error rate. Above that threshold, adding more qubits makes things worse.

Understanding decoherence helps you interpret real hardware benchmarks. When IBM reports T₂ = 200 μs, or Quantinuum reports two-qubit gate fidelity of 99.9%, these numbers are measuring the battle against decoherence on different fronts.

Common misconceptions

Misconception 1: Decoherence only affects individual qubits. Decoherence also degrades entanglement between qubits. Entangled states are often more fragile than product states because noise affecting one qubit can destroy correlations across the whole register; noise that acts in a correlated way on several qubits at once (collective decoherence) can make this even worse.

Misconception 2: Better cooling alone solves the decoherence problem. Thermal noise is only one source. Charge noise from two-level systems in the substrate, magnetic flux noise from vortices, and electromagnetic radiation from control electronics all contribute to dephasing at temperatures where thermal photons are negligible. Engineers must address all these sources independently.

Misconception 3: Decoherence is the same as measurement. Measurement is an intentional interaction that collapses the qubit and yields a classical outcome. Decoherence is uncontrolled entanglement with the environment that degrades coherence without producing a useful readout. Both collapse superposition, but only measurement gives you information.

See also