- Fundamentals
- Also: QST
Quantum State Tomography
Quantum state tomography is the process of fully reconstructing an unknown quantum state's density matrix by performing measurements in multiple bases on many identical copies, at a cost that grows exponentially with the number of qubits.
A quantum state contains far more information than any single measurement can reveal. A single projective measurement on a qubit yields one bit (0 or 1) and collapses the state, destroying all other information. To learn the full state, a researcher must prepare many identical copies of the state and measure each copy in a different basis. For a single qubit, measuring the expectation values of the three Pauli operators (X, Y, Z) on separate copies is sufficient: these three real numbers, together with the normalization condition, determine the 2x2 density matrix completely. The density matrix is rho = (I + r_x X + r_y Y + r_z Z)/2, where the Bloch vector (r_x, r_y, r_z) is recovered from the mean values of the three measurement settings; each setting typically requires hundreds to thousands of shots for adequate statistical precision.
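The single-qubit reconstruction above can be sketched in a few lines; this is a minimal illustration assuming noise-free estimates of the three Pauli expectation values (the function names are illustrative):

```python
import numpy as np

# Pauli operators and identity.
I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def bloch_vector(rho):
    """Recover (r_x, r_y, r_z) as Tr(rho * P) for each Pauli P."""
    return tuple(np.real(np.trace(rho @ P)) for P in (X, Y, Z))

def reconstruct_qubit(rx, ry, rz):
    """Rebuild rho = (I + r_x X + r_y Y + r_z Z) / 2 from the Bloch vector."""
    return (I2 + rx * X + ry * Y + rz * Z) / 2

# Example: the |+> state has Bloch vector (1, 0, 0).
plus = np.array([[0.5, 0.5], [0.5, 0.5]], dtype=complex)
rho_hat = reconstruct_qubit(*bloch_vector(plus))
print(np.allclose(rho_hat, plus))  # True
```

In a real experiment each component of the Bloch vector would be a shot-averaged estimate rather than an exact trace, so the reconstruction carries statistical error.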
Exponential scaling
For n qubits, the density matrix is a 2^n x 2^n Hermitian positive semi-definite matrix with unit trace. This matrix has 4^n - 1 independent real parameters. Full tomography requires measuring expectation values of all 4^n - 1 independent Pauli operators (tensor products of I, X, Y, Z on each qubit), with each setting requiring many shots. For 1 qubit this is 3 settings. For 2 qubits it is 15. For 10 qubits it is over 1 million. Each additional qubit quadruples the number of required measurements, making full tomography practical only for small systems. In current experiments, full tomography is routinely performed up to about 6 to 8 qubits and has been demonstrated up to 10 qubits with significant experimental effort. Beyond that, the number of required state preparations and the classical post-processing cost become prohibitive.
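The 4^n - 1 settings can be enumerated directly as Pauli strings; a short sketch that reproduces the counts quoted above:

```python
# Enumerate the measurement settings for full n-qubit Pauli tomography:
# all tensor products of {I, X, Y, Z} except the trivial all-identity string.
from itertools import product

def pauli_settings(n):
    """Return the 4^n - 1 nontrivial n-qubit Pauli strings."""
    labels = [''.join(p) for p in product('IXYZ', repeat=n)]
    return [s for s in labels if s != 'I' * n]

print(len(pauli_settings(1)))  # 3
print(len(pauli_settings(2)))  # 15
print(4**10 - 1)               # 1048575 settings for 10 qubits
```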
Scalable alternatives
Two approaches make tomographic-style characterization feasible for larger systems. Compressed sensing exploits the fact that states produced by near-ideal quantum devices are low-rank: the density matrix has most of its weight on a small number of eigenvectors. By assuming low rank and using convex optimization (nuclear norm minimization), compressed sensing can reconstruct a rank-r density matrix of an n-qubit system from O(r * 2^n * log^2(2^n)) Pauli expectation values rather than 4^n, a large saving when r is small. Shadow tomography, introduced by Scott Aaronson and later made practically efficient by Huang, Kueng, and Preskill as classical shadows, goes further. It uses randomized measurements (random Pauli or Clifford bases) to build a classical representation of the state that allows estimation of M target observables from only O(log M) measurements, up to precision- and observable-dependent factors, bypassing the need for full density matrix reconstruction. Classical shadows are now widely used in variational algorithms and quantum chemistry benchmarking.
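A minimal single-qubit sketch of the classical-shadows idea, using the random-Pauli-basis variant rather than full Clifford measurements (all function names are illustrative, and this toy version reconstructs the state estimator directly rather than streaming observables):

```python
import numpy as np

rng = np.random.default_rng(0)

I2 = np.eye(2, dtype=complex)
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
Sdg = np.array([[1, 0], [0, -1j]], dtype=complex)
# Basis-change unitaries: measure X, Y, or Z by rotating into the Z basis.
BASES = [H, H @ Sdg, I2]

def shadow_estimate(rho, n_snapshots):
    """Average single-shot snapshots 3|psi><psi| - I into an estimator of rho."""
    est = np.zeros((2, 2), dtype=complex)
    for _ in range(n_snapshots):
        U = BASES[rng.integers(3)]                      # random Pauli basis
        probs = np.real(np.diag(U @ rho @ U.conj().T))  # Born probabilities
        b = rng.choice(2, p=probs / probs.sum())        # simulated outcome
        ket = U.conj().T[:, b].reshape(2, 1)            # post-measurement state U†|b>
        # Inverse of the single-qubit depolarizing channel applied per shot.
        est += 3 * (ket @ ket.conj().T) - I2
    return est / n_snapshots

# Demo: estimate <Z> of |0> from 20000 snapshots; the result is close to 1.
rho0 = np.array([[1, 0], [0, 0]], dtype=complex)
Zop = np.array([[1, 0], [0, -1]], dtype=complex)
print(np.real(np.trace(Zop @ shadow_estimate(rho0, 20000))))
```

Each snapshot is wildly inaccurate on its own, but the snapshots are unbiased, so averages of their traces against target observables converge; the multi-qubit version replaces the factor 3 with the appropriate inverse channel per qubit.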
Role in gate benchmarking
Quantum state tomography is used to characterize quantum operations by preparing a known input state, applying the gate under test, and performing tomography on the output. Comparing the measured output density matrix to the ideal output gives the state fidelity and reveals the structure of errors (coherent rotations, depolarization, amplitude damping). Process tomography extends the same idea to full gate characterization, requiring 4^n input states and 4^n output tomographies for a complete reconstruction of the quantum channel, making it even more expensive than state tomography. Despite the resource cost, tomography remains the gold standard for detailed characterization of small-scale quantum devices where understanding the exact error structure matters more than speed.
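When the ideal output is a pure state |psi>, the fidelity comparison described above reduces to F = <psi|rho|psi>; a brief sketch with a hypothetical depolarizing error model (the error strength p is purely illustrative):

```python
import numpy as np

def pure_state_fidelity(psi, rho):
    """Fidelity F = <psi|rho|psi> between a pure ideal state and a measured rho."""
    psi = psi.reshape(-1, 1)
    return float(np.real(psi.conj().T @ rho @ psi))

# Ideal Hadamard output on |0> is |+>; compare to a slightly
# depolarized "measured" state standing in for tomography data.
plus = np.array([1, 1], dtype=complex) / np.sqrt(2)
ideal = np.outer(plus, plus.conj())
p = 0.05  # hypothetical depolarizing strength
measured = (1 - p) * ideal + p * np.eye(2) / 2
print(pure_state_fidelity(plus, measured))  # 0.975
```

A fidelity below 1 only signals that an error occurred; it is the full reconstructed density matrix that distinguishes, say, a coherent over-rotation from depolarizing noise.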