- Fundamentals
- Also: Noisy Intermediate-Scale Quantum
- Also: NISQ era
NISQ
Noisy Intermediate-Scale Quantum, the current era of quantum computing, characterised by devices with tens to a few thousand physical qubits, no error correction, and limited circuit depth.
NISQ stands for Noisy Intermediate-Scale Quantum. John Preskill coined the term in his 2018 paper “Quantum Computing in the NISQ Era and Beyond,” and it stuck because it described exactly where the field was: quantum computers that exist and can do things, but that are too noisy for error correction and too small for the algorithms that would matter most.
The defining property is not the qubit count; it is the absence of error correction. NISQ machines run computations directly on physical qubits, with all their noise and imperfection, and hope the results are useful before errors overwhelm the signal.
The details
NISQ machines have three defining characteristics:
Noisy: Gate error rates are typically in the range of 10⁻³ to 10⁻² per operation. Two-qubit gates are noisier than single-qubit gates, often by an order of magnitude. Readout errors add further noise. With no error correction active, the accumulated error grows roughly linearly with circuit depth, so circuit fidelity decays exponentially.
Intermediate-scale: Current devices range from tens to a few thousand physical qubits. IBM’s Heron processor (2023) has 133 qubits; their Condor (2023) has 1,121. Google’s Willow (2024) has 105. These numbers are too small and too noisy for fault-tolerant algorithms like Shor’s, but large enough for exploratory experiments.
Limited circuit depth: The useful circuit depth is bounded by coherence time divided by gate time. For superconducting qubits with coherence times on the order of 100 μs and two-qubit gate times around 100 ns, the gate budget is roughly 10³ operations before noise dominates. Real algorithms often require millions of operations.
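The gate-budget arithmetic above can be sketched numerically. The coherence time, gate time, and error rate below are illustrative assumptions in the ballpark of current superconducting devices, not specifications of any particular machine:

```python
# Rough NISQ gate-budget estimate: how many gates fit inside the
# coherence window, and how fast circuit fidelity decays with depth.
# All numbers are illustrative assumptions, not device specs.

coherence_time_s = 100e-6   # assumed coherence time ~ 100 microseconds
gate_time_s = 100e-9        # assumed two-qubit gate time ~ 100 nanoseconds
two_qubit_error = 5e-3      # assumed per-gate error rate

# Upper bound from coherence alone: depth <= T_coherence / t_gate.
gate_budget = round(coherence_time_s / gate_time_s)

def circuit_fidelity(depth, p=two_qubit_error):
    # Probability a depth-d circuit runs error-free if each gate
    # independently succeeds with probability (1 - p).
    return (1 - p) ** depth

print(f"gate budget: ~{gate_budget} operations")
print(f"fidelity at depth 100:  {circuit_fidelity(100):.3f}")
print(f"fidelity at depth 1000: {circuit_fidelity(1000):.6f}")
```

The exponential decay in the last two lines is why "errors accumulate with depth" is the binding constraint: even a modest per-gate error rate leaves almost no signal at the depths real algorithms need.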
The algorithms proposed for NISQ hardware are designed around these constraints:
- VQE (Variational Quantum Eigensolver): Hybrid quantum-classical loops with short circuits for quantum chemistry
- QAOA (Quantum Approximate Optimization Algorithm): Short-circuit optimization heuristics
- Quantum machine learning: Various proposals for quantum-enhanced learning, though classical competition is fierce
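The hybrid quantum-classical loop behind VQE can be shown on a toy problem. Here a plain NumPy calculation stands in for the quantum backend (this is a sketch of the loop structure, not a real device run); the one-qubit Hamiltonian H = Z and the Ry ansatz are chosen so the exact answer is known:

```python
# Minimal VQE-style hybrid loop on a toy one-qubit problem.
# Hamiltonian H = Z; ansatz |psi(theta)> = Ry(theta)|0>, so the
# energy is <Z> = cos(theta) and the exact ground state is -1 at theta = pi.
import numpy as np

Z = np.array([[1, 0], [0, -1]], dtype=float)

def ansatz(theta):
    # Ry(theta)|0> = [cos(theta/2), sin(theta/2)]
    return np.array([np.cos(theta / 2), np.sin(theta / 2)])

def energy(theta):
    # On hardware this would be an expectation value estimated from shots.
    psi = ansatz(theta)
    return psi @ Z @ psi

def gradient(theta):
    # Parameter-shift rule: the gradient from two extra circuit evaluations.
    return 0.5 * (energy(theta + np.pi / 2) - energy(theta - np.pi / 2))

theta = 0.4                    # arbitrary starting parameter
for step in range(200):        # classical optimizer: plain gradient descent
    theta -= 0.1 * gradient(theta)

print(f"optimal theta ~ {theta:.3f} (exact: pi = {np.pi:.3f})")
print(f"ground-state energy ~ {energy(theta):.4f} (exact: -1)")
```

The NISQ-friendly feature is that each quantum circuit is short (here, a single rotation); the iteration burden is pushed onto the classical optimizer.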
The honest assessment of NISQ utility is mixed. Classical algorithms have improved faster than expected, and improved classical simulation methods have repeatedly matched NISQ performance on proposed benchmark problems. The question "will NISQ find a genuinely useful application before fault-tolerant hardware arrives?" remains open as of 2026.
Why it matters for learners
Understanding the NISQ concept helps you read quantum computing news critically. Claims of quantum speedup, quantum advantage, and breakthrough results need to be evaluated in light of NISQ constraints:
- Is the benchmark designed specifically to be hard for classical computers, or does it solve a useful real-world problem?
- Is the circuit depth within the NISQ gate budget, or does it require error correction that does not yet exist?
- Have the results been validated by independent groups, or are they vendor-produced benchmarks?
NISQ is also the context in which most quantum courses and tutorials operate. When you write Qiskit circuits and run them on IBM’s real hardware, you are working on a NISQ device. Understanding the noise model, error rates, and circuit depth limits is essential for interpreting real hardware results versus simulator results.
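The gap between simulator and hardware results can be illustrated with a toy noise model. The sketch below uses a single-qubit depolarizing channel applied once per gate layer; the noise rate is an illustrative assumption, not a calibrated device parameter, and real devices have richer noise (crosstalk, readout error, non-Markovian effects):

```python
# Toy model of simulator-vs-hardware disagreement: a qubit prepared
# in |+> and measured in X, with a depolarizing channel applied once
# per gate layer. Noise rate is an assumed value for illustration.
import numpy as np

plus = np.array([[0.5, 0.5], [0.5, 0.5]])    # density matrix of |+>
X = np.array([[0, 1], [1, 0]], dtype=float)
I2 = np.eye(2)

def depolarize(rho, p):
    # Depolarizing channel: rho -> (1 - p) rho + p I/2
    return (1 - p) * rho + p * I2 / 2

def expectation_x(depth, p):
    rho = plus
    for _ in range(depth):
        rho = depolarize(rho, p)
    return np.trace(rho @ X).real

# Ideal simulator (p = 0) vs noisy hardware-like run (p = 0.01):
for depth in (0, 10, 100):
    ideal = expectation_x(depth, 0.0)
    noisy = expectation_x(depth, 0.01)
    print(f"depth {depth:3d}: ideal <X> = {ideal:.3f}, noisy <X> = {noisy:.3f}")
```

The ideal expectation stays at 1 regardless of depth, while the noisy one shrinks by a factor of (1 − p) per layer: exactly the kind of divergence you should expect when moving a circuit from a simulator to a real backend.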
The transition from NISQ to fault-tolerant quantum computing is the central goal of the field, and understanding NISQ limitations is what makes the engineering challenges of FTQC legible.
Common misconceptions
Misconception 1: NISQ computers are useless. They are useful for research, education, and exploring near-term algorithms. Some quantum chemistry results on small molecules show genuine agreement with experiment. The claim is not that NISQ is useless; it is that achieving quantum advantage on a practically useful problem has proven harder than initial predictions suggested.
Misconception 2: More qubits means better NISQ performance. On NISQ hardware, adding more qubits often introduces more noise through crosstalk and longer connectivity paths, which can make results worse for some circuits. The quality of each qubit and the connectivity pattern matter as much as total count.
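The connectivity cost can be made concrete with a simplified routing model. Assuming nearest-neighbor coupling on a line of qubits, a standard SWAP decomposition of 3 two-qubit gates, and an illustrative error rate (all assumptions, not data from a real device):

```python
# Why connectivity matters on NISQ hardware: with only nearest-neighbor
# coupling, a two-qubit gate between distant qubits must be routed with
# SWAPs, and each SWAP decomposes into 3 two-qubit gates.
# The error rate is an assumed illustrative value.

p = 5e-3  # assumed two-qubit gate error rate

def long_range_gate_fidelity(distance, p=p):
    # distance 1 = adjacent qubits. Route with (distance - 1) SWAPs in,
    # apply the gate, then (distance - 1) SWAPs to restore the layout.
    n_two_qubit_gates = 1 + 2 * 3 * (distance - 1)
    return (1 - p) ** n_two_qubit_gates

for d in (1, 2, 5, 10):
    print(f"distance {d:2d}: effective gate fidelity "
          f"{long_range_gate_fidelity(d):.3f}")
```

Under this model a gate between qubits ten sites apart costs 55 two-qubit operations, so adding qubits without improving connectivity or per-gate quality can make a circuit's overall fidelity worse, which is the point of the misconception above.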
Misconception 3: NISQ will become FTQC just by improving existing hardware. Fault-tolerant quantum computing requires not just better qubits but a fundamentally different architecture: active syndrome extraction, real-time classical decoding, and fault-tolerant gate compilations. This represents a qualitative architectural shift, not a gradual improvement of the same design.