Born Machine
A Born machine is a generative quantum model where the output probability distribution is defined by the Born rule (probability proportional to squared amplitude), enabling quantum-native generative modeling for tasks like drug discovery and financial simulation.
Classical generative models (variational autoencoders (VAEs), generative adversarial networks (GANs), and diffusion models) learn to produce samples from a target distribution by optimizing a parameterized network. A Born machine pursues the same goal but represents the distribution as |psi(theta)|^2, the squared modulus of a parameterized quantum state. Because quantum states can represent exponentially large probability distributions with polynomially many parameters (the circuit parameters), Born machines are theoretically capable of expressing distributions that are hard to represent classically. A canonical example is the output distribution of a deep random circuit: computing its probability mass function is #P-hard, and sampling from it classically is believed to be intractable under standard complexity assumptions.
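The Born-rule correspondence between amplitudes and probabilities can be illustrated in a few lines of NumPy. This is a minimal sketch using a hypothetical random 2-qubit state, not a trained model:

```python
import numpy as np

# A random (hypothetical) 2-qubit state vector: 4 complex amplitudes.
rng = np.random.default_rng(0)
psi = rng.normal(size=4) + 1j * rng.normal(size=4)
psi /= np.linalg.norm(psi)  # normalize so |psi| = 1

# Born rule: the probability of measuring basis state x is |psi_x|^2.
probs = np.abs(psi) ** 2

# Measuring all qubits draws a computational basis state from this
# distribution; classically we emulate that with np.random.choice.
samples = rng.choice(4, size=1000, p=probs)
```

On hardware, `probs` is never written down explicitly; only the measurement samples are available, which is why training must be sample-based.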
The quantum circuit Born machine (QCBM) architecture instantiates this idea as a variational quantum circuit. An n-qubit circuit with L layers of parameterized rotations and entangling gates is applied to |0…0>, producing a state whose 2^n computational basis amplitudes define the probability distribution. Samples are drawn by measuring all qubits. The circuit parameters are optimized to minimize a divergence between the model distribution and the training data distribution. Because the Born machine outputs samples (not density estimates), training requires estimators that work from samples on both sides: the classical dataset and the quantum model.
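As a rough sketch of this architecture, the following NumPy statevector simulation builds a hardware-efficient ansatz and returns the Born-rule distribution over all 2^n bitstrings. The specific gate set (RY rotation layers followed by a CNOT ladder) is an illustrative assumption, not a fixed part of the QCBM definition:

```python
import numpy as np

def ry(theta):
    """Single-qubit RY rotation matrix."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

def apply_single(state, gate, qubit, n):
    """Apply a 2x2 gate to one qubit of an n-qubit statevector."""
    state = state.reshape([2] * n)
    state = np.moveaxis(state, qubit, 0)
    state = np.tensordot(gate, state, axes=([1], [0]))
    state = np.moveaxis(state, 0, qubit)
    return state.reshape(-1)

def apply_cnot(state, control, target, n):
    """Apply CNOT: flip the target qubit where the control is |1>."""
    state = state.reshape([2] * n).copy()
    idx = [slice(None)] * n
    idx[control] = 1  # select the control = |1> subspace
    axis = target - 1 if target > control else target
    state[tuple(idx)] = np.flip(state[tuple(idx)], axis=axis)
    return state.reshape(-1)

def qcbm_probs(thetas, n, layers):
    """Born-rule distribution of an (assumed) RY + CNOT-ladder ansatz
    applied to |0...0>, with one rotation angle per qubit per layer."""
    state = np.zeros(2 ** n, dtype=complex)
    state[0] = 1.0  # start in |0...0>
    thetas = np.asarray(thetas).reshape(layers, n)
    for l in range(layers):
        for q in range(n):
            state = apply_single(state, ry(thetas[l, q]), q, n)
        for q in range(n - 1):
            state = apply_cnot(state, q, q + 1, n)
    return np.abs(state) ** 2  # probabilities over 2^n basis states
```

Sampling the model then amounts to drawing bitstrings from `qcbm_probs(...)`; on a device, the same samples come directly from repeated measurement, and the exponentially large probability vector is never materialized.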
Training Born machines typically uses maximum mean discrepancy (MMD) or other kernel-based, sample-based divergences rather than KL divergence, because KL divergence requires evaluating the model’s probability at each data point, which is exponentially expensive for a quantum model (each probability must be estimated from repeated measurements). MMD compares the two distributions through their mean embeddings in a reproducing kernel Hilbert space, requiring only samples from each. The gradient of MMD with respect to circuit parameters can be estimated using the parameter-shift rule, enabling gradient-based optimization entirely from measurement outcomes. Recent work has also explored training via adversarial setups where a classical discriminator distinguishes quantum samples from real data.
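A minimal sample-based MMD estimator, together with a parameter-shift gradient for a single circuit parameter, might look like the following sketch. The Gaussian kernel on integer-encoded bitstrings and the user-supplied `sample_from(theta)` sampler are illustrative assumptions; the gradient formula combines kernel averages over samples from the shifted circuits, as is common in the QCBM training literature:

```python
import numpy as np

def gaussian_kernel(x, y, sigma=1.0):
    """Gaussian kernel on integer-encoded bitstrings (illustrative choice)."""
    d = (x[:, None] - y[None, :]).astype(float)
    return np.exp(-d ** 2 / (2 * sigma ** 2))

def mmd2(model_samples, data_samples, sigma=1.0):
    """Squared MMD estimated purely from samples of each distribution."""
    kxx = gaussian_kernel(model_samples, model_samples, sigma).mean()
    kyy = gaussian_kernel(data_samples, data_samples, sigma).mean()
    kxy = gaussian_kernel(model_samples, data_samples, sigma).mean()
    return kxx + kyy - 2 * kxy

def mmd_grad_i(sample_from, theta, i, data_samples, sigma=1.0):
    """Parameter-shift gradient of squared MMD w.r.t. theta[i].

    `sample_from(theta)` is an assumed black-box sampler (e.g. the QCBM
    circuit run at those angles); only sampling from the +/- pi/2 shifted
    circuits is required, never the model's explicit probabilities.
    """
    shift = np.zeros_like(theta)
    shift[i] = np.pi / 2
    x = sample_from(theta)           # samples at the current angles
    xp = sample_from(theta + shift)  # samples at theta_i + pi/2
    xm = sample_from(theta - shift)  # samples at theta_i - pi/2
    k = lambda a, b: gaussian_kernel(a, b, sigma).mean()
    return k(xp, x) - k(xm, x) - k(xp, data_samples) + k(xm, data_samples)
```

Because every term is a kernel mean over measurement samples, the whole training loop runs from outcomes alone; the parameter shift of pi/2 assumes gates generated by half-Pauli rotations such as RY.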
Applications under active investigation include molecular conformation generation for drug discovery (where the distribution over torsion angles and bond lengths is high-dimensional and multimodal), and financial scenario generation (where realistic joint distributions of asset returns are notoriously hard to fit classically due to fat tails and non-Gaussian correlations). Compared to classical VAEs, Born machines do not require an encoder network and avoid the posterior collapse problem. Compared to diffusion models, they do not require a denoising chain. The practical advantage over classical generative models remains an open research question for near-term hardware, but Born machines provide a natural interface between quantum circuits and probabilistic machine learning.