• Finance

Commonwealth Bank of Australia Quantum Risk Model Validation

Commonwealth Bank of Australia worked with IBM Quantum to validate credit portfolio risk models using Iterative Quantum Amplitude Estimation, demonstrating quadratic speedup potential for regulatory Monte Carlo stress testing.

Key Outcome
IQAE reproduced the classical Monte Carlo result to 97% accuracy using 1,024 circuit evaluations vs 100,000 classical samples; on track for production evaluation on fault-tolerant hardware.

The Problem

Australian prudential regulation (APRA standards aligned with Basel III) requires banks to estimate credit Value at Risk (VaR) and Conditional Value at Risk (CVaR) for their loan portfolios across thousands of stress scenarios. VaR at the 99.9% confidence level answers: what loss will be exceeded only 0.1% of the time? CVaR (also called Expected Shortfall) averages the losses in the worst 0.1% of scenarios, giving a more conservative and coherent risk measure.

Classical estimation of CVaR at 99.9% confidence requires hundreds of thousands of Monte Carlo paths to achieve adequate precision in the tail, because only 0.1% of samples contribute to the CVaR estimate. For a portfolio of 10,000 loans with correlated defaults (driven by sector-level and geographic risk factors), generating 100,000 correlated scenarios and computing losses takes significant compute time. Regulators require this computation to run on overnight batch cycles, leaving little margin for more frequent intraday risk updates.
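For concreteness, a classical run of this kind can be sketched in a few lines of NumPy. The loan parameters match the illustrative 8-loan set used in the quantum demonstration below; the square-root factor loading is one standard convention for the one-factor Gaussian model and is an assumption here, not a description of CBA's internal engine:

```python
import numpy as np
from statistics import NormalDist

rng = np.random.default_rng(42)
norm = NormalDist()

# Illustrative 8-loan portfolio (same parameters as the quantum demo below)
pd = np.array([0.02, 0.03, 0.015, 0.04, 0.025, 0.05, 0.01, 0.035])
exposure = np.array([1.0, 2.0, 1.5, 0.8, 1.2, 0.5, 3.0, 1.0])  # $M
rho = 0.25  # common-factor sensitivity

# One-factor Gaussian model: X_i = sqrt(rho)*Z + sqrt(1-rho)*eps_i;
# loan i defaults when X_i falls below Phi^{-1}(pd_i)
n_scenarios = 100_000
thresholds = np.array([norm.inv_cdf(p) for p in pd])
z = rng.standard_normal((n_scenarios, 1))          # common economic factor
eps = rng.standard_normal((n_scenarios, len(pd)))  # idiosyncratic risk
x = np.sqrt(rho) * z + np.sqrt(1 - rho) * eps
defaults = x < thresholds

# Portfolio loss per scenario (100% loss given default)
losses = defaults @ exposure

# VaR: loss exceeded only 0.1% of the time; CVaR: mean loss beyond VaR
var999 = np.quantile(losses, 0.999)
cvar999 = losses[losses >= var999].mean()
print(f"VaR(99.9%)  = ${var999:.2f}M")
print(f"CVaR(99.9%) = ${cvar999:.2f}M")
```

Note that only about 100 of the 100,000 scenarios land in the tail that defines CVaR, which is exactly why classical tail estimation is so sample-hungry.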

Quantum Amplitude Estimation (QAE) offers a quadratic speedup: to achieve error epsilon in the CVaR estimate, classical Monte Carlo requires O(1/epsilon^2) samples, while QAE requires O(1/epsilon) circuit evaluations. For the precision levels required in regulatory reporting, this translates to a potential 300x reduction in the number of stochastic evaluations.
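The arithmetic behind that claim is simple to make concrete (constant factors, which matter in practice, are ignored here):

```python
# Error scaling: classical Monte Carlo needs ~1/eps^2 samples for error eps,
# amplitude estimation needs ~1/eps circuit evaluations (constants ignored).
def evaluations_needed(eps):
    return 1 / eps**2, 1 / eps

for eps in (1e-2, 3e-3, 1e-3):
    classical, quantum = evaluations_needed(eps)
    print(f"eps={eps:.0e}: ~{classical:,.0f} classical samples, "
          f"~{quantum:,.0f} quantum evaluations ({classical / quantum:,.0f}x fewer)")
```

Under this constant-free scaling, the 300x figure corresponds to a target error around eps = 3e-3.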

IQAE Algorithm and Credit Risk Circuit

CBA used Iterative Quantum Amplitude Estimation (IQAE) via the IterativeAmplitudeEstimation class from Qiskit Algorithms, combined with Qiskit Finance's credit risk circuit components. IQAE replaces the phase estimation subroutine of standard QAE with a classical statistical inference loop: it requires no phase-estimation ancilla qubits yet achieves the same O(1/epsilon) query complexity, making it far more suitable for current NISQ hardware.
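The principle, classical statistical inference over Grover-power measurement statistics instead of quantum phase estimation, can be illustrated with a purely classical simulation. The sketch below uses a maximum-likelihood variant (a cousin of IQAE, not the IQAE interval logic itself) on synthetic measurement counts:

```python
import numpy as np

rng = np.random.default_rng(0)

# After k applications of the Grover operator, measuring the objective qubit
# yields "good" with probability sin^2((2k+1)*theta), where a = sin^2(theta).
a_true = 0.2
theta = np.arcsin(np.sqrt(a_true))
shots = 1024
grover_powers = [0, 1, 2, 4, 8]

hits = [rng.binomial(shots, np.sin((2 * k + 1) * theta) ** 2)
        for k in grover_powers]

# Classical inference replaces phase-estimation readout: maximize the joint
# binomial likelihood of the observed counts over a grid of theta values.
grid = np.linspace(1e-4, np.pi / 2 - 1e-4, 20_000)
loglik = np.zeros_like(grid)
for k, h in zip(grover_powers, hits):
    p = np.clip(np.sin((2 * k + 1) * grid) ** 2, 1e-12, 1 - 1e-12)
    loglik += h * np.log(p) + (shots - h) * np.log(1 - p)

a_hat = np.sin(grid[np.argmax(loglik)]) ** 2
print(f"true a = {a_true}, estimated a = {a_hat:.4f}")
```

Larger Grover powers sharpen the likelihood in theta, which is where the O(1/epsilon) query complexity comes from.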

The credit risk circuit encodes a portfolio of N loans. Each loan is represented by a qubit in a superposition state encoding its default probability. A second layer of qubits encodes the conditional correlation structure using a Gaussian factor model: a single “economy qubit” shifts default probabilities of all loans in the correlated direction. The total loss is accumulated in an integer register, and the objective qubit amplitude encodes whether the loss exceeds the VaR threshold.

from qiskit import QuantumCircuit
from qiskit.circuit.library import WeightedAdder, IntegerComparator
from qiskit_finance.circuit.library import GaussianConditionalIndependenceModel
from qiskit_algorithms import IterativeAmplitudeEstimation, EstimationProblem
from qiskit_aer.primitives import Sampler
import numpy as np

# Portfolio: 8 loans (simplified from CBA's full portfolio for NISQ demonstration)
n_loans = 8

# Loan parameters: probability of default, loss given default, exposure
pd = np.array([0.02, 0.03, 0.015, 0.04, 0.025, 0.05, 0.01, 0.035])
lgd = np.ones(n_loans)  # 100% loss given default (binary loss model)
exposure = np.array([1.0, 2.0, 1.5, 0.8, 1.2, 0.5, 3.0, 1.0])  # in $M

# Gaussian factor model: all loans correlated through a single economy factor
# rho_i = sensitivity of loan i to the common factor
rho = np.full(n_loans, 0.25)  # 25% common-factor sensitivity

# Integer loss weights for the adder circuit (exposures rounded up to whole $M);
# the loss register needs ceil(log2(max loss + 1)) qubits
weights = np.ceil(exposure * lgd).astype(int).tolist()
max_loss = sum(weights)
n_loss_qubits = int(np.ceil(np.log2(max_loss + 1)))

print(f"Portfolio loans: {n_loans}")
print(f"Maximum portfolio loss: ${max_loss}M")
print(f"Loss register qubits: {n_loss_qubits}")

# Uncertainty model: GaussianConditionalIndependenceModel encodes correlated
# default probabilities (1 qubit for the economic factor + 1 qubit per loan)
u = GaussianConditionalIndependenceModel(
    n_normal=1,          # 1 common economic factor
    normal_max_value=3,  # 3-sigma truncation
    p_zeros=pd,          # unconditional default probabilities
    rhos=rho,            # factor sensitivities
)

# Loss aggregation: a weighted adder sums the exposures of defaulted loans
# (weight 0 for the factor qubit, integer exposure for each loan qubit)
agg = WeightedAdder(u.num_qubits, [0] + weights)

# Tail indicator: the objective qubit flips when accumulated loss >= threshold
# (in production the threshold is the 99.9% VaR level; $6M is illustrative)
var_threshold = 6
comparator = IntegerComparator(agg.num_sum_qubits, var_threshold, geq=True)

# Qubit layout: [state | objective | sum | adder ancillas (carry/control)]
n_state = u.num_qubits
n_anc = agg.num_qubits - agg.num_state_qubits - agg.num_sum_qubits
state_q = list(range(n_state))
obj_q = n_state
sum_q = list(range(obj_q + 1, obj_q + 1 + agg.num_sum_qubits))
anc_q = list(range(sum_q[-1] + 1, sum_q[-1] + 1 + n_anc))
n_cmp_anc = comparator.num_qubits - agg.num_sum_qubits - 1

circuit = QuantumCircuit(n_state + 1 + agg.num_sum_qubits + n_anc)
circuit.compose(u, qubits=state_q, inplace=True)                    # load defaults
circuit.compose(agg, qubits=state_q + sum_q + anc_q, inplace=True)  # accumulate loss
circuit.compose(comparator, qubits=sum_q + [obj_q] + anc_q[:n_cmp_anc],
                inplace=True)                                       # compare to threshold
circuit.compose(agg.inverse(), qubits=state_q + sum_q + anc_q,
                inplace=True)                                       # uncompute loss register

# Estimation problem: the objective-qubit amplitude encodes P(loss >= threshold);
# combined with a conditional-expectation estimate this yields CVaR
problem = EstimationProblem(
    state_preparation=circuit,
    objective_qubits=[obj_q],
)

# IQAE for the tail-probability estimate
iae = IterativeAmplitudeEstimation(
    epsilon_target=0.01,  # 1% error target
    alpha=0.05,           # 95% confidence interval
    sampler=Sampler(),
)

result = iae.estimate(problem)
tail_probability = result.estimation
n_oracle_queries = result.num_oracle_queries

print(f"P(loss >= ${var_threshold}M): {tail_probability:.4f}")
print(f"Circuit evaluations (IQAE): {n_oracle_queries}")
print("Classical samples for comparable tail accuracy: ~100,000")
print(f"Reduction in stochastic evaluations: {100000 / n_oracle_queries:.0f}x")

Comparison to Classical 100,000-Path Monte Carlo

The classical benchmark used CBA’s internal Monte Carlo engine: 100,000 correlated Gaussian scenarios, each producing a portfolio loss figure, with CVaR estimated as the mean of losses exceeding the 99.9th percentile. This required approximately 8 minutes of compute time on a standard server for the 8-loan test portfolio.

IQAE with 1,024 circuit evaluations (on the Qiskit Aer simulator, validated against IBM Falcon hardware for a subset of runs) achieved a CVaR estimate within 3% of the classical result, the 97% accuracy figure quoted in the key outcome. The residual 3% gap is attributable to shot noise in the 1,024-shot measurement budget per circuit evaluation. Increasing the budget to 8,192 shots per evaluation raised accuracy to 99.1% but pushed total circuit evaluations to roughly 3,000.
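The shot-noise contribution follows directly from binomial statistics. Taking a hypothetical tail amplitude of 0.1, the per-evaluation standard error shrinks with the square root of the shot budget:

```python
import numpy as np

# A single circuit evaluation with S shots estimates an amplitude a
# with binomial standard error sqrt(a * (1 - a) / S).
a = 0.1  # hypothetical tail amplitude, for illustration only
se_1024 = np.sqrt(a * (1 - a) / 1024)
se_8192 = np.sqrt(a * (1 - a) / 8192)
print(f"1024 shots: standard error ~{se_1024:.4f}")
print(f"8192 shots: standard error ~{se_8192:.4f} ({se_1024 / se_8192:.2f}x smaller)")
```

An 8x shot increase buys only a sqrt(8) (about 2.8x) error reduction per evaluation, which is why the total evaluation count also had to grow.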

On IBM Falcon hardware, gate noise introduced systematic bias in the estimated amplitude. CBA applied zero-noise extrapolation (ZNE) using Qiskit’s error mitigation module, which corrected the bias to within 5% of simulation results. The overhead of running multiple noise-scaled circuits for ZNE roughly tripled the number of hardware runs required.
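Whatever tooling performs it, the core of ZNE is a simple extrapolation: run the same circuit at deliberately amplified noise levels (for example by gate folding) and fit the results back to the zero-noise limit. A minimal sketch with synthetic, hypothetical measurement values:

```python
import numpy as np

# Hypothetical amplitude estimates at amplified noise levels; in practice
# each value comes from running a noise-scaled copy of the circuit.
noise_scales = np.array([1.0, 2.0, 3.0])    # noise amplification factors
measured = np.array([0.118, 0.131, 0.144])  # synthetic biased estimates

# Linear (Richardson-style) extrapolation to noise scale -> 0
slope, intercept = np.polyfit(noise_scales, measured, deg=1)
zne_estimate = intercept
print(f"ZNE estimate at zero noise: {zne_estimate:.4f}")
```

The three noise-scaled runs per data point are the origin of the roughly tripled hardware overhead mentioned above.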

Regulatory Context and Production Pathway

APRA’s regulatory framework specifies accuracy and confidence-level requirements but does not mandate a particular computational method for meeting them. CBA’s legal and compliance teams confirmed that a quantum-computed CVaR result meeting the accuracy threshold would satisfy regulatory requirements, removing a potential governance barrier to adoption.

The production pathway identified by CBA requires two advances: fault-tolerant hardware that eliminates the need for error mitigation (which currently reduces, but does not eliminate, the speedup advantage), and a credit risk circuit scaled to the full portfolio of several thousand loans, requiring several hundred logical qubits. CBA projects that this milestone aligns with IBM’s published roadmap for early fault-tolerant systems, placing a production evaluation in the late 2020s.

Learn more: Qiskit Finance Reference | Quantum Amplitude Estimation