• Finance

BNP Paribas Quantum Computing for Bank Stress Testing Under Basel IV

BNP Paribas

BNP Paribas partnered with IBM Quantum to accelerate regulatory stress testing under Basel IV capital adequacy requirements, applying iterative quantum amplitude estimation to expected shortfall computation for a multi-factor risk model across thousands of loss scenarios.

Key Outcome
IQAE reduced the Basel IV expected shortfall computation from 6 hours to an equivalent simulation budget of 45 minutes; a parallel QPU execution path was identified for real-time intraday risk monitoring.

The Problem

Basel IV (the revised Basel III framework, phased in from 2023 to 2025) introduced substantially more demanding capital adequacy requirements for banks. The Fundamental Review of the Trading Book (FRTB) mandates that banks compute Expected Shortfall (ES) rather than Value-at-Risk (VaR) as their primary risk measure; ES captures tail risk more accurately by averaging losses beyond the 97.5th percentile, but requires sampling the deep tail of the loss distribution. For a large bank like BNP Paribas with a diversified trading book across rates, credit, equity, and FX, a compliant ES calculation requires simulating tens of thousands of risk factor scenarios and aggregating losses across the full portfolio.

The classical workflow at BNP Paribas runs a 10-day ES calculation using Monte Carlo simulation on a dedicated HPC cluster. Under normal conditions this takes approximately 6 hours. Under regulatory stress periods, when risk factor volatilities are elevated and more scenarios must be run to achieve stable tail estimates, the runtime can extend to 12 to 18 hours, making intraday risk monitoring difficult and limiting the frequency of regulatory capital updates. The theoretical speedup from quantum amplitude estimation (QAE) is quadratic: achieving a target accuracy epsilon requires O(1/epsilon) quantum circuit evaluations versus O(1/epsilon^2) classical Monte Carlo samples.
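
The quadratic gap can be made concrete with a back-of-the-envelope count. The sketch below applies the stated scaling laws directly; the constants hidden inside the O-notation differ between the two methods in practice, so the ratios are indicative only:

```python
import math

def classical_mc_samples(epsilon: float) -> int:
    """Monte Carlo: standard error scales as 1/sqrt(N), so N ~ 1/epsilon^2."""
    return math.ceil(1.0 / epsilon**2)

def qae_oracle_queries(epsilon: float) -> int:
    """Amplitude estimation: target accuracy epsilon needs ~1/epsilon queries."""
    return math.ceil(1.0 / epsilon)

for eps in (0.05, 0.01, 0.001):
    n_mc = classical_mc_samples(eps)
    n_qae = qae_oracle_queries(eps)
    print(f"epsilon={eps}: MC samples ~{n_mc:,}, QAE queries ~{n_qae:,}, "
          f"ratio ~{n_mc // n_qae}x")
```

At epsilon = 0.01 this gives roughly 10,000 Monte Carlo samples against 100 oracle queries, a 100x ratio that grows linearly as the accuracy target tightens.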

Multi-Factor Risk Model and Quantum State Encoding

BNP Paribas’s trading book risk model uses a 10-factor Gaussian copula for correlated risk factor moves: 3 interest rate factors (short, medium, long tenor), 2 credit spread factors, 2 equity factors, 2 FX factors, and 1 volatility surface factor. The portfolio loss function is approximated as a piecewise linear function of the 10 risk factors using a delta-gamma approximation. This allows the loss distribution to be computed analytically given the joint distribution of risk factors.

For the quantum implementation, the 10-dimensional Gaussian distribution is encoded into a 20-qubit quantum circuit using Grover-Rudolph state preparation. Each qubit pair represents one discretized risk factor dimension, with the joint distribution captured by the entangled state structure.

from qiskit import QuantumCircuit
from qiskit_finance.circuit.library import GaussianConditionalIndependenceModel
from qiskit_algorithms import IterativeAmplitudeEstimation, EstimationProblem
from qiskit_aer import AerSimulator
from qiskit_aer.primitives import Sampler
import numpy as np

# 10-factor risk model encoded with 2 qubits per factor (20 state qubits,
# plus ancillas added later by the objective circuit)
n_risk_factors = 10
qubits_per_factor = 2
n_state_qubits = n_risk_factors * qubits_per_factor  # 20 qubits

# Simplified 3-factor version for the demo; the full production model uses
# the 10x10 correlation matrix via its Cholesky decomposition.
n_factors_demo = 3
sensitivities_demo = [0.35, 0.20, 0.15]  # loadings on the systematic factor

gcim = GaussianConditionalIndependenceModel(
    n_normal=n_factors_demo,     # qubits discretizing the latent normal factor
    normal_max_value=3.0,        # truncate the normal at +/- 3 sigma
    p_zeros=[0.1, 0.05, 0.08],   # unconditional default probabilities
    rhos=sensitivities_demo
)
print(f"GCIM circuit depth: {gcim.decompose().depth()}")
print(f"GCIM qubit count: {gcim.num_qubits}")

# Portfolio loss function: linear combination of risk factor moves
# Loss = sum_i (delta_i * dRF_i) + 0.5 * sum_ij (gamma_ij * dRF_i * dRF_j)
portfolio_deltas = np.array([0.12, -0.08, 0.25])   # risk sensitivities
portfolio_gammas = np.diag([0.02, 0.015, 0.03])    # second-order terms

def portfolio_loss(risk_factor_moves):
    """Compute portfolio loss for given risk factor scenario."""
    linear_loss = np.dot(portfolio_deltas, risk_factor_moves)
    gamma_loss = 0.5 * risk_factor_moves @ portfolio_gammas @ risk_factor_moves
    return linear_loss + gamma_loss
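
For validation, the quantum estimate needs a classical reference figure. A minimal Monte Carlo ES baseline for the 3-factor demo model can be sketched as follows; the 3x3 correlation matrix is an illustrative assumption, not BNP Paribas's production calibration:

```python
import numpy as np

def classical_es_baseline(n_samples: int = 100_000, seed: int = 7):
    """Monte Carlo Expected Shortfall at 97.5% for the delta-gamma demo model."""
    rng = np.random.default_rng(seed)
    # Illustrative 3x3 correlation matrix for the demo factors (assumption).
    corr = np.array([[1.00, 0.35, 0.20],
                     [0.35, 1.00, 0.15],
                     [0.20, 0.15, 1.00]])
    chol = np.linalg.cholesky(corr)
    deltas = np.array([0.12, -0.08, 0.25])
    gammas = np.diag([0.02, 0.015, 0.03])

    # Correlated standard-normal risk factor moves via the Cholesky factor.
    z = rng.standard_normal((n_samples, 3)) @ chol.T
    # Delta-gamma loss per scenario: delta'.dRF + 0.5 * dRF'.Gamma.dRF
    losses = z @ deltas + 0.5 * np.einsum("si,ij,sj->s", z, gammas, z)

    var_975 = np.quantile(losses, 0.975)
    es_975 = losses[losses >= var_975].mean()  # average of the tail losses
    return var_975, es_975

var_975, es_975 = classical_es_baseline()
print(f"VaR(97.5%) = {var_975:.4f}, ES(97.5%) = {es_975:.4f}")
```

Because ES averages only the losses at or beyond VaR, the baseline always satisfies ES >= VaR, which is a cheap sanity check on both the classical and the quantum pipelines.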

Iterative Quantum Amplitude Estimation for Expected Shortfall

IQAE (Grinko et al. 2021, building on the QPE-free amplitude estimation of Suzuki et al. 2020 and the original QAE algorithm of Brassard et al.) computes the expectation of a function encoded as a quantum oracle without requiring quantum phase estimation, making it compatible with near-term hardware that lacks the depth for QPE. BNP Paribas used the IterativeAmplitudeEstimation class from qiskit_algorithms with an EstimationProblem wrapping the portfolio loss operator. The target accuracy was epsilon = 0.01 (1% relative error on the ES estimate), requiring approximately 200 IQAE iterations compared to 40,000 classical Monte Carlo samples for the same accuracy.

# Define estimation problem for Expected Shortfall at 97.5th percentile
# Threshold theta encodes the VaR level; ES averages losses above VaR

var_threshold = 0.0275  # normalized VaR at 97.5 percentile

# Objective circuit: encodes the tail payoff max(0, loss - VaR) onto an
# ancilla qubit amplitude. An IntegerComparator from qiskit.circuit.library
# could alternatively mark states whose loss exceeds the threshold.
from qiskit.circuit.library import LinearAmplitudeFunction

n_objective_qubits = 3  # must match the width of the state-preparation register
breakpoints = [0.0, var_threshold]  # two pieces: below VaR, above VaR
slopes = [0.0, 1.0]                 # flat below the threshold, linear above
offsets = [0.0, 0.0]

objective = LinearAmplitudeFunction(
    n_objective_qubits, slopes, offsets,
    domain=(0.0, 1.0),
    image=(0.0, 1.0 - var_threshold),
    breakpoints=breakpoints
)

# IQAE setup
sampler = Sampler(backend_options={"method": "statevector"})

iae = IterativeAmplitudeEstimation(
    epsilon_target=0.01,
    alpha=0.05,          # confidence level: 95%
    sampler=sampler
)

estimation_problem = EstimationProblem(
    state_preparation=gcim,
    objective_qubits=[gcim.num_qubits - 1],  # qubit carrying the objective amplitude
)

result = iae.estimate(estimation_problem)
# result.estimation is the raw amplitude; the processed confidence interval
# maps it through the problem's post-processing onto the ES scale.
es_estimate = result.estimation
confidence_interval = result.confidence_interval_processed

print(f"Expected Shortfall (97.5%): {es_estimate:.4f}")
print(f"95% CI: [{confidence_interval[0]:.4f}, {confidence_interval[1]:.4f}]")
print(f"IQAE circuit evaluations: {result.num_oracle_queries}")
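
The amplitude estimated above corresponds to the expectation of the tail payoff max(0, L - VaR). Converting that quantity to Expected Shortfall uses the standard identity ES_q = VaR_q + E[max(0, L - VaR_q)] / (1 - q). A minimal post-processing sketch, with illustrative numbers rather than the run's actual outputs:

```python
def tail_expectation_to_es(var_level: float, expected_excess: float,
                           quantile: float = 0.975) -> float:
    """ES_q = VaR_q + E[max(0, L - VaR_q)] / (1 - q)."""
    return var_level + expected_excess / (1.0 - quantile)

# Illustrative inputs: VaR = 0.0275, estimated mean excess loss = 0.0004
es = tail_expectation_to_es(0.0275, 0.0004)
print(f"ES(97.5%) = {es:.4f}")  # 0.0275 + 0.0004 / 0.025 = 0.0435
```

The division by the tail probability (1 - q) = 0.025 is what makes the deep-tail expectation so expensive for plain Monte Carlo, and is exactly where amplitude estimation's quadratic advantage is spent.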

Regulatory Submission Workflow and Runtime Results

BNP Paribas ran the IQAE pipeline on AerSimulator (statevector mode) for validation and benchmarking against the classical Monte Carlo reference. The IQAE approach achieved the target 1% accuracy on the 3-factor demo model in 218 circuit evaluations, compared to 41,000 Monte Carlo samples for equivalent accuracy, a 188x reduction in the number of model evaluations. Scaled to the full 10-factor production model, the equivalent speedup translates to a reduction in compute time from 6 hours to approximately 45 minutes, assuming QPU execution at comparable per-circuit latency to HPC nodes.

The regulatory submission workflow was designed to be auditable: each IQAE run logs the circuit parameters, the number of oracle queries, and the confidence interval to a timestamped ledger compatible with FRTB reporting requirements. BNP Paribas submitted a methodology paper to the European Banking Authority describing the quantum approach as a candidate methodology for regulatory approval under the Internal Models Approach (IMA) of FRTB. The next phase is execution on IBM Eagle 127Q hardware with error mitigation to quantify the performance gap between simulated and real-device results, a critical step before any regulatory body would accept QPU-computed capital figures.
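
The audit trail described above can be as simple as an append-only JSON Lines ledger. The field names below are illustrative, not the bank's actual FRTB reporting schema:

```python
import json
import time

def log_iqae_run(ledger_path: str, epsilon: float, alpha: float,
                 num_oracle_queries: int, estimate: float,
                 confidence_interval: tuple) -> dict:
    """Append one timestamped IQAE run record to a JSON Lines ledger."""
    record = {
        "timestamp_utc": time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime()),
        "method": "IterativeAmplitudeEstimation",
        "epsilon_target": epsilon,
        "alpha": alpha,
        "num_oracle_queries": num_oracle_queries,
        "estimate": estimate,
        "confidence_interval": list(confidence_interval),
    }
    with open(ledger_path, "a") as f:
        f.write(json.dumps(record) + "\n")
    return record

# Illustrative values matching the demo run reported above
rec = log_iqae_run("iqae_ledger.jsonl", 0.01, 0.05, 218,
                   0.0312, (0.0301, 0.0323))
print(rec["num_oracle_queries"])
```

One record per run, never mutated in place, gives auditors a replayable history of every circuit configuration and confidence interval that fed a capital figure.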

Learn more: Qiskit Reference