edX Machine Learning for Semiconductor Quantum Devices
  • 6–7 hours per week
  • advanced
  • $185

Machine Learning for Semiconductor Quantum Devices

★★★★★ 4.6/5 provider rating • 6–7 hours per week • By Delft University of Technology (QuTech)

Explore the intersection of artificial intelligence and quantum hardware. This course teaches how to use machine learning to automate the control and calibration of semiconductor quantum chips - one of the most practically significant bottlenecks on the path to scalable quantum computing.

Part of the Quantum 301: Quantum Computing with Semiconductor Technology professional certificate. Taught by QuTech researchers who apply these techniques to real quantum devices in their own laboratory work.

What you’ll learn

  • The auto-tuning problem: why manually calibrating a quantum dot qubit requires setting tens of parameters simultaneously, and why this does not scale
  • Charge stability diagrams in depth: what they are, how they are measured, and what machine learning needs to classify in them
  • Supervised classification: training convolutional neural networks to identify charge occupancy regimes from stability diagram images
  • Active learning: intelligently selecting which measurements to make next in order to build a model efficiently from limited data
  • Bayesian optimisation: using Gaussian process surrogate models to search for optimal gate voltages without exhaustive grid search
  • Reinforcement learning: training agents to adaptively control qubits by treating calibration as a sequential decision problem
  • Model validation: how to evaluate whether an ML-based calibration system is working correctly and robustly
  • Integration into hardware control stacks: how ML tools fit into the full quantum hardware software infrastructure
  • Open-source tools and datasets: the software libraries and public datasets used in semiconductor qubit ML research

Course structure

The course runs at six to seven hours per week. Because it assumes both machine learning knowledge and quantum hardware context, it moves directly into material at the research frontier rather than spending time on fundamentals.

The first module frames the problem carefully: a current state-of-the-art Ge qubit device has dozens of gate voltages that must all be set correctly for each qubit to operate. As qubit counts grow, the calibration space grows exponentially. The module includes real examples of how long manual calibration takes and why it is already a rate-limiting step in experiments.
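To make the scaling argument concrete, here is a back-of-the-envelope estimate of what exhaustive grid search over gate voltages would cost. The gate count and per-gate resolution below are illustrative assumptions, not figures from the course:

```python
# Illustrative only: cost of naive grid search over gate voltages.
# With d gate voltages, each discretised into k candidate settings,
# an exhaustive sweep must evaluate k**d combinations.
def grid_search_cost(num_gates: int, settings_per_gate: int) -> int:
    return settings_per_gate ** num_gates

# Even a modest device with ~8 relevant gate voltages, swept coarsely
# at 50 settings each, is already far beyond any measurement budget.
cost = grid_search_cost(8, 50)
print(f"{cost:.2e}")  # ~3.9e13 measurements
```

This is why the course treats calibration as a search and learning problem rather than a sweep.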

The classification modules cover the stability diagram recognition problem in depth: what features must be identified, what training data looks like, how convolutional neural networks are applied, and how to deal with distribution shift between different devices.
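As a sketch of the core operation in those modules, here is a minimal convolution-plus-pooling forward pass in NumPy on a synthetic stability-diagram-sized image. The weights are randomly initialised and the image size, filter count, and two-class output are illustrative assumptions; the course's exercises train real networks in a deep learning framework on labelled diagrams:

```python
import numpy as np

rng = np.random.default_rng(0)

def conv2d(img, kernels):
    """Valid 2-D convolution: (H, W) image with (F, kh, kw) kernels -> (F, H-kh+1, W-kw+1)."""
    f, kh, kw = kernels.shape
    h, w = img.shape
    out = np.zeros((f, h - kh + 1, w - kw + 1))
    for i in range(out.shape[1]):
        for j in range(out.shape[2]):
            patch = img[i:i + kh, j:j + kw]
            out[:, i, j] = (kernels * patch).sum(axis=(1, 2))
    return out

def max_pool(x, size=2):
    f, h, w = x.shape
    x = x[:, : h - h % size, : w - w % size]
    return x.reshape(f, h // size, size, w // size, size).max(axis=(2, 4))

def softmax(z):
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

# Synthetic 16x16 "stability diagram" patch; 4 random 3x3 filters.
image = rng.standard_normal((16, 16))
filters = rng.standard_normal((4, 3, 3)) * 0.1

hidden = np.maximum(conv2d(image, filters), 0.0)   # ReLU activation
pooled = max_pool(hidden)                          # (4, 7, 7) feature maps
features = pooled.ravel()                          # flatten to a 196-dim vector
weights = rng.standard_normal((2, features.size)) * 0.05
probs = softmax(weights @ features)                # two charge-regime classes
print(pooled.shape, probs)
```

The same pipeline shape (convolve, pool, flatten, classify) underlies the stability-diagram classifiers discussed in the course, just with trained weights and more layers.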

Bayesian optimisation receives its own module because it is the most widely used technique in current quantum device tuning pipelines. You learn Gaussian processes as surrogate models, acquisition functions (expected improvement, upper confidence bound), and how to implement a basic Bayesian optimisation loop.
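To make that loop concrete, here is a minimal Bayesian optimisation sketch in NumPy: a Gaussian process with an RBF kernel as the surrogate, an upper-confidence-bound acquisition, and a 1-D toy "qubit quality vs. gate voltage" objective. The objective function, kernel length scale, and candidate grid are illustrative assumptions, not course material:

```python
import numpy as np

def quality(v):
    """Toy objective: a qubit quality metric peaking at gate voltage 0.3 (arbitrary units)."""
    return np.exp(-((v - 0.3) ** 2) / 0.01)

def rbf(a, b, length=0.1):
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / length ** 2)

def gp_posterior(x_obs, y_obs, x_cand):
    """GP posterior mean and std at candidate points (zero prior mean, unit prior variance)."""
    k_oo = rbf(x_obs, x_obs) + 1e-8 * np.eye(len(x_obs))   # jitter for stability
    k_oc = rbf(x_obs, x_cand)
    solve = np.linalg.solve(k_oo, np.column_stack([y_obs, k_oc]))
    mu = k_oc.T @ solve[:, 0]
    var = 1.0 - np.einsum("ij,ij->j", k_oc, solve[:, 1:])
    return mu, np.sqrt(np.clip(var, 1e-12, None))

x_cand = np.linspace(0.0, 1.0, 201)          # candidate gate voltages
x_obs = np.array([0.0, 0.5, 1.0])            # initial measurements
y_obs = quality(x_obs)

for _ in range(15):                          # Bayesian optimisation loop
    mu, sigma = gp_posterior(x_obs, y_obs, x_cand)
    ucb = mu + 2.0 * sigma                   # upper-confidence-bound acquisition
    x_next = x_cand[np.argmax(ucb)]          # measure where the UCB is highest
    x_obs = np.append(x_obs, x_next)
    y_obs = np.append(y_obs, quality(x_next))

best_v = x_obs[np.argmax(y_obs)]
print(f"best voltage = {best_v:.3f}, quality = {y_obs.max():.3f}")
```

Eighteen evaluations locate the peak that a fine grid sweep would need hundreds of measurements to find; swapping UCB for expected improvement only changes the acquisition line.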

The reinforcement learning module covers the adaptive control framing: treating gate voltage adjustment as a Markov decision process and training an agent to optimise qubit quality metrics.
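A minimal tabular sketch of that framing is below: states are discretised voltage settings, actions nudge the voltage up or down, and the reward is a toy quality metric. The environment, reward shape, and hyperparameters are illustrative assumptions; the course works with a richer qubit control simulator:

```python
import numpy as np

rng = np.random.default_rng(1)

N_STATES = 21          # discretised gate-voltage settings
TARGET = 12            # setting where the qubit quality metric peaks
ACTIONS = (-1, +1)     # decrease / increase the voltage by one step

def reward(state):
    """Toy quality metric: highest at the target voltage."""
    return float(np.exp(-((state - TARGET) / 3.0) ** 2))

q = np.zeros((N_STATES, len(ACTIONS)))
alpha, gamma, eps = 0.5, 0.9, 0.2

for _ in range(2000):                       # training episodes
    s = int(rng.integers(N_STATES))
    for _ in range(30):                     # steps per episode
        # Epsilon-greedy action selection over the two voltage moves.
        a = int(rng.integers(2)) if rng.random() < eps else int(np.argmax(q[s]))
        s_next = int(np.clip(s + ACTIONS[a], 0, N_STATES - 1))
        r = reward(s_next)
        # Q-learning update: bootstrap from the greedy value of the next state.
        q[s, a] += alpha * (r + gamma * q[s_next].max() - q[s, a])
        s = s_next

# Greedy rollout from the lowest voltage: the trained agent climbs to the peak.
s = 0
for _ in range(30):
    s = int(np.clip(s + ACTIONS[int(np.argmax(q[s]))], 0, N_STATES - 1))
print("final setting:", s)
```

The point of the MDP framing is visible even at this scale: the agent learns a policy that adapts to wherever it starts, instead of replaying a fixed calibration script.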

Who is this for?

  • Machine learning engineers and data scientists who want to apply their skills to quantum technology
  • Quantum hardware researchers who want to deploy ML in their calibration workflows
  • Graduate students working at the intersection of ML and experimental quantum computing
  • Anyone pursuing the Quantum 301 professional certificate from Delft University of Technology

Prerequisites

Solid machine learning foundations are required: supervised learning, neural networks, model evaluation, and some exposure to probabilistic methods. Python programming is needed for all exercises. Quantum hardware context from The Hardware of a Quantum Computer or the Germanium Technologies course is important - the course assumes you understand what a charge stability diagram is and why you need to classify it. This is an advanced course combining two demanding disciplines.

Hands-on practice

All exercises are Python-based with realistic quantum device datasets:

  • Train a classifier to identify charge regimes in stability diagram images using PyTorch or similar
  • Implement a Gaussian process model and run Bayesian optimisation for gate voltage search
  • Build a basic reinforcement learning agent and train it on a qubit control simulator
  • Validate model performance using cross-device generalisation tests
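As a sketch of what a cross-device generalisation test measures, here is a toy version using a nearest-centroid classifier on synthetic two-feature data, where an offset between devices stands in for distribution shift. The features, offsets, and classifier are illustrative assumptions, far simpler than the course's CNN-based exercises:

```python
import numpy as np

rng = np.random.default_rng(2)

def make_device(offset, n=200):
    """Synthetic 2-feature samples for two charge regimes on one device.

    `offset` models device-to-device distribution shift (illustrative)."""
    class0 = rng.normal([0.0, 0.0], 0.4, size=(n, 2)) + offset
    class1 = rng.normal([3.0, 3.0], 0.4, size=(n, 2)) + offset
    x = np.vstack([class0, class1])
    y = np.array([0] * n + [1] * n)
    return x, y

def fit_centroids(x, y):
    return np.stack([x[y == c].mean(axis=0) for c in (0, 1)])

def accuracy(centroids, x, y):
    d = np.linalg.norm(x[:, None, :] - centroids[None, :, :], axis=2)
    return float((d.argmin(axis=1) == y).mean())

x_a, y_a = make_device(offset=np.zeros(2))            # training device
x_b, y_b = make_device(offset=np.array([2.5, 2.5]))   # shifted device

centroids = fit_centroids(x_a, y_a)
acc_a = accuracy(centroids, x_a, y_a)   # same-device accuracy
acc_b = accuracy(centroids, x_b, y_b)   # cross-device accuracy
print(f"same-device: {acc_a:.2f}, cross-device: {acc_b:.2f}")
```

A model that looks perfect on its training device can degrade sharply on a new one, which is exactly the failure mode the cross-device tests in the exercises are designed to expose.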

Assessment includes graded programming assignments. The datasets are structured like those used in actual QuTech laboratory calibration workflows.

Why take this course?

The qubit calibration problem is one of the most practically significant bottlenecks on the path to scalable quantum computing. Every major quantum hardware effort - Google, IBM, Intel, IQM, QuTech - invests heavily in automated tuning. ML engineers who understand both the machine learning and the quantum hardware context are exceptionally rare and in increasing demand.

This course provides that specific combination, grounded in the techniques used by active researchers at one of the world’s leading quantum hardware groups. The combination of ML expertise with quantum device knowledge is a career differentiator in the quantum technology industry.
