- External
- Advanced
- Free
Quantum Computation (MIT OpenCourseWare)
One of the most rigorous freely available treatments of quantum computation, taught by Peter Shor, the mathematician who invented the factoring algorithm that bears his name. Lecture notes and problem sets cover the full graduate syllabus.
MIT’s 18.435J is a graduate-level course that treats quantum computation as a branch of applied mathematics. The emphasis is on mathematical proof and formal complexity theory rather than physical implementation. Reading through the notes and working the problem sets gives a depth of understanding that most shorter courses cannot match.
What you’ll learn
- Linear algebra foundations specific to quantum computation: Dirac notation, density matrices, partial trace, and quantum channels stated precisely
- Quantum complexity classes: BQP (bounded-error quantum polynomial time) and its relationship to P, NP, and PSPACE, including oracle separations
- Grover’s search algorithm in full mathematical depth, including the optimality proof that any quantum search algorithm requires Ω(√N) oracle queries, so Grover’s O(√N) cannot be improved
- Shor’s factoring algorithm: the quantum period-finding circuit, the quantum Fourier transform over cyclic groups, and the number-theoretic reduction from period-finding to factoring large integers
- The hidden subgroup problem: the general framework that unifies Shor, Simon, and Deutsch-Jozsa, and what is known about non-abelian instances
- Quantum error correction: stabiliser codes, the Knill-Laflamme conditions for correctability, CSS codes, and fault-tolerant computation
- Quantum cryptography: quantum key distribution, including the BB84 protocol and its security proof
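The Grover material above is worth previewing concretely. The sketch below (a toy statevector simulation in numpy, not taken from the course notes) runs the oracle-plus-diffusion iteration on N = 16 items and shows that roughly (π/4)·√N iterations suffice:

```python
import numpy as np

# Toy statevector simulation of Grover search over N = 16 items with one
# marked index. All names here are illustrative, not from the course.
N = 16
marked = 3

# Start in the uniform superposition over all N basis states.
state = np.full(N, 1 / np.sqrt(N))

def grover_iteration(v):
    """One Grover step: oracle phase flip, then inversion about the mean."""
    v = v.copy()
    v[marked] *= -1          # oracle: flip the sign of the marked amplitude
    return 2 * v.mean() - v  # diffusion operator: reflect about the average

# The optimal iteration count is about (pi/4) * sqrt(N) -- here, 3.
k = int(round(np.pi / 4 * np.sqrt(N)))
for _ in range(k):
    state = grover_iteration(state)

success_prob = state[marked] ** 2
print(k, success_prob)  # success probability is above 0.96 after 3 iterations
```

The optimality proof covered in the course shows this √N speedup is a ceiling, not a floor: no quantum algorithm, however clever, can find the marked item with fewer than Ω(√N) oracle queries.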
Course structure
The materials are lecture notes, not video lectures. Notes are dense and mathematical, covering roughly one major topic per lecture. Problem sets accompany each section with solutions available, making self-study tractable.
The course opens with quantum mechanics formalism and proceeds through quantum circuits, before spending the majority of its time on algorithms and complexity. Error correction and cryptography occupy the latter portion of the course.
Because there are no videos, working through this course requires genuine engagement with the written proofs. This is a feature rather than a limitation: the skill of reading and verifying mathematical arguments is essential for contributing to research.
Who is this for?
- Graduate students in computer science or mathematics pursuing quantum computing research
- Researchers in adjacent fields who want a rigorous reference for algorithm proofs
- Anyone who has found other quantum courses too superficial and wants the genuine graduate-level treatment
- Self-learners with strong mathematical backgrounds comfortable working from lecture notes
Prerequisites
Strong linear algebra is non-negotiable: eigenvalues, unitarity, tensor products, and spectral decomposition must be fluent. Classical computational complexity at the level of P, NP, and polynomial reductions is required for the complexity sections. Prior exposure to quantum mechanics or quantum computing at any level is helpful but not strictly required given strong mathematical preparation.
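As a quick self-check on those prerequisites, each line of the following numpy sketch (illustrative, not course material) should read as obviously true before starting the notes:

```python
import numpy as np

# Self-check on the linear-algebra prerequisites, using the Hadamard gate.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

# Unitarity: H† H = I (H is real, so H† is just the transpose).
assert np.allclose(H.conj().T @ H, np.eye(2))

# Tensor products: H applied to each of two qubits is the 4x4 matrix H ⊗ H.
HH = np.kron(H, H)
assert HH.shape == (4, 4)

# Spectral decomposition: H = Σ_i λ_i |v_i><v_i| with eigenvalues ±1.
eigvals, eigvecs = np.linalg.eigh(H)  # eigh: H is Hermitian
reconstructed = sum(l * np.outer(v, v.conj())
                    for l, v in zip(eigvals, eigvecs.T))
assert np.allclose(reconstructed, H)
assert np.allclose(sorted(eigvals), [-1.0, 1.0])
```

If any step here required looking something up, a linear algebra refresher before the course will pay for itself many times over.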
Hands-on practice
The problem sets are the core of this course. They include:
- Formal proofs of algorithm correctness and query complexity lower bounds
- Construction of quantum circuits for specific computational tasks
- Analysis of quantum error-correcting codes from generator sets
- Working through the number-theoretic steps in Shor’s algorithm for small examples
- Derivations of complexity class inclusions and oracle separations
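The number-theoretic steps in Shor’s algorithm are easy to work by hand for small N, as the problem sets ask. A minimal sketch for N = 15 (with the period found by classical brute force, standing in for the quantum period-finding circuit):

```python
from math import gcd

# Classical post-processing of Shor's algorithm, worked for N = 15, a = 7.
# In the real algorithm the period r comes from the quantum circuit; here
# N is tiny, so we find it by iterating a^r mod N directly.
def find_period(a, N):
    """Smallest r >= 1 with a^r ≡ 1 (mod N)."""
    r, x = 1, a % N
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

N, a = 15, 7
assert gcd(a, N) == 1       # a must be coprime to N, else gcd already factors N
r = find_period(a, N)       # powers of 7 mod 15: 7, 4, 13, 1 -> r = 4

# When r is even and a^(r/2) is not -1 mod N, the gcds are nontrivial factors.
assert r % 2 == 0
p = gcd(pow(a, r // 2) - 1, N)
q = gcd(pow(a, r // 2) + 1, N)
print(r, p, q)  # 4 3 5
```

Tracing exactly this reduction, and then seeing why period finding is the one step that genuinely needs a quantum computer, is the heart of the Shor lectures.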
Solutions to problem sets are available on the OCW site, which makes self-assessment possible without an instructor.
Why take this course?
Most online quantum computing resources are either too shallow or too focused on a specific framework like Qiskit or PennyLane. This course provides the mathematical foundation that makes it possible to read primary research literature, evaluate algorithm claims, and develop new ideas.
The pedigree is unmatched: the lecture notes were written or supervised by Peter Shor, who invented the most famous quantum algorithm in existence. The mathematical treatment reflects how professional quantum computing researchers actually think about these problems.
For anyone serious about quantum algorithms research or quantum complexity theory, this is required reading. The free availability on OCW makes it accessible regardless of institutional affiliation.