Theory of Quantum Computation—PHYS 7348
The field of quantum computation exploded in 1994 when Peter Shor published his factoring algorithm, which can break RSA encryption. Since then, physicists, mathematicians, and engineers have been working out the ultimate capabilities of quantum computers, and many quantum algorithms have been developed along the way. Quantum computation has now fundamentally altered our understanding of computation and complexity theory. Furthermore, Moore's law will inevitably break down, and at that point quantum mechanical effects will be unavoidable. The idea of quantum computation is to harness these effects (rather than avoid them) in order to speed up certain computational tasks. If you take this course, you will learn the well-known quantum algorithms for factoring integers and database search, and you will also learn how quantum computation has altered our understanding of computation. The only prerequisites are courses in linear algebra and probability theory (standard components of any graduate education in electrical engineering, computer science, mathematics, or physics).
Course Syllabus
Office hours: TBA
Homeworks
Lectures
All lectures from Spring 2022 are available on YouTube.
Lectures on quantum error correction: Lecture 1, Lecture 2, Lecture 3
- linear combination of unitaries for Hamiltonian simulation
- applications of QSVT to Hamiltonian simulation and matrix inversion
- linear combination of unitaries (see the sketch after this list)
- quantum eigenvalue transformation
- generalization to QSVT
- methods for block encoding
- overview of quantum singular value transformation (QSVT)
- quantum signal processing of qubit unitaries
- qubitization
- revisiting amplitude amplification
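For orientation, here is a minimal sketch of the linear-combination-of-unitaries construction and the block encoding it produces, two objects that recur throughout the topics above; the notation (PREP, SELECT, the weights alpha_j, and the normalization lambda) is mine and is not taken from the lecture notes:
\[
A \;=\; \sum_{j=0}^{m-1} \alpha_j\, U_j , \qquad \alpha_j > 0 , \qquad \lambda \;=\; \sum_j \alpha_j ,
\]
\[
\mathrm{PREP}\,|0\rangle \;=\; \frac{1}{\sqrt{\lambda}} \sum_j \sqrt{\alpha_j}\,|j\rangle ,
\qquad
\mathrm{SELECT} \;=\; \sum_j |j\rangle\langle j| \otimes U_j ,
\]
\[
\bigl(\langle 0|\,\mathrm{PREP}^{\dagger} \otimes I\bigr)\,\mathrm{SELECT}\,\bigl(\mathrm{PREP}\,|0\rangle \otimes I\bigr) \;=\; \frac{A}{\lambda} .
\]
In words, the unitary PREP-dagger · SELECT · PREP contains A/lambda in its upper-left block (the part flagged by the ancilla state |0⟩); amplitude amplification and QSVT then act on that block.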
Last modified: March 05, 2023.