Theory of Quantum Computation—PHYS 7895

The field of quantum computation exploded in 1994, when Peter Shor published an algorithm that can break RSA encryption. Since then, physicists, mathematicians, and engineers have been mapping out the ultimate capabilities of quantum computers, and many further quantum algorithms have been discovered. Quantum computation has fundamentally altered our understanding of computation and complexity theory. Furthermore, the breakdown of Moore's law is inevitable (indeed, a former chief architect at Intel recently suggested that it will occur within a decade), and at that point quantum mechanical effects will be unavoidable. The idea of quantum computation is to harness these effects, rather than avoid them, in order to speed up certain computational tasks. If you take this course, you will learn the well-known quantum algorithms for factoring integers and database search, and you will also learn how quantum computation has altered our understanding of computation itself. The only prerequisites are courses in linear algebra and probability theory, which are standard components of any graduate education in electrical engineering, computer science, mathematics, or physics.

Course Syllabus

Office hours: Tuesdays, 3:00-4:00pm, in 447 Nicholson Hall



Lecture 2

  • notion of algorithm / Turing machine
  • Church-Turing thesis
  • computable functions
  • universal Turing machine
  • halting problem and undecidability
Reading Material: Chapter 3 of Nielsen and Chuang
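As a taste of the Turing-machine material above, the formal model can be simulated in a few lines of code. The sketch below is illustrative only (it is not part of the course material, and the example machine and its transition table are my own): a single-tape machine is just a transition table mapping (state, symbol) pairs to (new state, symbol to write, head move). The step limit hints at why a simulator alone cannot decide the halting problem.

```python
# Minimal single-tape Turing machine simulator (illustrative sketch only).
# transitions maps (state, symbol) -> (new_state, write_symbol, move),
# where move is -1 (left), +1 (right), or 0 (stay).

def run_tm(transitions, tape, state="start", blank="_", max_steps=1000):
    """Run the machine until it enters the 'halt' state; return the tape
    contents with surrounding blanks stripped."""
    tape = list(tape)
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            return "".join(tape).strip(blank)
        symbol = tape[head] if head < len(tape) else blank
        if (state, symbol) not in transitions:
            raise RuntimeError("no transition defined: machine rejects")
        state, write, move = transitions[(state, symbol)]
        if head >= len(tape):
            tape.append(blank)          # extend the tape on demand
        tape[head] = write
        head = max(head + move, 0)      # tape is one-way infinite here
    # A step limit is a crude workaround: by the halting problem, no
    # program can decide in general whether the machine ever halts.
    raise RuntimeError("step limit exceeded")

# Example machine: invert every bit, moving right until the blank symbol.
invert = {
    ("start", "0"): ("start", "1", +1),
    ("start", "1"): ("start", "0", +1),
    ("start", "_"): ("halt", "_", 0),
}
```

For example, run_tm(invert, "0110") returns "1001".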

Lecture 1

  • course overview and discussion of potential final project topics

Last modified: January 21, 2016.