Lecture Abstracts
NMR Quantum Information Processing
Jonathan Baugh
Nuclear magnetic resonance (NMR) was the first testbed system
in which many of the fundamental ideas of QIP were physically
implemented, such as error correction, teleportation, and benchmarking
of quantum control with non-trivial numbers of qubits (recently
up to 12). The "language" of NMR QIP, which translates
quantum algorithms into RF pulses and spin evolutions, can be
applied in some analogous form to nearly every potential implementation,
and therefore provides a useful conceptual basis for understanding
QIP experiments in a wide variety of systems. I will provide an
introduction to both NMR and NMR QIP, from the rotating frame and
the concept of resonance to the implementation of simple algorithms
such as Deutsch-Jozsa.
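For readers new to NMR, here is the standard rotating-frame calculation
that the lecture starts from (a textbook sketch in one common convention
with \hbar = 1, not taken from the lecture itself). A single spin-1/2
with Larmor frequency \omega_0, driven by an RF field of amplitude
\omega_1 and frequency \omega_{rf}, has the lab-frame Hamiltonian

    H_{lab} = \omega_0 I_z + 2\omega_1 \cos(\omega_{rf} t)\, I_x .

Transforming to a frame rotating at \omega_{rf} about the z axis and
dropping the rapidly oscillating terms (the rotating-wave approximation)
gives

    H_{rot} \approx (\omega_0 - \omega_{rf})\, I_z + \omega_1 I_x ,

so on resonance (\omega_{rf} = \omega_0) the spin simply nutates about
the x axis at the Rabi frequency \omega_1; the phase and duration of the
RF pulse then set the axis and angle of a single-qubit rotation.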
Advances and prospects in QIP implementations
Jonathan Baugh
The last few years have witnessed an explosion of compelling
experimental advances driven by the ultimate goal of realizing
quantum computers. I will give a partial survey of recent progress
across several fields, including quantum dots, ion traps and superconducting
qubits. Major advances and current challenges will be highlighted.
Quantum Error Correction
Daniel Gottesman
Errors are likely to be a serious problem for quantum computers,
both because they are built from small components and because
qubits are inherently more vulnerable to error than classical bits,
owing to processes such as decoherence. Consequently, to build a large
quantum computer, we will likely need quantum error-correcting
codes, which split up quantum states among a number of qubits
in such a way that it is possible to correct for small errors.
I will give an overview of the theory of quantum error correction.
I will cover Shor's 9-qubit code, stabilizer codes, and CSS codes.
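As a warm-up for the codes listed above, here is the bit-flip
repetition code, one of the two layers inside Shor's 9-qubit code (a
standard textbook example, included here for concreteness). A logical
qubit is encoded as

    \alpha|0\rangle + \beta|1\rangle \;\mapsto\; \alpha|000\rangle + \beta|111\rangle .

Measuring the stabilizer generators Z_1 Z_2 and Z_2 Z_3 reveals which
single qubit, if any, has suffered a bit flip, without disturbing
\alpha and \beta, and the error is undone by applying X to that qubit.
Shor's code concatenates this with the analogous 3-qubit phase-flip
code (the same construction in the Hadamard-rotated basis) to correct
an arbitrary error on any single one of its 9 qubits.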
Hybrid quantum error prevention, reduction,
and correction methods
Daniel Lidar
These two talks will provide an introduction to decoherence-free
subspaces, noiseless subsystems, dynamical decoupling, and hybrid
methods in which they are combined. The emphasis will be on the
underlying unifying symmetry principles which enable quantum errors
to be avoided by encoding. The talks will cover both the theoretical
background and the experimental state of the art.
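To make the symmetry principle concrete, here is a minimal textbook
example (not necessarily the one used in the talks): suppose two
qubits dephase collectively, so the noise acts as

    U(\phi) = \exp\!\big(-i\,\phi\,(Z_1 + Z_2)/2\big)

with an unknown common phase \phi. The states |01\rangle and
|10\rangle are both annihilated by Z_1 + Z_2, so U(\phi) acts
trivially on their span; encoding a logical qubit as
|0_L\rangle = |01\rangle, |1_L\rangle = |10\rangle therefore avoids
this error entirely. This two-qubit decoherence-free subspace is the
simplest instance of the symmetry-based encodings the talks describe.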
Quantum Key Distribution: Theory and
Practice
Norbert Lutkenhaus
Quantum key distribution solves a central problem in cryptography.
In this lecture I will outline the key-distribution problem and
show how quantum mechanics is used to solve it. Although these
protocols are usually formulated abstractly in terms of qubits, I
will show how they can be implemented securely with simple optical
devices such as laser sources and threshold photodetectors.
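As a concrete, hedged illustration of the qubit-level protocol, here
is a minimal classical simulation of BB84 key sifting under ideal,
eavesdropper-free conditions (the variable names and the choice of 20
signals are illustrative only):

    import random

    n = 20  # number of transmitted signals (illustrative)

    # Alice picks random key bits and random preparation bases
    # (0 = rectilinear, 1 = diagonal).
    alice_bits = [random.randint(0, 1) for _ in range(n)]
    alice_bases = [random.randint(0, 1) for _ in range(n)]

    # Bob picks random measurement bases.
    bob_bases = [random.randint(0, 1) for _ in range(n)]

    # With ideal devices and no eavesdropper, Bob's outcome equals
    # Alice's bit whenever the bases match, and is random otherwise.
    bob_bits = [a if ab == bb else random.randint(0, 1)
                for a, ab, bb in zip(alice_bits, alice_bases, bob_bases)]

    # Sifting: bases are compared publicly; only matching positions are kept.
    key_alice = [a for a, ab, bb in zip(alice_bits, alice_bases, bob_bases) if ab == bb]
    key_bob = [b for b, ab, bb in zip(bob_bits, alice_bases, bob_bases) if ab == bb]

    assert key_alice == key_bob  # identical raw keys in this idealized setting
    print(key_alice)

A real implementation must handle exactly what this sketch idealizes
away: attenuated laser sources that sometimes emit multi-photon
pulses, lossy channels, threshold detectors, and the error correction
and privacy amplification needed to distill a secure key, which is
the subject of the lecture.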
Quantum Computer Algorithms
Michele Mosca
Quantum computer algorithms are able to solve some problems more
efficiently than the best known classical algorithms. For some
"black-box" problems, the quantum improvement is provable.
Feynman's original idea (in the early 1980s) was to use quantum
computers to simulate quantum mechanical systems exponentially
more efficiently than the best known classical algorithms. Shor's
algorithms solve the integer factorization problem and the discrete
logarithm problem, which are at the core of the most widely used
public-key cryptosystems. Grover's quantum search algorithm solves
a black-box searching problem with quadratically fewer queries
than is possible with a classical algorithm. Many more quantum
algorithms and algorithmic tools have been developed since these
seminal results in the mid-1990s.
I will introduce some of the basic principles and tools behind
quantum algorithms, survey some of the main known algorithms,
and discuss future directions.
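As a small, hedged illustration of the black-box search speed-up,
here is a direct state-vector simulation of Grover's algorithm with
numpy (the problem size and marked item are arbitrary choices, not
taken from the lecture):

    import numpy as np

    n = 3                      # number of qubits
    N = 2 ** n                 # size of the search space
    marked = 5                 # index of the marked item (arbitrary)

    # Oracle: flips the phase of the marked basis state.
    oracle = np.eye(N)
    oracle[marked, marked] = -1

    # Diffusion operator: inversion about the uniform superposition |s>.
    s = np.full(N, 1 / np.sqrt(N))
    diffusion = 2 * np.outer(s, s) - np.eye(N)

    # Start in |s> and apply ~ (pi/4) * sqrt(N) Grover iterations.
    state = s.copy()
    for _ in range(int(np.pi / 4 * np.sqrt(N))):
        state = diffusion @ (oracle @ state)

    probs = np.abs(state) ** 2
    print(int(np.argmax(probs)), probs[marked])  # finds item 5 with probability ~0.95

Classically the same black box must be queried on the order of N
times in the worst case; Grover's algorithm uses O(sqrt(N)) queries,
which is the quadratic improvement referred to above.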
Implementing quantum information with
light
Barry C. Sanders
We will see how quantum information can be encoded into light:
through polarization, path, timing, frequency, or a combination
thereof. Quantum key distribution has been an experimental success
and provides an excellent example of photons as carriers of quantum
information, so we will study it in detail. In addition, we will
learn how a universal set of optical quantum gates can be
constructed, so that photonic quantum computers can be built.
Implementing quantum information in silicon
Barry C. Sanders
Of all the candidates for quantum information media, silicon
(Si) is the most appealing. Si-based quantum information processing
could exploit the enormous investment in Si chips made by the
computer industry. A Si-based quantum chip may be able to talk
easily to standard computer chips. Furthermore, quantum information
can be encoded into nuclear or electron spins of dopants (e.g.,
phosphorus, P) or as charge qubits whereby one excess electron
is shared between two dopants. We will study Kane's original proposal
for Si:P quantum computing, wherein quantum information is encoded
into the nuclear spin of P. Then we will study the sophisticated
Australian "update" of Kane's scheme. This scheme aims
to build a Si-based quantum computer wherein quantum information
is encoded into the spin of P's outermost electron.
Entanglement
Guifre Vidal
The basic concepts required to describe entanglement will be
introduced: the Schmidt decomposition, LOCC (local operations and
classical communication), entanglement monotones, measures of
entanglement, ...
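For reference, the first item on this list (standard definitions,
stated here for concreteness): any pure state of a bipartite system
AB can be written in the Schmidt form

    |\psi\rangle_{AB} = \sum_i \lambda_i\, |i\rangle_A |i\rangle_B , \qquad \lambda_i \ge 0 , \qquad \sum_i \lambda_i^2 = 1 ,

with orthonormal sets \{|i\rangle_A\} and \{|i\rangle_B\}. The state
is entangled precisely when more than one Schmidt coefficient
\lambda_i is nonzero, and the entropy of entanglement is
S = -\sum_i \lambda_i^2 \log \lambda_i^2, the simplest of the
entanglement measures the lectures will discuss.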
Entanglement in quantum many-body systems
Guifre Vidal
Using the concepts introduced in the previous lecture, I will
explain what makes entanglement in many-body systems (e.g. ground
states of typical local Hamiltonians) very special. Namely, many-body
systems are typically only weakly entangled. The content of this
lecture may serve as a basic introduction to the ideas discussed
by Matthew Hastings in his second and third Distinguished Lectures.
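The sense in which such states are "weakly entangled" can be made
quantitative; presumably the relevant statement is the area law
(given here as background, with the one-dimensional case due to
Hastings): for the ground state of a gapped, local Hamiltonian on a
one-dimensional chain, the entanglement entropy S(A) of a contiguous
block A is bounded by a constant independent of the block's size,
whereas for a generic (random) state S(A) grows in proportion to the
number of sites in A.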
Quantum Computational Complexity
John Watrous
In 1994 Peter Shor discovered efficient quantum computer algorithms
for factoring integers and computing discrete logarithms -- problems
that are conjectured not to be efficiently solvable using ordinary
classical computers. These algorithms are the most well-known
among several examples that suggest that quantum information has
a profound effect on the general notion of computational difficulty.
Quantum computational complexity studies this topic in a variety
of settings that model not only quantum computations, but also
the verification of quantum proofs, quantum interactions of various
types, and their relationships to analogous classical concepts.
In this talk, which will be divided into two parts, I will introduce
the basic notions of quantum computational complexity and survey
some of the main results in this area. The talk will focus on
three fundamental notions in quantum computational complexity:
polynomial-time quantum computations, the efficient verification
of quantum proofs, and quantum interactive proof systems.
Asymptotic theory of quantum communication
Jon Yard
I will lecture on the asymptotic theory of quantum communication.
The goal is to determine how much information can, in principle,
be encoded into a collection of noisy quantum systems so that
it can be retrieved with negligible error as the number of systems
grows. The noise is modeled by a completely positive, trace-preserving
linear map on density matrices, otherwise known as a quantum channel,
and we will be interested in the case where the same channel acts
independently on each of the systems. I will primarily focus on
the quantum capacity of a given channel, which measures the number
of qubits per transmission that can be reliably protected. Main
topics to be covered are:
1. Computing the quantum capacity. For certain classes of channels,
we know how to write down an easily computable, closed-form formula
for the quantum capacity. In other cases, the best we have is
an open-form "regularized" expression, which is too
unwieldy to be more than formally useful; a sketch of this
regularized expression appears after item 3 below. I will outline
what is known here and show that the quantum capacity also exhibits
some surprising behavior, such as being a non-additive function of
the channel.
2. I will outline a proof of the LSD (Lloyd, Shor, Devetak) coding
theorem, which shows that asymptotically good codes can be constructed
by selecting a random subspace of the channel inputs, provided
the communication rate is less than the quantum capacity. Two
subtopics that will need to be covered are:
2.1 Approximate error correction. The usual theory of quantum
error correction focuses on perfectly correcting some set (actually,
a subspace) of quantum errors. When the noise is modeled by a
quantum channel, one can speak of approximately correcting encoded
information in a meaningful and quantitative way.
2.2 The method of types. This is a basic tool from classical
information theory, statistics, and large-deviation theory that
aids the analysis of sequences of i.i.d. (independent and
identically distributed) random variables. It is indispensable for
studying channel capacities.
3. Time permitting, I will also discuss capacities for transmitting
classical and private information, and how they are related to
the quantum capacity.
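For reference, here is the regularized expression mentioned in item 1,
in standard notation (S denotes the von Neumann entropy; this is
background, not a summary of the lecture's treatment). The coherent
information of a channel \mathcal{N} at input \rho is

    I_c(\rho, \mathcal{N}) = S\big(\mathcal{N}(\rho)\big) - S\big((\mathcal{N} \otimes \mathrm{id})(|\psi_\rho\rangle\langle\psi_\rho|)\big) ,

where |\psi_\rho\rangle is a purification of \rho and the channel acts
on the purified system's first factor. The LSD theorem identifies the
quantum capacity with the regularized maximum

    Q(\mathcal{N}) = \lim_{n\to\infty} \frac{1}{n} \max_{\rho^{(n)}} I_c\big(\rho^{(n)}, \mathcal{N}^{\otimes n}\big) ,

and the limit over n is what makes the formula hard to evaluate in
general; for special classes such as degradable channels it collapses
to the single-letter (n = 1) expression.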