Quantum Computing
Quantum computing is the area of study focused on developing computer technology based on the principles of quantum theory, which explains the nature and behavior of energy and matter at the quantum (atomic and subatomic) level. The development of a practical quantum computer would mark a leap forward in computing capability far greater than that from the abacus to a modern-day supercomputer, with performance gains in the billion-fold realm and beyond. A quantum computer, following the laws of quantum physics, would gain enormous processing power through its ability to be in multiple states at once and to perform tasks using all possible permutations simultaneously. Current centers of research in quantum computing include MIT, IBM, Oxford University, and Los Alamos National Laboratory.
The essential elements of quantum computing originated with Paul Benioff, working at Argonne National Laboratory in 1981. He theorized a classical computer operating under some quantum-mechanical principles. But it is generally accepted that David Deutsch of Oxford University provided the critical impetus for quantum computing research. In 1984, at a computation theory conference, he began to wonder about the possibility of designing a computer based exclusively on quantum rules; he published his breakthrough paper a few months later. With this, the race to exploit his ideas began. Before delving into what he started, however, it is helpful to look at the background of the quantum world.
The Essential Elements of Quantum Theory:
- Energy, like matter, consists of discrete units rather than existing solely as a continuous wave.
- Elementary particles of both energy and matter, depending on the conditions, may behave like either particles or waves.
- The movement of elementary particles is inherently random and thus unpredictable.
- The simultaneous measurement of two complementary values, such as the position and momentum of an elementary particle, is inescapably flawed; the more precisely one value is measured, the more flawed will be the measurement of the other value.
Entanglement is a physical phenomenon that occurs when pairs or groups of particles are generated or interact in ways such that the quantum state of each particle cannot be described independently; instead, a quantum state may be given only for the system as a whole.
Measurements of physical properties such as position, momentum,
spin, polarization, etc. performed on entangled
particles are found to be appropriately correlated.
For example, if a pair of particles is generated in such a way that their total
spin is known to be zero, and one particle is found to have clockwise spin on a
certain axis, then the spin of the other particle, measured on the same axis,
will be found to be counterclockwise. Because of the nature of quantum
measurement, however, this behavior gives rise to effects that can appear paradoxical:
any measurement of a property of a particle can be seen as acting on that
particle (e.g. by collapsing a number of superimposed states); and in the case of
entangled particles, such action must be on the entangled system as a whole. It
thus appears that one particle of an entangled pair "knows" what
measurement has been performed on the other, and with what outcome, even though
there is no known means for such information to be communicated between the
particles, which at the time of measurement may be separated by arbitrarily
large distances.
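The spin-zero pair described above can be illustrated with a small state-vector simulation (a sketch in plain Python; the singlet state and the sampling routine here are illustrative, not tied to any quantum library). Every joint measurement of the pair yields opposite outcomes on the two particles:

```python
import random
from math import sqrt

# Singlet state over the two-qubit basis |00>, |01>, |10>, |11>:
# (|01> - |10>) / sqrt(2), which has total spin zero.
amps = [0.0, 1 / sqrt(2), -1 / sqrt(2), 0.0]

def measure_pair(amps):
    """Sample one joint measurement of both particles in the same basis.

    Born rule: each joint outcome occurs with probability |amplitude|^2.
    """
    probs = [abs(a) ** 2 for a in amps]
    outcome = random.choices(range(4), weights=probs)[0]
    return outcome >> 1, outcome & 1   # (particle A result, particle B result)

# Every shot gives anti-correlated results: if A is 0, B is 1, and vice versa.
shots = [measure_pair(amps) for _ in range(1000)]
assert all(a != b for a, b in shots)
```

Each particle's individual result is random (0 or 1 with probability 1/2), yet the pair is perfectly anti-correlated, which is exactly the correlation-without-communication behavior described above.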
Qubit
In quantum computing, a qubit or quantum bit is a unit of quantum information: the quantum analogue of the classical bit.
A qubit is a two-state
quantum-mechanical system,
such as the polarization of a single photon: here
the two states are vertical polarization and horizontal polarization. In
a classical system, a bit would have to be in one state or the other, but
quantum mechanics allows the qubit to be in a superposition of both states at the same time, a
property which is fundamental to quantum computing.
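Superposition can be made concrete with a tiny state-vector sketch (plain Python, no quantum framework assumed): a qubit is a pair of complex amplitudes for the basis states, and the squared magnitudes of the amplitudes give the measurement probabilities.

```python
from math import sqrt, isclose

# A qubit state is a pair of amplitudes (alpha, beta) for the basis
# states |0> and |1>, normalized so |alpha|^2 + |beta|^2 = 1.
zero = (1.0, 0.0)                 # classical-like: definitely |0>
plus = (1 / sqrt(2), 1 / sqrt(2)) # equal superposition of |0> and |1>

def probabilities(state):
    """Born rule: measurement probabilities are squared amplitude magnitudes."""
    alpha, beta = state
    return abs(alpha) ** 2, abs(beta) ** 2

p0, p1 = probabilities(plus)
assert isclose(p0, 0.5) and isclose(p1, 0.5)   # 50/50 outcome for |+>
assert isclose(sum(probabilities(zero)), 1.0)  # probabilities always sum to 1
```

A classical bit corresponds only to the states `zero` and `(0.0, 1.0)`; the continuum of normalized amplitude pairs in between is what the qubit adds.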
Quantum Gate
In quantum computing and specifically the quantum circuit
model of computation, a quantum gate
(or quantum logic gate) is a
basic quantum circuit operating on a small number of qubits. They
are the building blocks of quantum circuits, like classical logic gates
are for conventional digital circuits. Unlike many classical logic gates,
quantum logic gates are reversible. However, classical computing can be
performed using only reversible gates. For example, the reversible Toffoli gate
can implement all Boolean functions. This gate has a direct quantum equivalent,
showing that quantum circuits can perform all operations performed by classical
circuits. Quantum logic gates are represented by unitary matrices.
The most common quantum gates operate on spaces of one or two qubits, just like
the common classical logic gates operate on one or two bits. This means that as
matrices, quantum gates can be described by 2 × 2 or 4 × 4
unitary matrices.
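The unitary-matrix view can be sketched in a few lines of plain Python (illustrative only; no quantum framework assumed). The Hadamard gate is a 2 × 2 unitary that maps |0> to an equal superposition, and because it is unitary it is reversible: applying it twice returns the original state.

```python
from math import sqrt, isclose

# Hadamard gate as a 2x2 matrix (nested list of rows).
H = [[1 / sqrt(2),  1 / sqrt(2)],
     [1 / sqrt(2), -1 / sqrt(2)]]

def apply(gate, state):
    """Matrix-vector product: a gate acts linearly on the amplitudes."""
    return [sum(gate[r][c] * state[c] for c in range(len(state)))
            for r in range(len(gate))]

ket0 = [1.0, 0.0]          # |0>
plus = apply(H, ket0)      # H|0> = (|0> + |1>) / sqrt(2)
assert all(isclose(a, 1 / sqrt(2)) for a in plus)

# Reversibility: H is its own inverse, so H(H|0>) = |0>.
back = apply(H, plus)
assert isclose(back[0], 1.0)
assert isclose(back[1], 0.0, abs_tol=1e-12)
```

A two-qubit gate such as CNOT would be a 4 × 4 unitary acting on a four-amplitude state vector in the same way; an irreversible classical gate like AND has no such matrix, since it discards information about its inputs.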
Shor's algorithm
Shor's algorithm,
named after mathematician Peter
Shor, is a quantum
algorithm (an algorithm that runs on a quantum
computer) for integer factorization formulated in 1994. Informally it solves the following
problem: Given an integer N, find its prime
factors.
On a quantum computer, to factor an integer N, Shor's algorithm runs in polynomial
time (the time taken is polynomial in
log N, which is the size of the input).[1] Specifically, it takes time O((log N)^3), demonstrating that the integer
factorization problem can be efficiently solved on a quantum computer and is
thus in the complexity class
BQP.
This is substantially faster than the most efficient known classical factoring algorithm, the general number field sieve, which works in sub-exponential time, roughly O(e^(1.9 (log N)^(1/3) (log log N)^(2/3))). The efficiency of Shor's algorithm is due to the efficiency of the quantum Fourier transform and of modular exponentiation by repeated squaring.
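The classical scaffolding around the quantum step can be sketched for a toy modulus (a hedged sketch: the brute-force period finding below stands in for the quantum Fourier transform step, which is the only part that actually requires a quantum computer):

```python
from math import gcd

def find_order(a, N):
    """Smallest r > 0 with a^r = 1 (mod N).

    Brute force here; this is the period-finding step that the quantum
    Fourier transform performs efficiently on a quantum computer.
    """
    r, x = 1, a % N
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

def shor_classical_part(N, a):
    """Recover a nontrivial factor of N from the order r of a mod N."""
    assert gcd(a, N) == 1
    r = find_order(a, N)
    if r % 2 == 1:
        return None            # unlucky base: odd order, pick another a
    y = pow(a, r // 2, N)      # built-in modular exponentiation by repeated squaring
    if y == N - 1:
        return None            # a^(r/2) = -1 (mod N): yields only trivial factors
    return gcd(y - 1, N)

# Toy example: N = 15, a = 7. The order of 7 mod 15 is 4,
# 7^2 mod 15 = 4, and gcd(4 - 1, 15) = 3, a factor of 15.
assert shor_classical_part(15, 7) == 3
```

The three-argument `pow` is Python's built-in modular exponentiation by repeated squaring, the same classical subroutine the algorithm relies on.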
If a quantum computer with a sufficient number of qubits could operate without succumbing to noise and other quantum-decoherence phenomena, Shor's algorithm
could be used to break public-key cryptography schemes such as the widely used RSA scheme. RSA is based on the assumption that factoring large
numbers is computationally infeasible. So far as is known, this assumption is
valid for classical (non-quantum) computers; no classical algorithm is known
that can factor in polynomial time. However, Shor's algorithm shows that
factoring is efficient on an ideal quantum computer, so it may be feasible to
defeat RSA by constructing a large quantum computer. Shor's algorithm was also a powerful
motivator for the design and construction of quantum computers and for the
study of new quantum computer algorithms. It has also facilitated research on
new cryptosystems that are secure from quantum computers, collectively called post-quantum cryptography.
In 2001, Shor's algorithm was demonstrated by a group at IBM, who factored 15 into
3 × 5, using an NMR
implementation of a quantum computer with 7 qubits. However, some doubts have been raised as to whether IBM's
experiment was a true demonstration of quantum computation, since no entanglement
was observed. Since IBM's implementation, several other groups have implemented
Shor's algorithm using photonic qubits, emphasizing that entanglement was
observed. In 2012, the factorization of 15 was repeated. Also in 2012, the
factorization of 21 was achieved, setting the record for the largest number
factored with a quantum computer. In April 2012, the factorization of 143 was
achieved, although this used adiabatic quantum computation rather than Shor's algorithm.
Reference:
http://en.wikipedia.org/wiki/Quantum_gate