11. – 15. Mai 1998, Dagstuhl Seminar 98191
T. Beth (Karlsruhe), G. Brassard (Montreal)
Quantum algorithms – a new topic in both computer science and physics – have become a central theme of one of the most challenging areas of interdisciplinary research in modern science. This seemingly esoteric area, stimulated by Feynman in 1980 and by Benioff, who in 1982 was the first to suggest quantum-mechanical evolution for computation, became a serious challenge after Shor’s 1994 publication of “Algorithms for quantum computation: Discrete logarithms and factoring”. This breakthrough in theoretical computer science has stimulated a new field of physics, especially on the experimental side, namely the investigation of controlled quantum systems, which is motivated by the promise that algorithmic processing of quantum information may provide an exponential gain in speed and space over classical computers. As a consequence of this joint research, surprising progress has been made in recent years towards new algorithms and, in particular, complexity bounds for certain problems. To date, however, no exact theorems are available on the relation between quantum and classical complexity classes.
Quantum algorithms rely on three effects:
If several qubits are combined into a quantum register, the laws of quantum mechanics allow the state space of such a processor to hold exponentially many data in a superposition of entangled states. Owing to the linearity of quantum mechanics, each operation of so-called quantum gates acts simultaneously on all states that have non-zero amplitude in the superposition. This phenomenon is the basis of quantum parallelism, which leads to a completely new model of computation. A classical probabilistic algorithm is well described by the tree of all possible computations, weighted with the respective probabilities; the probabilities of all accepting paths are added to give the total probability of a successful computation. The use of quantum states, based on complex amplitudes instead of probabilistic weights, by contrast allows amplitudes to be enhanced or cancelled. This, in a nutshell, is the essential advantage of quantum algorithms: each desirable computational path can be designed to “absorb” probability amplitude at the expense of the amplitudes of other paths. This principle is known in physics as constructive and destructive interference.
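The interplay of amplitudes just described can be illustrated with a minimal pure-Python sketch (a hypothetical toy simulator, not any particular quantum library): a single-qubit state is a pair of complex amplitudes, and applying the Hadamard gate twice shows the two computational paths leading to |1⟩ cancelling destructively while the paths to |0⟩ add constructively.

```python
import math

# A 1-qubit state is a pair of complex amplitudes [a0, a1]
# for the basis states |0> and |1>.
H = 1 / math.sqrt(2)

def hadamard(state):
    """Apply the Hadamard gate: mixes the two amplitudes with signs +/-."""
    a0, a1 = state
    return [H * (a0 + a1), H * (a0 - a1)]

def probabilities(state):
    """Born rule: measurement probabilities are squared amplitude magnitudes."""
    return [abs(a) ** 2 for a in state]

# Start in |0>.
state = [1 + 0j, 0 + 0j]

# One Hadamard: equal superposition, both outcomes have probability 1/2.
once = hadamard(state)

# A second Hadamard: the two paths ending in |1> carry amplitudes
# +1/2 and -1/2 and cancel (destructive interference), while the
# paths ending in |0> add constructively back to amplitude 1.
twice = hadamard(once)

print(probabilities(once))   # -> [0.5, 0.5] (approximately)
print(probabilities(twice))  # -> [1.0, 0.0] (approximately)
```

A classical probabilistic process with the same branching structure would end in a 50/50 mixture after two steps; the signed amplitudes are what make the second Hadamard undo the first.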
The Dagstuhl Seminar 98191 brought together scientists from computer science, mathematics, theoretical physics, and experimental physics to discuss the most recent developments of this very new and possibly revolutionary concept of modern computing. In contrast to most “classical” computer science conferences, the unusual character of this seminar lies in the fact that, at least at the present state of the field, the theoretical aspects of algorithm design cannot be treated as device-independent. In other words, unlike classical approaches to algorithms and their applications, where the actual physical realization at the bit level is not taken into account, in quantum algorithms the physical realization is intrinsically connected with the design of the algorithms. Everyone working in the area therefore has to be reasonably well acquainted with both the computer science and the physics side, where the modeling of both on the basis of quantum theory rests on rather strong mathematical foundations such as Hilbert space theory, group theory, combinatorics, information theory, coding theory, and signal processing.