April 27 – May 2, 2003, Dagstuhl Seminar 03181
Centennial Seminar on Kolmogorov Complexity and Applications
Algorithmic information theory (Kolmogorov complexity theory) measures the amount of information in a given finite object (bit string, file, message, etc.) and formalizes the distinction between highly compressible objects that contain little information (regular objects) and incompressible objects with high information content (random objects). This idea was put forward in the 1960s by several researchers, including the famous mathematician Andrei Nikolaevich Kolmogorov, and led to fruitful developments. The seminar, celebrating the 100th anniversary of Kolmogorov's birth, aimed to gather the most active people in the field, including some disciples of Kolmogorov, for discussion.
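The regular-versus-random distinction can be illustrated with a real compressor. The sketch below (an illustration only, not part of the seminar material) uses the output length of zlib as a crude, computable upper bound on Kolmogorov complexity: a regular string shrinks dramatically, while random bytes barely shrink at all.

```python
import os
import zlib

def compressed_size(data: bytes) -> int:
    """Length of the zlib-compressed data: a crude, computable
    upper-bound proxy for Kolmogorov complexity. (True Kolmogorov
    complexity is uncomputable; any real compressor only bounds it
    from above.)"""
    return len(zlib.compress(data, 9))

# A highly regular object: little information, very compressible.
regular = b"ab" * 5000            # 10,000 bytes of a repeating pattern

# An incompressible (random) object: high information content.
random_bytes = os.urandom(10000)  # 10,000 random bytes

print(compressed_size(regular))       # tiny compared to 10,000
print(compressed_size(random_bytes))  # close to (or above) 10,000
```

The asymmetry is the whole point: almost all strings of a given length are incompressible, but the strings we usually care about are the rare regular ones.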
Several active fields of research were covered in the talks:
Relations between computational complexity and descriptional complexity. The idea of taking into account the computation time (needed for decompression) was already clear in the 1960s. However, only recently has this connection become better understood, and interesting relations between complexity classes and time-bounded random (incompressible) objects have been found. This development can also be seen as establishing connections between different notions of randomness (randomness in algorithmic information theory, pseudo-random number generators, etc.).
Starting with the classical work of Martin-Löf, the notion of algorithmic randomness has been closely related to measure theory. Recently it was noted that the classical notion of Hausdorff dimension (and similar notions) can be naturally translated into algorithmic information theory using the martingale technique and related tools.
The first Kolmogorov paper on the subject was called "Three Approaches to the Definition of the Notion of 'Amount of Information'", and these approaches were named `combinatorial', `probabilistic' and `algorithmic'. Recently, formal links between these three approaches have been found that allow us to translate some results of algorithmic information theory into combinatorial results and statements about Shannon entropy.
Last but not least, there has been recent progress in clarifying the distinction between "accidental" information (random noise) and "meaningful" information, and in how to separate the two. This is a central problem of statistics and model selection.
Algorithmic information theory belongs to theoretical computer science and does not claim to be immediately applicable in practice (for example, there is no algorithm to compute the Kolmogorov complexity of a given string). However, its ideas serve as inspiration for quite practical applications in learning theory, pattern recognition, etc., showing that deep theoretical research becomes useful unexpectedly often.
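One well-known example of such inspiration is the normalized compression distance (NCD), which approximates the uncomputable information distance between two objects by substituting a real compressor for Kolmogorov complexity. A minimal sketch, using zlib (the choice of compressor and the test strings are illustrative assumptions, not from the seminar):

```python
import zlib

def ncd(x: bytes, y: bytes) -> float:
    """Normalized compression distance: approximates the (uncomputable)
    normalized information distance by replacing Kolmogorov complexity
    with the output length of a real compressor. Values near 0 indicate
    similar objects; values near 1 indicate unrelated objects."""
    cx = len(zlib.compress(x))
    cy = len(zlib.compress(y))
    cxy = len(zlib.compress(x + y))
    return (cxy - min(cx, cy)) / max(cx, cy)

a = b"the quick brown fox jumps over the lazy dog " * 50
b = b"the quick brown fox jumps over the lazy dog " * 50
c = bytes(range(256)) * 10  # a very different kind of regularity

print(ncd(a, b))  # small: the compressor exploits the shared structure
print(ncd(a, c))  # larger: concatenation yields little extra compression
```

The design choice here mirrors the theory: a compressor can only over-estimate complexity, so NCD is a computable stand-in whose quality depends entirely on how well the compressor captures the regularities shared by the two objects.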
A comprehensive as well as introductory treatment of Kolmogorov complexity is the monograph by Ming Li and Paul Vitanyi, "An Introduction to Kolmogorov Complexity and Its Applications", Springer-Verlag, New York, 2nd edition, 1997.
Related Dagstuhl Seminar
- 06051: "Kolmogorov Complexity and Applications" (2006)
- Kolmogorov complexity
- Information theory
- Computational complexity