23. – 28. September 2007, Dagstuhl Seminar 07391
Probabilistic Methods in the Design and Analysis of Algorithms
It is difficult to overstate the importance of probabilistic methods in Theoretical Computer Science. They are among the most powerful and widely used tools: in designing efficient randomized algorithms for tackling hard optimization problems; in establishing lower bounds in complexity theory; in proving many useful discrete properties in extremal combinatorics; in providing frameworks such as average-case and smoothed analysis for measuring the performance of algorithms; and in the theory of interactive proofs. The body of work using probabilistic methods has grown impressively in recent years, and the following topics have attracted enormous attention from theorists and practitioners alike.
In the area of randomized algorithms, several new probabilistic techniques have been developed. For example, there are several exciting recent developments in the probabilistic embedding of general metrics into tree metrics. Because various optimization problems can be solved optimally on trees (e.g., by dynamic programming), high-quality approximations of arbitrary metrics by tree metrics provide a systematic approach to designing approximation algorithms for general metrics. Further, new techniques for designing randomized data structures have been developed that draw on methods from the theory of random graphs and of random walks in graphs. A core issue here is the efficient simulation of a high degree of randomness without assuming that the inputs themselves are random.
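To illustrate why tree metrics are so algorithmically convenient, the following sketch solves a classic optimization problem, maximum-weight independent set, optimally on a tree by dynamic programming. The instance and function names are hypothetical, chosen only for illustration:

```python
# Maximum-weight independent set, solved optimally on a tree by dynamic
# programming: each vertex is processed once, combining the optimal
# sub-solutions of its children. (A toy illustration, not from the seminar.)

def max_weight_independent_set(tree, weights, root=0):
    """tree: adjacency list {v: [neighbors...]}, weights: {v: weight}."""
    def solve(v, parent):
        # Returns (best value with v excluded, best value with v included).
        exclude, include = 0, weights[v]
        for u in tree.get(v, []):
            if u == parent:
                continue
            ex_u, in_u = solve(u, v)
            exclude += max(ex_u, in_u)   # child may be in or out
            include += ex_u              # child must be excluded
        return exclude, include
    return max(solve(root, None))

# A small path-shaped tree 0 - 1 - 2 - 3 with weights 3, 2, 1, 4.
tree = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
weights = {0: 3, 1: 2, 2: 1, 3: 4}
print(max_weight_independent_set(tree, weights))  # picks {0, 3}: prints 7
```

Once an arbitrary metric is (probabilistically) approximated by a tree metric, this kind of exact tree algorithm becomes the workhorse inside an approximation algorithm for the general metric.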
Probabilistic methods have also played an active role in developing analysis frameworks that provide "practical enough" performance measures while still admitting rigorous analysis. For example, the recently developed smoothed analysis uses small random perturbations to define performance measures. This framework applies to algorithms whose inputs are subject to slight random noise. The smoothed complexity of an algorithm is the maximum, over its inputs, of the expected running time of the algorithm under slight perturbations of that input; it is measured in terms of the input size and the magnitude of the perturbation.
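The inner expectation in this definition can be estimated empirically. The sketch below (a hypothetical illustration, not an analysis from the seminar) counts the comparisons made by quicksort with a first-element pivot, whose worst case is a sorted input, and then averages the count over Gaussian perturbations of that adversarial input:

```python
import random

def quicksort_comparisons(a):
    """Comparison count of quicksort with a first-element pivot."""
    if len(a) <= 1:
        return 0
    pivot = a[0]
    left = [x for x in a[1:] if x < pivot]
    right = [x for x in a[1:] if x >= pivot]
    return len(a) - 1 + quicksort_comparisons(left) + quicksort_comparisons(right)

def smoothed_cost(a, sigma, trials=200):
    """Empirical estimate of the expected cost under Gaussian perturbations."""
    total = 0
    for _ in range(trials):
        perturbed = [x + random.gauss(0, sigma) for x in a]
        total += quicksort_comparisons(perturbed)
    return total / trials

n = 200
adversarial = list(range(n))               # sorted input: Theta(n^2) comparisons
print(quicksort_comparisons(adversarial))  # n(n-1)/2 = 19900
print(smoothed_cost(adversarial, sigma=n)) # far fewer: noise destroys the order
```

Smoothed analysis then takes the maximum of such expectations over all inputs, as a function of the input size and the perturbation magnitude sigma.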
Another area in which random inputs play an important role is stochastic optimization. Here uncertainty in the data is modeled by probability distributions. Stochastic optimization has a wide range of applications in various areas, including logistics, transportation, financial instruments, and network design. In recent years, there has been significant progress in analyzing important algorithms and heuristics used in this field. For example, the sample average approximation (SAA) method solves stochastic programs by sampling from the distribution of input scenarios. Recent theoretical results show that the SAA method has the properties of a fully randomized approximation scheme for a large class of multistage stochastic optimization problems.
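The SAA idea can be shown on a toy newsvendor problem: choose an order quantity q to maximize the expected profit price * min(q, D) - cost * q under random demand D. SAA replaces the true expectation by an average over sampled demand scenarios and optimizes that deterministic objective. All parameters and names below are hypothetical, chosen only for illustration:

```python
import random

def saa_newsvendor(samples, price=5.0, cost=3.0):
    """Sample average approximation: maximize the sampled average profit."""
    def avg_profit(q):
        return sum(price * min(q, d) - cost * q for d in samples) / len(samples)
    # The sampled objective is concave and piecewise linear with breakpoints
    # at the samples, so an optimum is attained at one of the sample values.
    return max(samples, key=avg_profit)

random.seed(0)
demand_samples = [random.gauss(100, 20) for _ in range(500)]
q_hat = saa_newsvendor(demand_samples)
# The classical analytic optimum is the (price - cost)/price = 0.4 quantile
# of the demand distribution, here roughly 95; q_hat should land close to it.
print(round(q_hat))
```

The theoretical results mentioned above quantify how many such samples suffice for the SAA optimum to be near-optimal for the true stochastic program, even in multistage settings.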
The workshop covered recent progress in randomized algorithms and in probabilistic performance measures for algorithms, including smoothed analysis, average-case analysis, semi-random analysis, and stochastic optimization. The presentations covered a wide range of optimization problems, such as linear programming, integer programming, random games, computational geometry, and scheduling. The most important contribution of the seminar was the exchange of new ideas between researchers using probabilistic methods in different contexts. In addition to providing an opportunity for information sharing and collaboration, the workshop exposed young researchers, students, and postdocs to recent developments and outstanding issues in probabilistic methods.
Related Dagstuhl Seminar
- Algorithms/Probabilistic Methods/Analysis/Complexity