April 2–7, 2017, Dagstuhl Seminar 17141

Probabilistic Methods in the Design and Analysis of Algorithms


Bodo Manthey (University of Twente, NL)
Claire Mathieu (ENS – Paris, FR)
Heiko Röglin (Universität Bonn, DE)
Eli Upfal (Brown University – Providence, US)



Dagstuhl Report, Volume 7, Issue 4


Probabilistic methods play a central role in theoretical computer science. They are a powerful and widely applied tool, used, for example, to design efficient randomized algorithms and to establish various lower bounds in complexity theory. They also form the basis of frameworks such as average-case and smoothed analysis, in which algorithms are studied beyond the classical worst-case perspective. This seminar was devoted to probabilistic methods, with a focus on the design and analysis of algorithms.

Probabilistic methods are often used in algorithm analysis when worst-case analysis does not provide useful or empirically accurate results. For example, worst-case analysis suggests that the simplex method is an exponential-time algorithm for linear programming, while in fact it runs in near-linear time on almost all inputs of interest. For the simplex method, as for many other algorithms, worst-case inputs are often rather contrived and hardly ever occur in practical applications. The last decade has seen much interest in the development of a more realistic and robust algorithmic theory that is not based entirely on worst-case performance. One very successful line of research studies the performance of algorithms on inputs that are to some extent random. Besides average-case analysis, in which inputs are generated randomly according to some fixed distribution, more sophisticated semi-random input models have also gained momentum.
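
The gap between worst-case and average-case behavior can be seen in a small experiment (a hypothetical sketch, not taken from the report): quicksort with a deterministic first-element pivot needs Θ(n²) comparisons on an already sorted input, but only Θ(n log n) comparisons in expectation on a uniformly random permutation.

```python
import random

def quicksort_comparisons(a):
    """Comparisons made by quicksort that always pivots on the first element."""
    if len(a) <= 1:
        return 0
    pivot, rest = a[0], a[1:]
    less = [x for x in rest if x < pivot]
    geq = [x for x in rest if x >= pivot]
    return len(rest) + quicksort_comparisons(less) + quicksort_comparisons(geq)

n = 300
# Worst case: a sorted input forces maximally unbalanced splits,
# giving exactly n(n-1)/2 comparisons.
worst_count = quicksort_comparisons(list(range(n)))

# Average case: a uniformly random permutation needs about 2 n ln n
# comparisons in expectation.
rng = random.Random(0)  # fixed seed for reproducibility
perm = list(range(n))
rng.shuffle(perm)
random_count = quicksort_comparisons(perm)

print(worst_count, random_count)
```

On a sorted input of length 300 this performs 300·299/2 = 44,850 comparisons, while on a random permutation it typically needs only a few thousand.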

Another area in which probabilistic methods play a central role is stochastic optimization. Here uncertainty in the data is modeled by probability distributions and the actual data is only revealed over time. For example, in a scheduling problem one might know the probability distribution of a job's length but one learns its actual length only by executing it.
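
As a toy illustration of the scheduling example (a hypothetical sketch, not from the report), the following simulation draws job lengths from exponential distributions and compares two fixed orders on a single machine. Running jobs in order of increasing expected length (the shortest-expected-processing-time-first, or SEPT, rule) minimizes the expected sum of completion times.

```python
import random

def total_completion_time(lengths):
    """Sum of job completion times when jobs run back to back in the given order."""
    finish, total = 0.0, 0.0
    for length in lengths:
        finish += length
        total += finish
    return total

def average_cost(means, order, trials=2000, seed=1):
    """Average total completion time over random instances: job j has an
    exponentially distributed length with expectation means[j]."""
    rng = random.Random(seed)
    acc = 0.0
    for _ in range(trials):
        acc += total_completion_time(
            [rng.expovariate(1.0 / means[j]) for j in order])
    return acc / trials

means = [1.0, 4.0, 2.0, 8.0]  # hypothetical expected job lengths
sept = sorted(range(len(means)), key=means.__getitem__)  # shortest expected first
lept = sept[::-1]                                        # longest expected first

print(average_cost(means, sept), average_cost(means, lept))
```

For a fixed order the expected cost is Σₖ (n−k+1)·E[pₖ], so the SEPT order costs 26 in expectation here, versus 49 for the reverse order; the simulation reproduces this gap.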

Probabilistic methods are also central in algorithm design. For many optimization problems, the most efficient known algorithms rely essentially on randomization. In other areas, like sublinear algorithms and hashing, one can even prove that randomization is necessary to obtain good algorithms.
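
A classical example of randomization in algorithm design is Freivalds' algorithm, which checks a claimed matrix product A·B = C in O(n²) time per trial instead of recomputing the product. The sketch below (matrix values are illustrative) follows the standard scheme: multiply both sides by a random 0/1 vector and compare.

```python
import random

def freivalds(A, B, C, trials=20, seed=0):
    """Probabilistic check that A @ B == C for n x n integer matrices.
    Each trial multiplies both sides by a random 0/1 vector in O(n^2) time;
    if C is wrong, a single trial detects it with probability >= 1/2."""
    rng = random.Random(seed)
    n = len(A)
    for _ in range(trials):
        r = [rng.randrange(2) for _ in range(n)]
        Br = [sum(B[i][j] * r[j] for j in range(n)) for i in range(n)]   # B r
        ABr = [sum(A[i][j] * Br[j] for j in range(n)) for i in range(n)] # A (B r)
        Cr = [sum(C[i][j] * r[j] for j in range(n)) for i in range(n)]   # C r
        if ABr != Cr:
            return False  # definitely A @ B != C
    return True  # correct with probability >= 1 - 2**(-trials)

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
C = [[19, 22], [43, 50]]  # correct product
D = [[19, 22], [43, 51]]  # wrong in one entry
print(freivalds(A, B, C), freivalds(A, B, D))
```

A single trial errs with probability at most 1/2 when C is wrong, so 20 independent trials push the error probability below 10⁻⁶.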

The seminar covered recent progress on probabilistic methods in the design and analysis of algorithms.

Summary text license
  Creative Commons BY 3.0 Unported license
  Bodo Manthey, Claire Mathieu, Heiko Röglin, and Eli Upfal

Classification

  • Data Structures / Algorithms / Complexity

Keywords

  • Analysis of algorithms
  • Randomized algorithms
  • Sub-linear algorithms
  • Average-case analysis
  • Smoothed analysis
  • Random graphs

