
Dagstuhl Seminar 17141

Probabilistic Methods in the Design and Analysis of Algorithms

(Apr 02 – Apr 07, 2017)


Permalink
Please use the following short URL to reference this page: https://www.dagstuhl.de/17141

Motivation

Probabilistic methods play a central role in theoretical computer science. They are a powerful and widely applied tool, used, for example, for designing efficient randomized algorithms and for establishing various lower bounds in complexity theory. They also form the basis of frameworks such as average-case and smoothed analysis, in which algorithms are analyzed beyond the classical worst-case perspective. The goal of this seminar is to cover recent progress on probabilistic methods and to bring together researchers from the various areas of algorithmics in which these methods are applied.
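As an illustration (not taken from the seminar material), the following sketch shows a classic randomized algorithm: quickselect with a uniformly random pivot, which finds the k-th smallest element in expected linear time, so that no fixed input can reliably force the quadratic worst case.

```python
import random

def quickselect(items, k):
    """Return the k-th smallest element (0-indexed) in expected O(n) time.

    The pivot is chosen uniformly at random, so no fixed input can force
    the quadratic worst case except with vanishing probability.
    """
    pivot = random.choice(items)
    smaller = [x for x in items if x < pivot]
    equal_count = items.count(pivot)
    if k < len(smaller):
        return quickselect(smaller, k)
    if k < len(smaller) + equal_count:
        return pivot
    # Recurse on the larger side, shifting k past the discarded elements.
    return quickselect([x for x in items if x > pivot],
                       k - len(smaller) - equal_count)

print(quickselect([9, 1, 8, 2, 7, 3, 6, 4, 5], 4))  # median -> 5
```

The answer is always correct; only the running time is random (a Las Vegas algorithm).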

Probabilistic methods are often used in algorithm analysis when worst-case analysis does not provide useful or empirically accurate results. For example, worst-case analysis suggests that the simplex method is an exponential-time algorithm for linear programming, while in fact it runs in near-linear time on almost all inputs of interest. For the simplex method and many other algorithms, worst-case inputs are often rather contrived and hardly ever occur in practical applications. The last decade has seen much interest in the development of a more realistic and robust algorithmic theory that is not entirely based on worst-case performance. One very successful line of research studies the performance of algorithms on inputs that are to some extent random. Besides average-case analysis, in which inputs are generated randomly according to some fixed distribution, more sophisticated semi-random models have also gained momentum.
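The gap between worst-case and random behavior can be made concrete with a small experiment (an illustrative sketch, not from the seminar): quicksort with a fixed first-element pivot performs about n²/2 comparisons on an already sorted input, while choosing the pivot at random brings this down to O(n log n) on the same input. The counting model here (one comparison per element per partitioning step) is a simplification chosen for the sketch.

```python
import random

def quicksort_comparisons(a, randomized, rng=None):
    """Count comparisons made by quicksort, charging one comparison per
    element of each partitioned segment (a standard simplification).

    An explicit stack is used so the adversarial sorted input cannot
    exceed Python's recursion limit.
    """
    comparisons = 0
    stack = [list(a)]
    while stack:
        seg = stack.pop()
        if len(seg) <= 1:
            continue
        pivot = rng.choice(seg) if randomized else seg[0]
        comparisons += len(seg)
        stack.append([x for x in seg if x < pivot])
        stack.append([x for x in seg if x > pivot])
    return comparisons

n = 400
worst = quicksort_comparisons(list(range(n)), randomized=False)
rand = quicksort_comparisons(list(range(n)), randomized=True, rng=random.Random(0))
print(worst, rand)  # roughly n^2/2 versus roughly 2 n ln n
```

On the sorted input, the fixed-pivot variant always peels off the minimum, so every partitioning step degenerates; a random pivot splits the segment near the middle on average.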

Another area in which probabilistic methods play a central role is stochastic optimization. Here, uncertainty in the data is modeled by probability distributions, and the actual data is revealed only over time. For example, in a scheduling problem one might know the probability distribution of a job's length, but one learns its actual length only by executing the job.
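The scheduling example can be simulated directly. The sketch below is an illustration with assumed exponentially distributed job lengths (the model and names are mine, not the seminar's): it compares running jobs in order of shortest expected processing time (SEPT) against the reverse order on a single machine. The scheduler sees only the means; the realized lengths are revealed by running the jobs.

```python
import random

def simulate(order, means, rng):
    """Total completion time of the jobs when run in the given order on one machine."""
    t, total = 0.0, 0.0
    for j in order:
        # The realized length is drawn now, i.e., revealed only by executing the job.
        t += rng.expovariate(1.0 / means[j])
        total += t
    return total

rng = random.Random(42)
means = [1.0, 2.0, 4.0, 8.0, 16.0]  # expected job lengths, known in advance
sept = sorted(range(len(means)), key=lambda j: means[j])  # shortest expected first
lept = list(reversed(sept))                               # longest expected first

trials = 2000
avg_sept = sum(simulate(sept, means, rng) for _ in range(trials)) / trials
avg_lept = sum(simulate(lept, means, rng) for _ in range(trials)) / trials
print(avg_sept, avg_lept)  # SEPT comes out clearly smaller
```

For a single machine, ordering by shortest expected processing time minimizes the expected sum of completion times, which the simulated averages reflect.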

Probabilistic methods are also central in algorithm design. For many optimization problems, the most efficient known algorithms rely essentially on randomization. In other areas, such as sublinear algorithms and hashing, one can even prove that randomization is necessary to obtain good algorithms.
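A standard example of randomization buying efficiency is Freivalds' algorithm: verifying a claimed matrix product C = AB in O(n²) time per round by multiplying with random 0/1 vectors, instead of recomputing the product in super-quadratic time. The sketch below is a minimal illustration, not code from the seminar.

```python
import random

def freivalds_check(A, B, C, rng, rounds=20):
    """Randomized check that A @ B == C in O(rounds * n^2) time.

    Each round draws a random 0/1 vector r and compares A(Br) with Cr;
    a wrong product is caught with probability >= 1/2 per round.
    """
    n = len(A)
    for _ in range(rounds):
        r = [rng.randrange(2) for _ in range(n)]
        Br = [sum(B[i][j] * r[j] for j in range(n)) for i in range(n)]
        ABr = [sum(A[i][j] * Br[j] for j in range(n)) for i in range(n)]
        Cr = [sum(C[i][j] * r[j] for j in range(n)) for i in range(n)]
        if ABr != Cr:
            return False  # definitely A @ B != C
    return True  # correct with probability >= 1 - 2**(-rounds)

rng = random.Random(1)
A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
C_good = [[19, 22], [43, 50]]
C_bad = [[19, 22], [43, 51]]
print(freivalds_check(A, B, C_good, rng), freivalds_check(A, B, C_bad, rng))
```

A "False" answer is always correct; a "True" answer is wrong only with probability at most 2^(-rounds), a typical one-sided Monte Carlo guarantee.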

Copyright Bodo Manthey, Claire Mathieu, Heiko Röglin, and Eli Upfal

Summary

Probabilistic methods play a central role in theoretical computer science. They are a powerful and widely applied tool, used, for example, for designing efficient randomized algorithms and for establishing various lower bounds in complexity theory. They also form the basis of frameworks such as average-case and smoothed analysis, in which algorithms are analyzed beyond the classical worst-case perspective. The seminar was devoted to probabilistic methods, with a focus on the design and analysis of algorithms.

The seminar covered recent progress on probabilistic methods in all of these areas.

Copyright Bodo Manthey, Claire Mathieu, Heiko Röglin, and Eli Upfal

Participants
  • Dimitris Achlioptas (University of California - Santa Cruz, US) [dblp]
  • Aris Anagnostopoulos (Sapienza University of Rome, IT) [dblp]
  • Luca Becchetti (Sapienza University of Rome, IT) [dblp]
  • Petra Berenbrink (Universität Hamburg, DE) [dblp]
  • Amin Coja-Oghlan (Goethe-Universität - Frankfurt am Main, DE) [dblp]
  • Artur Czumaj (University of Warwick - Coventry, GB) [dblp]
  • Daniel Dadush (CWI - Amsterdam, NL) [dblp]
  • Martin Dietzfelbinger (TU Ilmenau, DE) [dblp]
  • Benjamin Doerr (Ecole Polytechnique - Palaiseau, FR) [dblp]
  • Devdatt Dubhashi (Chalmers UT - Göteborg, SE) [dblp]
  • Matthias Englert (University of Warwick - Coventry, GB) [dblp]
  • Leslie Ann Goldberg (University of Oxford, GB) [dblp]
  • Paul W. Goldberg (University of Oxford, GB) [dblp]
  • Ross Kang (Radboud University Nijmegen, NL) [dblp]
  • Thomas Kesselheim (TU Dortmund, DE) [dblp]
  • Stefan Klootwijk (University of Twente, NL) [dblp]
  • Robert Krauthgamer (Weizmann Institute - Rehovot, IL) [dblp]
  • Marvin Künnemann (MPI für Informatik - Saarbrücken, DE) [dblp]
  • Stefano Leonardi (Sapienza University of Rome, IT) [dblp]
  • Frederik Mallmann-Trenn (ENS - Paris, FR) [dblp]
  • Bodo Manthey (University of Twente, NL) [dblp]
  • Claire Mathieu (ENS - Paris, FR) [dblp]
  • Nicole Megow (Universität Bremen, DE) [dblp]
  • Morteza Monemizadeh (Charles University - Prague, CZ) [dblp]
  • Seffi Naor (Technion - Haifa, IL) [dblp]
  • Alessandro Panconesi (Sapienza University of Rome, IT) [dblp]
  • Andrea Pietracaprina (University of Padova, IT) [dblp]
  • Kirk Pruhs (University of Pittsburgh, US) [dblp]
  • Geppino Pucci (University of Padova, IT) [dblp]
  • Matteo Riondato (Two Sigma Investments LP - New York, US) [dblp]
  • Heiko Röglin (Universität Bonn, DE) [dblp]
  • Thomas Sauerwald (University of Cambridge, GB) [dblp]
  • Melanie Schmidt (Universität Bonn, DE) [dblp]
  • Christian Sohler (TU Dortmund, DE) [dblp]
  • Aravind Srinivasan (University of Maryland - College Park, US) [dblp]
  • Marc Uetz (University of Twente, NL) [dblp]
  • Eli Upfal (Brown University - Providence, US) [dblp]
  • Tjark Vredeveld (Maastricht University, NL) [dblp]
  • Philipp Woelfel (University of Calgary, CA) [dblp]

Related Seminars
  • Dagstuhl Seminar 07391: Probabilistic Methods in the Design and Analysis of Algorithms (2007-09-23 - 2007-09-28)

Classification
  • data structures / algorithms / complexity

Keywords
  • analysis of algorithms
  • randomized algorithms
  • sub-linear algorithms
  • average-case analysis
  • smoothed analysis
  • random graphs