

Dagstuhl Seminar 24271

Theory of Randomized Optimization Heuristics

(June 30 – July 5, 2024)

Permalink
Please use the following short URL to link to this page: https://www.dagstuhl.de/24271

Organizers

Contact

Motivation

Randomized optimization heuristics (ROHs) are general-purpose solvers that are frequently applied for discrete or continuous optimization when other classes of solvers fail or are not applicable. Typical examples of ROHs include genetic algorithms, evolution strategies, and estimation-of-distribution algorithms (EDAs). Since ROHs are not designed with a proof or analysis in mind, their theoretical analysis is a particularly challenging endeavor. Yet, their practical success, extremely wide applicability, and growing relevance in many domains in industry and academic research push for progress in their understanding. This Dagstuhl Seminar is part of a very successful Dagstuhl Seminar series with the goal of driving this research area forward.

Over the span of more than two decades, the theoretical analysis of ROHs has made significant progress, going from simple settings to ever more complex scenarios. This advance allows us to tackle highly relevant topics that seemed out of reach before. In this seminar, we will focus on three such topics for which our understanding is still limited, namely (1) constraint handling, (2) multivariate EDAs, and (3) stochastic approximation and Markov chain stability analysis. In order to approach these problems from multiple angles, we aim to bring together top researchers from inside the ROH community and from related fields. This should result in a vibrant mixture of different expertise, allowing each participant to learn about new topics, to participate in insightful discussions, and to potentially start promising collaborations.

Constraint handling is a classic topic in optimization with a theory-practice gap: Many real-world problems have constraints, while many algorithms and nearly all existing theoretical analyses are limited to unconstrained search spaces. In ROHs, the main route to optimization in the presence of constraints is to equip proven algorithms with additional constraint handling mechanisms. This is analogous to turning a gradient-based optimization algorithm into an interior-point solver by means of the barrier method. However, other communities like gradient-based and derivative-free optimization have different approaches to constraint handling.
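One common constraint handling mechanism of the kind described above is a penalty function. The following minimal sketch (an illustration, not taken from the seminar materials) equips a simple elitist (1+1)-style hill climber with a quadratic penalty for violating a constraint g(x) ≤ 0; all function names and parameter values are illustrative assumptions.

```python
import random

def penalized(f, g, x, mu=100.0):
    """Objective plus a quadratic penalty for violating g(x) <= 0."""
    violation = max(0.0, g(x))
    return f(x) + mu * violation ** 2

def one_plus_one_penalty(f, g, x, sigma=0.3, iters=2000, seed=1):
    """(1+1)-style hill climber minimizing the penalized objective."""
    rng = random.Random(seed)
    fx = penalized(f, g, x)
    for _ in range(iters):
        # Gaussian mutation of the current solution.
        y = [xi + sigma * rng.gauss(0, 1) for xi in x]
        fy = penalized(f, g, y)
        if fy <= fx:                      # elitist acceptance
            x, fx = y, fy
    return x

# Example: minimize the sphere function subject to x[0] >= 0.5.
sphere = lambda x: sum(xi ** 2 for xi in x)
constraint = lambda x: 0.5 - x[0]         # g(x) <= 0  <=>  x[0] >= 0.5
x_opt = one_plus_one_penalty(sphere, constraint, [2.0, 2.0])
```

Note that the penalty weight shifts the optimum of the penalized objective slightly into the infeasible region; barrier methods, mentioned above, instead keep iterates strictly feasible.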

EDAs constitute an important class of ROHs. In contrast to classical ROHs, which maintain an explicit multi-set of solutions, EDAs evolve a probabilistic model of the search space. Over the last few years, EDAs have received growing interest from the theoretical ROH community. Still, to date, all theoretical insights for EDAs refer to algorithms that build their probabilistic model under the assumption that the problem variables are independent of each other (so-called univariate models). This is far from how EDAs are applied in practice, where these algorithms are capable of modeling complex (multivariate) interactions among the problem variables.
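A univariate EDA of the kind the current theory covers fits in a few lines. The sketch below is an illustrative UMDA (univariate marginal distribution algorithm) maximizing OneMax: the probabilistic model is one independent Bernoulli frequency per bit, updated from the best samples of each generation. Parameter values are assumptions chosen for illustration.

```python
import random

def umda_onemax(n=30, pop=100, mu=30, iters=60, seed=1):
    """Univariate EDA (UMDA) maximizing OneMax (number of one-bits)."""
    rng = random.Random(seed)
    p = [0.5] * n                          # univariate model: one frequency per bit
    for _ in range(iters):
        # Sample a population from the product distribution.
        samples = [[1 if rng.random() < pi else 0 for pi in p]
                   for _ in range(pop)]
        samples.sort(key=sum, reverse=True)    # OneMax fitness = sum of bits
        best = samples[:mu]
        # Set each frequency to the empirical mean over the selected samples,
        # clamped to [1/n, 1 - 1/n] (a standard margin) to avoid fixation.
        for i in range(n):
            freq = sum(x[i] for x in best) / mu
            p[i] = min(1 - 1 / n, max(1 / n, freq))
    return p

p = umda_onemax()
```

A multivariate EDA would instead learn dependencies between bits, e.g. via a Bayesian network over the variables, which is precisely the setting the paragraph above identifies as theoretically unexplored.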

In the continuous domain, the main challenge is to analyze adaptive algorithms, in which the distribution for sampling candidate solutions is adapted in each iteration using advanced mechanisms. In the past few years, tremendous theoretical progress has been made in analyzing step-size adaptive algorithms. These results were proven using different approaches: finding a Lyapunov function in the state variables of the algorithm that satisfies a drift condition, using an ordinary differential equation (ODE) method, and analyzing the stability of the underlying Markov chains. This progress was made possible by the development of new theoretical tools that take into account the specifics of adaptive evolution strategies.
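A classic example of the step-size adaptive algorithms discussed above is the (1+1)-ES with the 1/5th success rule, sketched here on the sphere function. The adaptation factor is one common choice, not a prescription from the seminar; the state pair (x, sigma) is exactly the kind of Markov chain whose stability the analyses above study.

```python
import random

def one_plus_one_es(f, x, sigma=1.0, iters=500, seed=1):
    """(1+1)-ES with the 1/5th success rule (minimization): enlarge the
    step size sigma after a successful step and shrink it otherwise, so
    that sigma stabilizes where about one fifth of the offspring improve."""
    rng = random.Random(seed)
    fx = f(x)
    a = 1.5                                # adaptation factor (illustrative choice)
    for _ in range(iters):
        y = [xi + sigma * rng.gauss(0, 1) for xi in x]
        fy = f(y)
        if fy < fx:                        # success: accept and enlarge sigma
            x, fx = y, fy
            sigma *= a
        else:                              # failure: shrink sigma
            sigma *= a ** -0.25            # a * (a**-0.25)**4 = 1 at a 1/5 success rate
    return x

sphere = lambda x: sum(xi ** 2 for xi in x)
x_opt = one_plus_one_es(sphere, [5.0, 5.0])
```

The success and failure factors are balanced so that sigma is stationary exactly when one in five steps succeeds, which yields the log-linear convergence on the sphere that the theoretical tools mentioned above make rigorous.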

As with each seminar in this series, we look forward to exciting exchanges among researchers from the various invited areas; such exchanges have been instrumental in the past for substantial progress and unification in our field and beyond. With this foundation, this seminar offers a great opportunity to initiate collaborations connecting ROHs to neighboring fields, which could drastically reduce the gap between theory and practice.

Copyright Anne Auger, Tobias Glasmachers, Martin S. Krejca, and Johannes Lengler

Related Seminars
  • Dagstuhl Seminar 00071: Theory of Evolutionary Algorithms (2000-02-13 - 2000-02-18)
  • Dagstuhl Seminar 02031: Theory of Evolutionary Algorithms (2002-01-13 - 2002-01-18)
  • Dagstuhl Seminar 04081: Theory of Evolutionary Algorithms (2004-02-15 - 2004-02-20)
  • Dagstuhl Seminar 06061: Theory of Evolutionary Algorithms (2006-02-05 - 2006-02-10)
  • Dagstuhl Seminar 08051: Theory of Evolutionary Algorithms (2008-01-27 - 2008-02-01)
  • Dagstuhl Seminar 10361: Theory of Evolutionary Algorithms (2010-09-05 - 2010-09-10)
  • Dagstuhl Seminar 13271: Theory of Evolutionary Algorithms (2013-06-30 - 2013-07-05)
  • Dagstuhl Seminar 15211: Theory of Evolutionary Algorithms (2015-05-17 - 2015-05-22)
  • Dagstuhl Seminar 17191: Theory of Randomized Optimization Heuristics (2017-05-07 - 2017-05-12)
  • Dagstuhl Seminar 19431: Theory of Randomized Optimization Heuristics (2019-10-20 - 2019-10-25)
  • Dagstuhl Seminar 22081: Theory of Randomized Optimization Heuristics (2022-02-20 - 2022-02-25)

Classification
  • Neural and Evolutionary Computing

Keywords
  • black-box optimization heuristics
  • evolution strategies
  • genetic and evolutionary algorithms
  • runtime and convergence analysis
  • stochastic processes