https://www.dagstuhl.de/22081

February 20 – 25, 2022, Dagstuhl Seminar 22081

Theory of Randomized Optimization Heuristics

Organizers

Anne Auger (INRIA Saclay – Palaiseau, FR)
Carlos M. Fonseca (University of Coimbra, PT)
Tobias Friedrich (Hasso-Plattner-Institut, Universität Potsdam, DE)
Johannes Lengler (ETH Zürich, CH)


Motivation

Optimization is at the core of many applications in academia and industry. At the heart of an optimization problem are a search space and one or more objective functions to be optimized. In practical applications, the objective functions are often not explicitly known and therefore cannot be written analytically; instead, they are available to the optimizer only as a black box. In such situations, exact optimization methods such as (integer) linear programming or semi-definite programming cannot be applied, and one needs to resort to black-box or gray-box methods. Among those, randomized optimization heuristics (ROHs) are often the methods of choice when objective function gradients are not available, when the objective functions are not convex, when they are subject to noise or uncertainty, or when several conflicting objectives must be optimized simultaneously. In this last case, one speaks of multiobjective optimization, a field where ROHs are successful because they allow the Pareto front to be approximated as a whole, in contrast to more classical approaches based on objective aggregation, which find a single solution at a time.
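
As a purely illustrative sketch (not part of the original seminar description), the following minimal Python example shows one of the simplest ROHs, a (1+1) evolutionary algorithm, optimizing a pseudo-Boolean objective that is accessible only through evaluation. The function name one_plus_one_ea, the evaluation budget, and the OneMax objective used as a stand-in black box are all assumptions made for illustration.

    import random

    def one_plus_one_ea(f, n, budget=10_000):
        """(1+1) EA: keep one bit string, flip each bit with probability 1/n,
        and accept the offspring if it is at least as good (maximization)."""
        x = [random.randint(0, 1) for _ in range(n)]
        fx = f(x)  # the objective is used only as a black box
        for _ in range(budget):
            y = [1 - b if random.random() < 1.0 / n else b for b in x]
            fy = f(y)
            if fy >= fx:  # elitist acceptance
                x, fx = y, fy
        return x, fx

    # Stand-in black box (OneMax): number of ones in the bit string.
    best, value = one_plus_one_ea(lambda x: sum(x), n=50)
    print(value)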

The field of ROHs, including evolutionary algorithms (EAs), shares some common ground with the related field of Machine Learning (ML). Learning underlies many important algorithms, such as Estimation-of-Distribution Algorithms (EDAs), the Covariance Matrix Adaptation Evolution Strategy (CMA-ES), and Linkage-Tree Genetic Algorithms, and several components of these algorithms are strongly connected to (or directly use) machine learning techniques such as hierarchical clustering and expectation-maximization algorithms. Conversely, optimization is an important component of machine learning, for example for training neural networks or for hyperparameter tuning, which is a typical black-box problem.
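
To make this learning component concrete, here is a hedged sketch (my own illustration, not taken from the seminar text) of a univariate EDA in the spirit of UMDA: it maintains a vector of bit marginals, samples a population from it, and re-estimates the marginals from the best-ranked samples. The population size, selection size, and the clamping margins on the probabilities are illustrative assumptions.

    import random

    def umda(f, n, pop_size=100, mu=50, generations=200):
        """Univariate EDA (UMDA-style): learn independent bit marginals
        from the best mu individuals of each sampled population (maximization)."""
        p = [0.5] * n  # initial marginal probabilities
        for _ in range(generations):
            pop = [[1 if random.random() < p[i] else 0 for i in range(n)]
                   for _ in range(pop_size)]
            pop.sort(key=f, reverse=True)
            selected = pop[:mu]
            # Re-estimate each marginal, clamped away from 0 and 1 to keep diversity.
            p = [min(1 - 1.0 / n, max(1.0 / n, sum(ind[i] for ind in selected) / mu))
                 for i in range(n)]
        best = max(pop, key=f)
        return best, f(best)

    best, value = umda(lambda x: sum(x), n=50)
    print(value)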

Theoretical analysis of ROHs helps to quantify the performance of different methods on different fitness landscapes and classes of functions, in order to improve the understanding of the algorithms, to compare them, and to gain insight into the influence of different components and parameters. Important progress has been made in recent years towards understanding more and more complex algorithms that are increasingly close to those used in practice. Yet crucial aspects are still missing, such as a better understanding of the dynamics of set-based optimization methods in the context of multiobjective optimization, the ability to analyze algorithms that learn associations between variables, in both continuous and discrete domains, and the corresponding parameter adaptation mechanisms. These topics share some common ground with stochastic gradient descent (SGD) algorithms and will be a focus topic of the Dagstuhl Seminar.

In this context, the overall goal of this Dagstuhl Seminar is to drive research in the theory of ROHs towards the analysis of increasingly complex methods that are closer to state-of-the-art ROHs and, specifically, of methods that have an impact outside the Evolutionary Computation domain. In addition to specialists in the theory of ROHs, the seminar invites researchers with expertise in the design of EDAs, adaptive Evolution Strategies, and evolutionary multiobjective optimization (EMO) algorithms, as well as researchers from derivative-free optimization, Bayesian optimization, and optimization for machine learning, including adaptive stochastic gradient descent techniques.

Motivation text license
  Creative Commons BY 4.0
  Anne Auger, Carlos M. Fonseca, Tobias Friedrich, and Johannes Lengler


Classification

  • Neural and Evolutionary Computing

Keywords

  • Black-box optimization
  • Evolutionary and genetic algorithms
  • Stochastic gradient descent
  • Randomized search algorithms
  • Theoretical computer science
  • Derivative-free optimization

Documentation

Each Dagstuhl Seminar and Dagstuhl Perspectives Workshop is documented in the series Dagstuhl Reports. The seminar organizers, in cooperation with the collector, prepare a report that includes contributions from the participants' talks together with a summary of the seminar.

 


Dagstuhl's Impact

Please inform us if a publication has appeared as a result of your seminar. Such publications are listed in the category Dagstuhl's Impact and are presented on a special shelf on the ground floor of the library.

Publications

Furthermore, a comprehensive peer-reviewed collection of research papers can be published in the series Dagstuhl Follow-Ups.