https://www.dagstuhl.de/22081
February 20 – 25, 2022, Dagstuhl Seminar 22081
Theory of Randomized Optimization Heuristics
Organizers
Anne Auger (INRIA Saclay – Palaiseau, FR)
Carlos M. Fonseca (University of Coimbra, PT)
Tobias Friedrich (Hasso-Plattner-Institut, Universität Potsdam, DE)
Johannes Lengler (ETH Zürich, CH)
Contacts for this Dagstuhl Seminar
Simone Schilke for administrative matters
Andreas Dolzmann for scientific matters
Documents
List of Participants
Shared Documents
Schedule of the Dagstuhl Seminar [pdf]
Motivation
Optimization is at the core of many applications in academia and industry. At the heart of an optimization problem are a search space and one or more objective functions to be optimized. In practical applications, the objective functions are often not known explicitly and therefore cannot be written down analytically. Instead, they are available to the optimizer only as a black box. In such situations, exact optimization methods such as (integer) linear programming or semi-definite programming cannot be applied, and one needs to resort to black-box or gray-box methods. Among those, randomized optimization heuristics (ROHs) are often the methods of choice when objective function gradients are not available, when the objective functions are non-convex or subject to noise or uncertainty, or when several conflicting objectives must be optimized. In the latter case, one speaks of multiobjective optimization, a field where ROHs are successful because they allow the Pareto front to be approximated as a whole, in contrast to more classical approaches based on objective aggregation that find a single solution at a time.
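As a concrete illustration of how such a heuristic interacts with a black-box objective, the following sketch shows a simple (1+1) evolutionary algorithm that only ever queries objective values of candidate solutions; the OneMax objective, the evaluation budget, and the mutation rate are illustrative choices for this example, not prescriptions from the seminar.

```python
import random

def one_max(x):
    """Illustrative black-box objective: number of ones in a bit string."""
    return sum(x)

def one_plus_one_ea(f, n, budget=10_000, seed=0):
    """(1+1) EA sketch: keep a single parent, flip each bit independently with
    probability 1/n, and accept the offspring if it is at least as good.
    Only black-box evaluations of f are used."""
    rng = random.Random(seed)
    parent = [rng.randint(0, 1) for _ in range(n)]
    f_parent = f(parent)
    for _ in range(budget):
        offspring = [1 - bit if rng.random() < 1.0 / n else bit for bit in parent]
        f_offspring = f(offspring)
        if f_offspring >= f_parent:  # elitist acceptance
            parent, f_parent = offspring, f_offspring
    return parent, f_parent

best, value = one_plus_one_ea(one_max, n=50)
print(value)  # typically reaches the optimum 50 within the given budget
```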
The field of ROHs, including evolutionary algorithms (EAs), shares common ground with the related field of Machine Learning (ML). Learning underlies many important algorithms such as Estimation-of-Distribution Algorithms (EDAs), the Covariance Matrix Adaptation Evolution Strategy (CMA-ES), and Linkage-Tree Genetic Algorithms, and several components of these algorithms are strongly connected to (or directly use) machine learning techniques such as hierarchical clustering and expectation-maximization algorithms. Conversely, optimization is an important component of machine learning, e.g., for training neural networks or for hyperparameter tuning, which is a typical black-box problem. A minimal sketch of this learning aspect follows below.
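To make the learning aspect tangible, the sketch below follows a UMDA-style univariate EDA: in each generation it samples a population from a learned product distribution over bit strings, selects the best individuals, and re-estimates the marginal probabilities from them. The deliberately simple independent-bit model stands in for the richer models fitted by CMA-ES or linkage-learning GAs; all parameter values are illustrative assumptions.

```python
import random

def umda(f, n, pop_size=100, mu=50, generations=200, seed=0):
    """UMDA-style univariate EDA sketch: sample from a product distribution,
    select the best mu individuals, and re-estimate the marginals from them."""
    rng = random.Random(seed)
    p = [0.5] * n                # marginal probability of sampling a 1 at each position
    margin = 1.0 / n             # keep probabilities away from 0 and 1
    best, best_f = None, float("-inf")
    for _ in range(generations):
        pop = [[1 if rng.random() < p[i] else 0 for i in range(n)]
               for _ in range(pop_size)]
        pop.sort(key=f, reverse=True)
        top_f = f(pop[0])
        if top_f > best_f:
            best, best_f = pop[0], top_f
        selected = pop[:mu]
        for i in range(n):       # learning step: fit marginals to the selected samples
            freq = sum(ind[i] for ind in selected) / mu
            p[i] = min(1.0 - margin, max(margin, freq))
    return best, best_f

best, value = umda(lambda x: sum(x), n=50)  # OneMax again as a stand-in objective
print(value)
```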
Theoretical analysis of ROHs helps to quantify the performance of different methods on different fitness landscapes and classes of functions, in order to improve the understanding of the algorithms, compare them, and gain insight into the influence of different components and parameters. Important progress has been made in recent years in understanding increasingly complex algorithms that are ever closer to those used in practice. Yet crucial aspects are still missing, such as a better understanding of the dynamics of set-based optimization methods in the context of multiobjective optimization, the ability to analyze algorithms that learn associations between variables, in both continuous and discrete domains, and the corresponding parameter adaptation mechanisms. The latter share some common ground with stochastic gradient descent (SGD) algorithms and will be a focus topic of the Dagstuhl Seminar.
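A classical example of the kind of quantitative statement such analyses provide is the expected runtime (measured in objective-function evaluations) of the (1+1) EA on the OneMax function over bit strings of length n:

```latex
\mathbb{E}\!\left[T_{(1+1)\,\mathrm{EA},\ \mathrm{OneMax}}\right] \;=\; \Theta(n \log n)
```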
In this context, the overall goal of this Dagstuhl Seminar is to drive research in the theory of ROHs towards the analysis of increasingly complex methods that are closer to state-of-the-art ROHs and, specifically, of methods that have impact outside the Evolutionary Computation domain. In addition to specialists in the theory of ROHs, the seminar invites researchers with expertise in the design of EDAs, adaptive Evolution Strategies, and evolutionary multiobjective optimization (EMO) algorithms, as well as researchers from derivative-free optimization, Bayesian optimization, and optimization for machine learning, including adaptive stochastic gradient descent techniques.
Motivation text licensed under Creative Commons BY 4.0
Anne Auger, Carlos M. Fonseca, Tobias Friedrich, and Johannes Lengler
Dagstuhl Seminar Series
- 19431: "Theory of Randomized Optimization Heuristics" (2019)
- 17191: "Theory of Randomized Optimization Heuristics" (2017)
- 15211: "Theory of Evolutionary Algorithms" (2015)
- 13271: "Theory of Evolutionary Algorithms" (2013)
- 10361: "Theory of Evolutionary Algorithms" (2010)
- 08051: "Theory of Evolutionary Algorithms" (2008)
- 06061: "Theory of Evolutionary Algorithms" (2006)
- 04081: "Theory of Evolutionary Algorithms" (2004)
- 02031: "Theory of Evolutionary Algorithms" (2002)
- 00071: "Theory of Evolutionary Algorithms" (2000)
Classification
- Neural And Evolutionary Computing
Keywords
- Black-box optimization
- Evolutionary and genetic algorithms
- Stochastic gradient descent
- Randomized search algorithms
- Theoretical computer science
- Derivative-free optimization