https://www.dagstuhl.de/21081

February 21 – 26, 2021, Dagstuhl Seminar 21081

CANCELLED Challenges in Benchmarking Optimization Heuristics

Due to the Covid-19 pandemic, this seminar was cancelled.

Organizers

Anne Auger (INRIA Saclay – Palaiseau, FR)
Peter A. N. Bosman (CWI – Amsterdam, NL)
Pascal Kerschke (Universität Münster, DE)
Darrell Whitley (Colorado State University – Fort Collins, US)

For support, please contact

Dagstuhl Service Team

Motivation

Benchmarking, i.e., determining how well and/or how fast algorithms solve particular classes of problems, is a cornerstone of research on optimization algorithms. It is crucial in helping practitioners choose the best algorithm for a real-world application. Although such applications (e.g., from energy, healthcare, or logistics) pose many challenging optimization problems, algorithmic design research typically evaluates performance on sets of artificial problems. These problems are often designed from a theoretical perspective to study specific properties of algorithm and problem classes. However, some benchmarks lack any connection to real-world problems. Consequently, it can be difficult to extrapolate meaningful implications for the performance of these algorithms on real-world applications. It is therefore critical not only to ensure that well-known, easily accessible, and frequently used benchmarks are unbiased (w.r.t. structural properties), but also that benchmarks are aligned with real-world applications.

On the other hand, thousands of papers are published each year claiming to have developed a better optimization tool for some application domain. Yet the overwhelming majority of these enhancements are never adequately tested. Indeed, in many cases, the only individuals who will ever execute and evaluate a new algorithm are the authors who created it. This is a source of inefficiency in how research is executed and evaluated in the field of optimization, and it can only be resolved through proper benchmarking.

In this context, this Dagstuhl Seminar is centered on benchmarking. It brings together a select group of international experts with different backgrounds and from various application areas (e.g., computer science, machine learning, engineering, statistics, mathematics, operations research, medicine, and industrial applications) with the overall objective of

  1. analyzing, comparing, and improving the general understanding of the status quo and caveats of benchmarking in different subdomains of optimization and related research fields, and
  2. developing principles for improving our current benchmarks, as well as for designing new, real-world relevant, benchmark problems.

The seminar will address challenges that benchmarking still faces across various areas of optimization-related research. The following (non-exhaustive) list of topics will be at its core:

  • the integration and/or inclusion of (ideally, scalable) real-world problems
  • a benchmark’s capability to expose the generality vs. specificity trade-off of optimization heuristics
  • approaches to measure and demonstrate algorithmic performance
  • ways to measure problem characteristics by means of (problem-specific) features
  • novel visualization methods that improve our understanding of the optimization problem itself and/or the behavior of the optimization heuristic operating on it

To achieve a fruitful and productive collaboration among all participants, this seminar will follow a highly interactive format, complemented by a small number of invited presentations. The main focus will be on spotlight discussions and breakout sessions, with sufficient time reserved for informal discussions.

Motivation text license
  Creative Commons BY 3.0 DE
  Anne Auger, Peter A. N. Bosman, Pascal Kerschke, and Darrell Whitley

Classification

  • Data Structures And Algorithms
  • Neural And Evolutionary Computing
  • Performance

Keywords

  • Benchmarking
  • Optimization
  • Real-world applications
  • Design of search heuristics
  • Understanding problem complexity

Documentation

Each Dagstuhl Seminar and Dagstuhl Perspectives Workshop is documented in the series Dagstuhl Reports. The seminar organizers, in cooperation with the collector, prepare a report that includes contributions from the participants' talks together with a summary of the seminar.


Publications

Furthermore, a comprehensive peer-reviewed collection of research papers can be published in the series Dagstuhl Follow-Ups.

Dagstuhl's Impact

Please inform us when a publication resulting from your seminar appears. Such publications are listed in the category Dagstuhl's Impact and are presented on a special shelf on the ground floor of the library.