Dagstuhl Seminar 21081

Challenges in Benchmarking Optimization Heuristics (Cancelled)

(Feb 21 – Feb 26, 2021)

Permalink
Please use the following short URL to reference this page: https://www.dagstuhl.de/21081

Replacement
Dagstuhl Seminar 23251: Challenges in Benchmarking Optimization Heuristics (Jun 18 – Jun 23, 2023)

Organizers
  • Anne Auger (INRIA Saclay - Palaiseau, FR)
  • Peter A. N. Bosman (CWI - Amsterdam, NL)
  • Pascal Kerschke (Universität Münster, DE)
  • Darrell Whitley (Colorado State University - Fort Collins, US)

Motivation

Benchmarking, i.e., determining how well and/or how fast algorithms solve particular classes of problems, is a cornerstone of research on optimization algorithms. It is crucial in helping practitioners choose the best algorithm for a real-world application. Although such applications (e.g., from energy, healthcare, or logistics) pose many challenging optimization problems, algorithmic design research typically evaluates performance on sets of artificial problems. These problems are often designed from a theoretical perspective to study specific properties of algorithm and problem classes, and as a result some benchmarks lack connections to real-world problems. Consequently, extrapolating meaningful implications for the performance of these algorithms to real-world applications can be difficult. It is therefore critical to ensure not only that well-known, easily accessible, and frequently used benchmarks are unbiased (with respect to structural properties), but also that benchmarks are aligned with real-world applications.
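
To make this concrete, here is a minimal benchmarking sketch in Python (illustrative only, not part of the seminar materials): a baseline heuristic, random search, is run several times on two classic artificial test functions, and both the mean best objective value and the wall-clock time are recorded. The function names, search bounds, and evaluation budgets are assumptions chosen for the example.

import math
import random
import time

# Classic artificial test functions (both minimized; optimum value 0 at the origin).
def sphere(x):
    return sum(xi * xi for xi in x)

def rastrigin(x):
    return 10 * len(x) + sum(xi * xi - 10 * math.cos(2 * math.pi * xi) for xi in x)

# Baseline heuristic: sample uniformly at random within the box, keep the best value.
def random_search(f, dim, budget, low=-5.0, high=5.0):
    best = float("inf")
    for _ in range(budget):
        x = [random.uniform(low, high) for _ in range(dim)]
        best = min(best, f(x))
    return best

# Run each algorithm on each problem several times; report solution quality and runtime.
def benchmark(algorithms, problems, dim=10, budget=10000, runs=5):
    for alg_name, alg in algorithms.items():
        for prob_name, f in problems.items():
            start = time.perf_counter()
            results = [alg(f, dim, budget) for _ in range(runs)]
            elapsed = time.perf_counter() - start
            mean_best = sum(results) / len(results)
            print(f"{alg_name} on {prob_name}: mean best = {mean_best:.4f}, time = {elapsed:.2f}s")

benchmark({"random search": random_search},
          {"sphere": sphere, "rastrigin": rastrigin})

A full study would of course add stronger heuristics, fixed random seeds for reproducibility, and statistical comparison of the resulting distributions; the caveats discussed in this seminar concern exactly which problems and measures such a harness should contain.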

On the other hand, thousands of papers are published each year claiming to have developed a better optimization tool for some application domain, yet the overwhelming majority of these enhancements are never adequately tested. Indeed, in many cases it is highly likely that the only individuals who will ever run and evaluate a new algorithm are the authors who created it. This is a source of inefficiency in how research in the field of optimization is conducted and evaluated, and it can only be resolved through proper benchmarking.

In this context, the present Dagstuhl Seminar is centered on benchmarking. It brings together a select group of international experts with different backgrounds and from various application areas (e.g., computer science, machine learning, engineering, statistics, mathematics, operations research, medicine, as well as industrial applications) with the overall objective of

  1. analyzing, comparing and improving the general understanding of the status quo and caveats of benchmarking in different subdomains of optimization and related research fields, and
  2. developing principles for improving our current benchmarks, as well as for designing new, real-world relevant, benchmark problems.

The seminar will address challenges that benchmarking still faces across various areas of optimization-related research. The following (non-exhaustive) list of topics will be at the core of the seminar:

  • the integration and/or inclusion of (ideally, scalable) real-world problems
  • a benchmark’s capability to expose the generality vs. specificity trade-off of optimization heuristics
  • approaches to measure and demonstrate algorithmic performance
  • ways to measure problem characteristics by means of (problem-specific) features (see the sketch after this list)
  • novel visualization methods, which improve our understanding of the optimization problem itself and/or the behavior of the optimization heuristic that is operating on the problem
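
To illustrate the feature-based topic referenced above, the following sketch estimates the fitness-distance correlation, one classic landscape feature: the Pearson correlation between a sample point's objective value and its Euclidean distance to the optimum. This is a toy example under strong assumptions (it presumes the optimum is known, which is rarely true for real-world problems) and is not taken from the seminar materials.

import math
import random

# Estimate the fitness-distance correlation (FDC) of a minimization problem:
# the Pearson correlation between a point's objective value and its distance
# to the (assumed known) optimum. Values near 1 suggest a landscape whose
# objective values guide search toward the optimum; values near 0 or below
# suggest a deceptive or unstructured landscape.
def fitness_distance_correlation(f, optimum, dim, samples=1000, low=-5.0, high=5.0):
    xs = [[random.uniform(low, high) for _ in range(dim)] for _ in range(samples)]
    fit = [f(x) for x in xs]
    dist = [math.dist(x, optimum) for x in xs]
    mf = sum(fit) / samples
    md = sum(dist) / samples
    cov = sum((a - mf) * (b - md) for a, b in zip(fit, dist)) / samples
    sf = math.sqrt(sum((a - mf) ** 2 for a in fit) / samples)
    sd = math.sqrt(sum((b - md) ** 2 for b in dist) / samples)
    return cov / (sf * sd)

# The sphere function is easy in this sense: distance strongly predicts fitness,
# so the estimated FDC is close to 1.
sphere = lambda x: sum(xi * xi for xi in x)
print(fitness_distance_correlation(sphere, optimum=[0.0] * 10, dim=10))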

In order to achieve fruitful and productive collaboration among all participants, this seminar will follow a highly interactive format, complemented by a very limited number of invited presentations. The main focus will be on spotlight discussions and breakout sessions, while providing sufficient time for informal discussions.

Copyright Anne Auger, Peter A. N. Bosman, Pascal Kerschke, and Darrell Whitley

Participants
  • Anne Auger (INRIA Saclay - Palaiseau, FR) [dblp]
  • Peter A. N. Bosman (CWI - Amsterdam, NL) [dblp]
  • Pascal Kerschke (Universität Münster, DE) [dblp]
  • Darrell Whitley (Colorado State University - Fort Collins, US) [dblp]

Classification
  • Data Structures and Algorithms
  • Neural and Evolutionary Computing
  • Performance

Keywords
  • benchmarking
  • optimization
  • real-world applications
  • design of search heuristics
  • understanding problem complexity