Dagstuhl Seminar 26191

Large Language Models for Evolutionary Computation

(May 3 – May 8, 2026)

Permalink
Please use the following short URL to link to this page: https://www.dagstuhl.de/26191

Organizers
  • Zhaochun Ren (Leiden University, NL)
  • Roman Senkerik (Tomas Bata University in Zlín, CZ)
  • Niki van Stein (Leiden University, NL)
  • Qingfu Zhang (City University of Hong Kong, HK)

Contact

Motivation

Large Language Models (LLMs) have recently made extraordinary strides in generative AI, demonstrating remarkable abilities in tasks ranging from code completion to automated content creation. These advances also create exciting possibilities for the field of Evolutionary Computation (EC), where algorithm design and adaptation have traditionally relied on manual engineering of operators such as selection, mutation, and crossover. With LLMs, we now have a novel mechanism to automatically generate, refine, and optimize evolutionary algorithms themselves.

This seminar brings together researchers and practitioners from both Evolutionary Computation and Large Language Models to explore how the two fields can benefit from one another. By focusing on the direction of LLMs for Evolutionary Computation, we will delve more deeply into the potential of LLMs to augment, improve, or create new metaheuristics. Core topics include:

  1. Algorithm Synthesis and Refinement: Using LLMs to propose variants of evolutionary operators, adapt them to different problem classes, and iteratively improve algorithmic performance. For example, evolutionary techniques such as mutation, crossover, and selection can themselves be leveraged to make the LLM-driven search for new metaheuristics as efficient as possible (see the sketch after this list).
  2. Benchmarking and Evaluation: Developing robust metrics and methodologies for assessing LLM-generated algorithms, particularly ensuring consistent performance, reproducibility, and fair comparisons against established baselines, as well as assessing their novelty with respect to the state of the art.
  3. Practical Applications: Highlighting domains such as combinatorial optimization and complex engineering problems where LLM-driven evolutionary methods can yield innovative, real-world solutions. Examples include engineering design, automotive crashworthiness optimization, materials science, and many others.
  4. LLMs as EC: Exploring how LLMs can act as optimization engines themselves, leveraging in-context learning and generative capabilities to guide solution sampling and adaptation.
  5. Tools and Frameworks: Overview and presentation of available or recently introduced open-source tools and frameworks suitable for the iterative refinement, design, and benchmarking of LLM-EC algorithms.
  6. Ethical and Regulatory Considerations: Addressing responsible use of generative AI, with special attention to maintaining transparency, fairness, and compliance under evolving guidelines such as the EU Artificial Intelligence Act.
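
To make topic 1 concrete, the sketch below (referenced in that item) shows a minimal LLM-in-the-loop (1+lambda) search over candidate metaheuristics in Python. It is an illustrative sketch only, not the seminar's method: propose_variant is a placeholder for a real LLM call that would rewrite a candidate algorithm's code or configuration based on performance feedback, and the hill climber with a sphere benchmark is a toy stand-in for a proper evaluation pipeline.

# Minimal sketch of LLM-driven search for metaheuristic variants (topic 1).
# All names are illustrative; `propose_variant` stands in for an LLM call.

import random

def sphere(x):
    """Toy benchmark: minimise the sphere function."""
    return sum(xi * xi for xi in x)

def run_metaheuristic(config, budget=200, dim=5, seed=0):
    """Run a simple (1+1)-style hill climber described by `config`
    and return the best objective value found (lower is better)."""
    rng = random.Random(seed)
    x = [rng.uniform(-5, 5) for _ in range(dim)]
    fx = sphere(x)
    for _ in range(budget):
        y = [xi + rng.gauss(0, config["step_size"]) for xi in x]
        fy = sphere(y)
        if fy < fx:                      # greedy acceptance
            x, fx = y, fy
    return fx

def propose_variant(parent_config, rng):
    """Placeholder for an LLM call that proposes a modified algorithm.
    A real system would send the parent's code or description plus its
    performance feedback as a prompt and parse the model's reply."""
    child = dict(parent_config)
    child["step_size"] = max(1e-3, child["step_size"] * rng.uniform(0.5, 2.0))
    return child

def evolve_metaheuristics(generations=20, children_per_gen=4, seed=42):
    """(1+lambda) loop: the LLM acts as the variation operator, while
    classic elitist selection keeps the best-performing candidate."""
    rng = random.Random(seed)
    parent = {"step_size": 1.0}
    parent_fitness = run_metaheuristic(parent)
    for gen in range(generations):
        children = [propose_variant(parent, rng) for _ in range(children_per_gen)]
        scored = [(run_metaheuristic(c), c) for c in children]
        best_fitness, best_child = min(scored, key=lambda t: t[0])
        if best_fitness < parent_fitness:    # keep the better algorithm
            parent, parent_fitness = best_child, best_fitness
        print(f"gen {gen:02d}  best objective so far: {parent_fitness:.4g}")
    return parent, parent_fitness

if __name__ == "__main__":
    evolve_metaheuristics()

Richer systems would replace the single numeric parameter with full algorithm code generated and rewritten by the LLM, validated in a sandbox and benchmarked across problem instances, but the control flow (LLM-driven variation combined with classic selection) stays the same.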

By bringing together experts in LLM technologies, evolutionary optimization, machine learning, and related fields, this Dagstuhl Seminar aims to (1) consolidate the emerging understanding of how best to harness LLMs in EC and other related areas, (2) foster new collaborative projects around LLM-powered algorithm design, and (3) produce actionable guidelines for benchmarking and future research directions. Above all, the seminar will serve as a catalyst for the next generation of optimization methods in which evolutionary techniques and advanced language models combine to tackle some of the most challenging problems in science and industry.

Copyright Zhaochun Ren, Roman Senkerik, Niki van Stein, and Qingfu Zhang

Classification
  • Artificial Intelligence
  • Machine Learning
  • Neural and Evolutionary Computing

Keywords
  • large language models
  • automated algorithm design
  • evolutionary computation
  • optimization
  • iterative optimization heuristics