Dagstuhl Seminar 26191

Large Language Models for Evolutionary Computation

(May 03 – May 08, 2026)

Permalink
Please use the following short URL to reference this page: https://www.dagstuhl.de/26191

Organizers
  • Zhaochun Ren (Leiden University, NL)
  • Roman Senkerik (Tomas Bata University in Zlín, CZ)
  • Niki van Stein (Leiden University, NL)
  • Qingfu Zhang (City University of Hong Kong, HK)

Motivation

Large Language Models (LLMs) have recently made extraordinary strides in generative AI, demonstrating remarkable abilities in tasks ranging from code completion to automated content creation. These advances also create exciting possibilities for the field of Evolutionary Computation (EC), where algorithm design and adaptation have traditionally relied on manual engineering of operators such as selection, mutation, and crossover. With LLMs, we now have a novel mechanism to automatically generate, refine, and optimize evolutionary algorithms themselves.

This seminar brings together researchers and practitioners from both Evolutionary Computation and Large Language Models to explore how the two fields can benefit from one another. By focusing on the direction of LLMs for Evolutionary Computation, we will examine in depth the potential of LLMs to augment, improve, or create new metaheuristics. Core topics include:

  1. Algorithm Synthesis and Refinement: Using LLMs to propose variants of evolutionary operators, adapt them to different problem classes, and iteratively improve algorithmic performance. For example, evolutionary techniques such as mutation, crossover, and selection can themselves drive the LLM-based search for new metaheuristics and make it as efficient as possible (see the first code sketch below this list).
  2. Benchmarking and Evaluation: Developing robust metrics and methodologies for assessing LLM-generated algorithms, particularly for ensuring consistent performance, reproducibility, and fair comparisons against established baselines, as well as assessing their novelty with respect to the state of the art.
  3. Practical Applications: Highlighting domains such as combinatorial optimization and complex engineering problems where LLM-driven evolutionary methods can yield innovative, real-world solutions, for example engineering design, automotive crashworthiness optimization, materials science, and many others.
  4. LLMs as EC: Exploring how LLMs can act as optimization engines themselves, leveraging in-context learning and generative capabilities to guide solution sampling and adaptation (see the second code sketch below this list).
  5. Tools and Frameworks: An overview and presentation of available or recently introduced open-source tools and frameworks suitable for the iterative refinement, design, and benchmarking of LLM-EC algorithms.
  6. Ethical and Regulatory Considerations: Addressing responsible use of generative AI, with special attention to maintaining transparency, fairness, and compliance under evolving guidelines such as the EU Artificial Intelligence Act.
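To make topic 1 concrete, the following is a minimal, illustrative sketch (in Python) of an evolutionary loop that operates on LLM-generated metaheuristic code. It is not the organizers' method or any particular framework's API: query_llm and evaluate are hypothetical placeholders, stubbed here so the loop runs end to end, whereas a real system would call an actual LLM service and score candidates on a benchmark suite.

import random
from dataclasses import dataclass

@dataclass
class Candidate:
    code: str       # source text of a generated metaheuristic (treated as opaque here)
    fitness: float  # score from evaluating that metaheuristic

def query_llm(prompt: str) -> str:
    # Placeholder: a real system would call an LLM API and parse code out of the reply.
    return f"# heuristic derived from a prompt of length {len(prompt)}"

def evaluate(code: str) -> float:
    # Placeholder: a real system would run the generated algorithm on a benchmark
    # suite (e.g., BBOB) and return an aggregate performance score.
    return random.random()

def mutate(parent: Candidate) -> Candidate:
    code = query_llm(f"Improve this metaheuristic:\n{parent.code}")
    return Candidate(code, evaluate(code))

def crossover(a: Candidate, b: Candidate) -> Candidate:
    code = query_llm(f"Combine these two metaheuristics:\n{a.code}\n---\n{b.code}")
    return Candidate(code, evaluate(code))

def evolve(pop_size: int = 8, generations: int = 10) -> Candidate:
    # Selection, crossover, and mutation act on algorithm *code*, with the LLM
    # serving as the variation operator.
    population = []
    for _ in range(pop_size):
        code = query_llm("Propose a novel metaheuristic for continuous optimization.")
        population.append(Candidate(code, evaluate(code)))
    for _ in range(generations):
        parents = sorted(population, key=lambda c: c.fitness, reverse=True)[: pop_size // 2]
        offspring = []
        while len(offspring) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            offspring.append(mutate(crossover(a, b)))
        population = parents + offspring
    return max(population, key=lambda c: c.fitness)

if __name__ == "__main__":
    best = evolve()
    print(best.fitness)

The design point illustrated is that the evolutionary operators work on algorithm source code rather than on numeric solution vectors, which is what makes the search for new metaheuristics itself evolutionary.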
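Topic 4 (LLMs as EC) can be sketched in the same hedged spirit: the model itself proposes candidate solutions, conditioned on the trajectory of past evaluations carried in the prompt. Again, query_llm is a hypothetical stub and the sphere objective is only a toy example, not a recommended setup.

import random

def query_llm(prompt: str) -> str:
    # Placeholder: a real call would return a model-proposed solution extracted from
    # the reply; this stub just samples a random candidate, ignoring the prompt.
    return ",".join(str(round(random.uniform(-5, 5), 2)) for _ in range(2))

def objective(x: list[float]) -> float:
    # Toy objective (sphere function); lower is better.
    return sum(v * v for v in x)

def llm_optimize(steps: int = 20) -> tuple[list[float], float]:
    history: list[tuple[list[float], float]] = []
    for _ in range(steps):
        # In-context learning: show the trajectory so far and ask for an improvement.
        trajectory = "\n".join(f"x={x} score={s:.3f}" for x, s in history[-10:])
        prompt = (
            "Previous evaluations (lower score is better):\n"
            f"{trajectory}\n"
            "Propose a better x as comma-separated floats."
        )
        proposal = [float(v) for v in query_llm(prompt).split(",")]
        history.append((proposal, objective(proposal)))
    return min(history, key=lambda pair: pair[1])

if __name__ == "__main__":
    best_x, best_score = llm_optimize()
    print(best_x, best_score)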

By bringing together experts in LLM technologies, evolutionary optimization, machine learning, and related fields, this Dagstuhl Seminar aims to (1) consolidate the emerging understanding of how best to harness LLMs in EC and other related areas, (2) foster new collaborative projects around LLM-powered algorithm design, and (3) produce actionable guidelines for benchmarking and future research directions. Above all, the seminar will serve as a catalyst for the next generation of optimization methods in which evolutionary techniques and advanced language models combine to tackle some of the most challenging problems in science and industry.

Copyright Zhaochun Ren, Roman Senkerik, Niki van Stein, and Qingfu Zhang

Classification
  • Artificial Intelligence
  • Machine Learning
  • Neural and Evolutionary Computing

Keywords
  • large language models
  • automated algorithm design
  • evolutionary computation
  • optimization
  • iterative optimization heuristics