
### Parameterized complexity and approximation algorithms

##### Organizers
• (MIT - Cambridge, US)
• (AT&T Labs Research - Florham Park, US)
• (Tel Aviv University, IL)

##### Summary

Many of the computational problems that arise in practice are optimization problems: the task is to find a solution whose cost, quality, size, profit, or some other measure is as large or as small as possible. The NP-hardness of an optimization problem implies that, unless P = NP, there is no polynomial-time algorithm that finds the exact optimum. Various approaches have been proposed in the literature to cope with NP-hard problems. When designing approximation algorithms, we relax the requirement that the algorithm produce an optimum solution; instead, the aim is a polynomial-time algorithm whose solution is not necessarily optimal but comes with a worst-case guarantee on its quality.
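As an illustration (not part of the seminar material), here is a minimal sketch of the classic maximal-matching 2-approximation for minimum vertex cover, a textbook example of a polynomial-time algorithm with a worst-case quality bound:

```python
def vertex_cover_2approx(edges):
    """Greedy maximal-matching 2-approximation for minimum vertex cover.

    Repeatedly pick an uncovered edge and add both of its endpoints to the
    cover. The chosen edges form a matching, so any optimal cover contains
    at least one endpoint of each; hence |cover| <= 2 * OPT.
    """
    cover = set()
    for u, v in edges:
        if u not in cover and v not in cover:
            cover.update((u, v))
    return cover
```

The algorithm runs in linear time, and the factor 2 is tight in the worst case (e.g., on a graph consisting of disjoint edges).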

In parameterized complexity the running time is analyzed in finer detail: instead of expressing it as a function of the input size, one or more parameters of the input instance are defined, and we investigate the effect of these parameters on the running time. The goal is to design algorithms that work efficiently if the parameters of the input instance are small (even if the size of the input is large). More precisely, we say that a problem is fixed-parameter tractable (FPT) with parameter k if the problem can be solved in time f(k) · n^c for some function f and constant c. That is, our goal is to design algorithms that are polynomial in n and exponential only in the parameter k. The motivation behind this definition is that in practice we do not have to be able to solve the problem for any possible input: we might be able to define some parameter k that is typically “small” for the instances we encounter.
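To make the f(k) · n^c form concrete, the following sketch (an illustration, not seminar material) decides Vertex Cover parameterized by the solution size k via a bounded search tree, one of the standard FPT techniques:

```python
def vc_branch(edges, k):
    """Decide whether the graph has a vertex cover of size <= k.

    Classic bounded search tree: pick any uncovered edge (u, v); at least
    one endpoint must be in the cover, so branch on including u or v.
    The recursion has depth <= k and two branches per level, giving
    O(2^k * m) time, i.e. FPT with parameter k.
    """
    if not edges:
        return True          # all edges covered
    if k == 0:
        return False         # uncovered edges remain, budget exhausted
    u, v = edges[0]
    # Branch 1: put u in the cover and delete all edges incident to u.
    rest_u = [e for e in edges if u not in e]
    # Branch 2: symmetric, put v in the cover.
    rest_v = [e for e in edges if v not in e]
    return vc_branch(rest_u, k - 1) or vc_branch(rest_v, k - 1)
```

For fixed k the exponential factor 2^k is a constant, so the running time is linear in the number of edges, matching the f(k) · n^c pattern.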

Until very recently, approximation algorithms and parameterized complexity have been considered two different approaches that have very little to do with each other. Indeed, the methodologies of the two fields are very different: the design of approximation algorithms has its own distinctive set of tools, such as linear programming, greedy algorithms, and probabilistic methods, while parameterized complexity uses a different set of techniques: kernelization, bounded search trees, and extremal combinatorics. However, in the past few years, several connections between the two fields have been identified that are worth investigating.
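Of the parameterized techniques just mentioned, kernelization is the one most discussed at the seminar. As a hedged illustration (not taken from any talk), here is a sketch of the classic Buss kernelization for Vertex Cover: it shrinks the instance to size bounded by a function of k alone before any exhaustive search:

```python
from collections import defaultdict

def buss_kernel(edges, k):
    """Buss kernelization for Vertex Cover (a sketch).

    Rule: a vertex of degree > k must be in any cover of size <= k
    (otherwise all of its > k neighbors would be needed). Apply the rule
    exhaustively; if more than k^2 edges then remain, the answer is "no"
    (k vertices of degree <= k cover at most k^2 edges). Otherwise the
    reduced instance is a kernel with <= k^2 edges.

    Returns (reduced_edges, remaining_k), or None if there is no cover.
    """
    changed = True
    while changed:
        changed = False
        deg = defaultdict(int)
        for u, v in edges:
            deg[u] += 1
            deg[v] += 1
        for x, d in deg.items():
            if d > k:
                # x is forced into the cover: delete it and spend budget.
                edges = [e for e in edges if x not in e]
                k -= 1
                changed = True
                break
        if k < 0:
            return None
    if len(edges) > k * k:
        return None          # too many edges for any size-k cover
    return edges, k
```

The point of the example is the shape of the guarantee: the reduced instance depends only on k, so any brute-force search afterwards contributes only the f(k) factor of an FPT running time.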

During the four days of the seminar, 23 talks were given by the participants. Five of these talks were 60-minute surveys on various topics: Dániel Marx talked about several existing connections between approximation algorithms and fixed-parameter algorithms; Gregory Gutin talked about the rapidly growing area of parameterization above guaranteed values; Erik Demaine talked about the recent area of bidimensionality, relevant to both approximation and fixed-parameter algorithms; Guy Kortsarz talked about relevant problems in wireless network design; and Daniel Lokshtanov talked about lower bounds on kernelization. As an additional highlight of the seminar, Holger Dell presented in detail his exciting new result about sparsification and its applications to kernel lower bounds.

It is increasingly clear that kernelization, both upper bounds and lower bounds, is becoming a central focus of fixed-parameter algorithms. The talks of Lokshtanov and Dell have shown that kernelization lower bounds are a rich topic with much deep work still to be done, and the talks of Demaine and Bodlaender have shown that “metakernelization” results are possible for wide ranges of problems. It is expected that in the next few years there will be substantial further progress on the topic of kernelization.

The seminar successfully brought together both experts and newcomers from the two fields of approximation algorithms and fixed-parameter algorithms, with many interesting interactions. The talks left plenty of time for discussion in the afternoon. An open problem session was held on Monday, and problems raised there were discussed by different groups throughout the seminar.

##### Participants
• Faisal N. Abu-Khzam (Lebanese American University, LB) [dblp]
• Isolde Adler (Goethe-Universität - Frankfurt a. M., DE) [dblp]
• Hans L. Bodlaender (Utrecht University, NL) [dblp]
• Sergio Cabello (University of Ljubljana, SI) [dblp]
• Yijia Chen (Shanghai Jiao Tong University, CN) [dblp]
• Frank Dehne (Carleton University - Ottawa, CA) [dblp]
• Holger Dell (HU Berlin, DE) [dblp]
• Erik D. Demaine (MIT - Cambridge, US) [dblp]
• Frederic Dorn (University of Bergen, NO) [dblp]
• Michael R. Fellows (University of Newcastle, AU) [dblp]
• Henning Fernau (Universität Trier, DE) [dblp]
• Fedor V. Fomin (University of Bergen, NO) [dblp]
• Gregory Z. Gutin (Royal Holloway University of London, GB) [dblp]
• MohammadTaghi Hajiaghayi (AT&T Labs Research - Florham Park, US) [dblp]
• Danny Hermelin (University of Haifa, IL) [dblp]
• Xiuzhen Huang (Arkansas State University - Jonesboro, US)
• Klaus Jansen (Universität Kiel, DE) [dblp]
• Iyad A. Kanj (DePaul University - Chicago, US) [dblp]
• Guy Kortsarz (Rutgers University - Camden, US) [dblp]
• Stefan Kratsch (MPI für Informatik - Saarbrücken, DE) [dblp]
• Stephan Kreutzer (University of Oxford, GB) [dblp]
• Andrei Krokhin (Durham University, GB) [dblp]
• Michael A. Langston (University of Tennessee, US)
• Daniel Lokshtanov (University of Bergen, NO) [dblp]
• Dániel Marx (Tel Aviv University, IL) [dblp]
• Julian Mestre (MPI für Informatik - Saarbrücken, DE)
• Tobias Mömke (ETH Zürich, CH)
• Rolf Niedermeier (Universität Jena, DE) [dblp]
• Sang-il Oum (KAIST - Daejeon, KR) [dblp]
• Mihai Patrascu (AT&T Labs Research - Florham Park, US)
• Christophe Paul (University of Montpellier 2, FR) [dblp]
• Geevarghese Philip (The Institute of Mathematical Sciences, IN) [dblp]
• Frances A. Rosamond (University of Newcastle, AU) [dblp]
• Peter Rossmanith (RWTH Aachen, DE) [dblp]
• Saket Saurabh (University of Bergen, NO) [dblp]
• Ildikó Schlotter (Budapest University of Technology & Economics, HU) [dblp]
• Ulrike Stege (University of Victoria, CA) [dblp]
• Monika Steinova (ETH Zürich, CH)
• Kunal Talwar (Microsoft Corp. - Mountain View, US)
• Siamak Tazari (HU Berlin, DE) [dblp]
• Dimitrios M. Thilikos (University of Athens, GR) [dblp]
• Erik Jan van Leeuwen (TU Eindhoven, NL) [dblp]
• Magnus Wahlström (MPI für Informatik - Saarbrücken, DE) [dblp]
• Gerhard J. Woeginger (TU Eindhoven, NL) [dblp]

##### Related Seminars
• Dagstuhl Seminar 01311: Parameterized Complexity (2001-07-29 - 2001-08-03)

##### Classification
• data structures
• algorithms
• complexity

##### Keywords
• approximation algorithms
• parameterized complexity
• fixed-parameter tractability