
Dagstuhl Seminar 18112

Coding Theory for Inference, Learning and Optimization

( Mar 11 – Mar 16, 2018 )


Coding theory has recently found new applications in areas such as distributed machine learning, dimension reduction, and a variety of statistical problems involving estimation and inference. In machine learning applications that use large-scale data, it is desirable to communicate the results of distributed computations in an efficient and robust manner. In dimension reduction applications, the pseudorandom properties of algebraic codes may be used to construct projection matrices that are deterministic and facilitate algorithmic efficiency. Finally, relationships that have been forged between coding theory and problems in theoretical computer science, such as k-SAT or the planted clique problem, lead to a new interpretation of the sharp thresholds encountered in these settings in terms of thresholds in channel coding theory.
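To give a flavor of the deterministic projections mentioned above, one can take rows of a Sylvester-type Hadamard matrix (equivalently, ±1-mapped codewords of a first-order Reed–Muller code) as a projection matrix. This is a hedged illustration, not a construction from the seminar itself; the dimensions are chosen purely for demonstration.

```python
import numpy as np

def hadamard(n):
    """Sylvester construction of an n x n Hadamard matrix (n a power of 2)."""
    H = np.array([[1.0]])
    while H.shape[0] < n:
        H = np.block([[H, H], [H, -H]])
    return H

# Keep 4 of the 8 rows, normalized so the rows are orthonormal.
# The resulting projection is fully deterministic -- no random entries.
H = hadamard(8)
P = H[:4] / np.sqrt(8)
v = np.arange(8, dtype=float)
low_dim = P @ v                          # deterministic dimension reduction
assert np.allclose(P @ P.T, np.eye(4))   # rows are orthonormal by construction
```

The structure of the matrix (rather than i.i.d. randomness) is what enables both determinism and fast multiplication in practice.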

The aim of this Dagstuhl Seminar is to draw together researchers from industry and academia who are working in coding theory, particularly in these different (and somewhat disparate) application areas of machine learning and inference. The discussions and collaborations facilitated by this seminar will help spark new ideas about how coding theory may be used to improve and inform modern techniques for data analytics.

Copyright Po-Ling Loh, Arya Mazumdar, Dimitris Papailiopoulos, and Rüdiger Urbanke


Codes are widely used in engineering applications to offer reliability and fault tolerance. The high-level idea of coding is to exploit redundancy in order to create robustness against system noise. The theoretical properties of codes have been studied for decades, both from a purely mathematical point of view and in various engineering contexts. The latter have resulted in constructions that have been incorporated into our daily lives: No storage device, cell phone transmission, or Wi-Fi connection would be possible without well-constructed codes.
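The redundancy idea can be made concrete with the simplest possible code, 3-fold repetition: each bit is sent three times and decoded by majority vote, correcting any single bit flip per block. This is a toy sketch for illustration only; practical systems use far stronger codes.

```python
def encode(bits):
    """Repeat each bit three times (rate-1/3 repetition code)."""
    return [b for b in bits for _ in range(3)]

def decode(received):
    """Majority-vote each block of three received bits."""
    return [int(sum(received[i:i + 3]) >= 2)
            for i in range(0, len(received), 3)]

message = [1, 0, 1, 1]
codeword = encode(message)          # [1,1,1, 0,0,0, 1,1,1, 1,1,1]
codeword[4] ^= 1                    # channel noise flips one bit
assert decode(codeword) == message  # the flip is corrected by majority vote
```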

Recent research has connected concepts in coding theory to non-traditional applications in learning, computation and inference, where codes have been used to design more efficient inference algorithms and build robust, large-scale, distributed computational pipelines. Moreover, ideas derived from Shannon theory and the algebraic properties of random codes have resulted in novel research that sheds light on fundamental phase transition phenomena in several long-standing combinatorial and graph-theoretic problems.
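One way codes enter distributed computational pipelines, sketched here under assumed dimensions and a hypothetical two-worker split, is to add a parity worker so that a distributed matrix–vector product can tolerate one straggling machine. This mirrors the MDS-coded-computation idea in spirit, not any specific scheme from the seminar.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 3))
x = rng.standard_normal(3)

# Split A row-wise across two workers; a third "parity" worker
# computes (A1 + A2) @ x. Any two of the three results recover A @ x.
A1, A2 = A[:2], A[2:]
tasks = [A1 @ x, A2 @ x, (A1 + A2) @ x]

# Suppose worker 0 straggles: recover its share from the parity result.
y2, parity = tasks[1], tasks[2]
y1 = parity - y2
assert np.allclose(np.concatenate([y1, y2]), A @ x)
```

The parity worker plays the role of a redundant symbol in an erasure code, with a straggler treated as an erasure.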

The main goal of our seminar was to accelerate research in the growing field of coding theory for computation and learning, and maximize the transformative role of codes in non-traditional application areas. The seminar brought together 22 researchers from across the world specializing in information theory, machine learning, theoretical computer science, optimization, and statistics. The schedule for each day included a tutorial talk by a senior researcher, followed by shorter talks by participants on recent or ongoing work. The afternoons were devoted to informal breakout sessions for groups to discuss open questions. Two of the larger breakout sessions focused on distributed optimization and group testing.

Seminar participants reported that they enjoyed hearing about new ideas, as well as delving into deeper technical discussions about open problems in coding theory. Some topics deserving special mention include the use of techniques from statistical mechanics; locally decodable and recoverable codes; submodular function optimization; hypergraph clustering; private information retrieval; and contagion on graphs. All participants valued the ample time for discussions between and after talks, as it provided a fruitful atmosphere for collaborating on new topics.
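As a flavor of the group testing topic from one of the breakout sessions, here is a hedged toy example of nonadaptive pooling with the simple COMP decoder (any item appearing in a negative test is declared non-defective). The test matrix and defective set are invented for illustration.

```python
import numpy as np

# 4 pooled tests over 5 items: M[i, j] = 1 iff item j is in test i.
M = np.array([[1, 1, 0, 0, 0],
              [0, 1, 1, 0, 0],
              [0, 0, 1, 1, 0],
              [1, 0, 0, 0, 1]])
defective = {1}                                     # ground truth (assumed)

# A test is positive iff it contains at least one defective item.
results = M[:, list(defective)].sum(axis=1) > 0

# COMP decoding: declare defective any item that appears only in positive tests.
declared = {j for j in range(5)
            if all(results[i] for i in range(4) if M[i, j])}
assert declared == defective
```

With enough well-chosen pools, COMP identifies a small defective set using far fewer tests than testing every item individually.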

Copyright Po-Ling Loh, Arya Mazumdar, Dimitris Papailiopoulos, and Rüdiger Urbanke

  • Dimitris Achlioptas (University of California - Santa Cruz, US) [dblp]
  • Alexander Barg (University of Maryland - College Park, US) [dblp]
  • Martin Bossert (Universität Ulm, DE) [dblp]
  • Elette Boyle (The Interdisciplinary Center - Herzliya, IL) [dblp]
  • Amin Coja-Oghlan (Goethe-Universität - Frankfurt am Main, DE) [dblp]
  • Anna Gál (University of Texas - Austin, US) [dblp]
  • Venkatesan Guruswami (Carnegie Mellon University - Pittsburgh, US) [dblp]
  • Hamed S. Hassani (University of Pennsylvania - Philadelphia, US) [dblp]
  • Sihuang Hu (RWTH Aachen, DE) [dblp]
  • Sidharth Jaggi (The Chinese University of Hong Kong, HK) [dblp]
  • Marc Lelarge (ENS - Paris, FR) [dblp]
  • Po-Ling Loh (University of Wisconsin - Madison, US) [dblp]
  • Nicolas Macris (EPFL - Lausanne, CH) [dblp]
  • Arya Mazumdar (University of Massachusetts - Amherst, US) [dblp]
  • Olgica Milenkovic (University of Illinois - Urbana Champaign, US) [dblp]
  • Dimitris Papailiopoulos (University of Wisconsin - Madison, US) [dblp]
  • Ankit Singh Rawat (MIT - Cambridge, US) [dblp]
  • Changho Suh (KAIST - Daejeon, KR) [dblp]
  • Itzhak Tamo (Tel Aviv University, IL) [dblp]
  • Rüdiger Urbanke (EPFL - Lausanne, CH) [dblp]
  • Pascal Vontobel (The Chinese University of Hong Kong, HK) [dblp]
  • Mary Wootters (Stanford University, US) [dblp]

  • artificial intelligence / robotics
  • data structures / algorithms / complexity
  • optimization / scheduling

  • coding theory
  • distributed optimization
  • machine learning
  • sparse recovery
  • threshold phenomena