https://www.dagstuhl.de/01301

July 22 – 27, 2001, Dagstuhl Seminar 01301

Inference Principles and Model Selection

Organizers

Joachim M. Buhmann (ETH Zürich, CH)
Bernhard Schölkopf (MPI für biologische Kybernetik – Tübingen, DE)


The Dagstuhl Foundation gratefully acknowledges the donation from:

  • Biowulf Technologies, Savannah, USA

Documents

  • List of Participants
  • External Homepage
  • Dagstuhl-Seminar-Report 315

Inference and induction denote the process of inferring an underlying dependence from empirical observations. They have been of interest to philosophy and to scientific endeavour since ancient times. A number of statistical frameworks have been developed to model this process. Examples are the Bayesian approach of retaining all plausible models and averaging them according to a posterior distribution, and the Occam's razor approach of searching for the simplest explanation of the observations, as implemented by MDL (minimum description length) and Vapnik-Chervonenkis theory. Despite the superficial differences, these approaches share common ideas, which makes it worthwhile to record a snapshot of where we are, where we want to go, and how we plan to get there.
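
In generic notation (a minimal sketch, with M ranging over candidate models, D the observed data, and (x, y) a new input–output pair; the symbols are illustrative and not taken from any particular contribution), the Bayesian route averages predictions over the posterior, while MDL selects the single model with the shortest two-part description of the data:

    p(y \mid x, D) = \sum_{M} p(y \mid x, M, D)\, p(M \mid D),
    \qquad p(M \mid D) \propto p(D \mid M)\, p(M)

    \hat{M}_{\mathrm{MDL}} = \arg\min_{M} \bigl[ L(M) + L(D \mid M) \bigr]

Here L(M) is the code length needed to describe a model and L(D | M) the code length needed to describe the data given that model; shorter total descriptions correspond to simpler explanations in the sense of Occam's razor, while VC theory plays an analogous role by bounding generalization error in terms of the capacity of the model class.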

At the same time, technological applications of induction in machine learning systems have been explored extensively from an algorithmic point of view, highlighting issues that had typically not been the concern of philosophy. Practical inference has to cope with noise in the data and with overfitting, i.e. extracting more structure from the data than the data actually support. Algorithms have to select a model from a large set of potential interpretations of the data, and concepts such as model averaging, noise robustness, overfitting and capacity play a central role in many of the theories.
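
As a toy illustration of these issues, consider selecting the degree of a polynomial fit (a minimal sketch in Python; the data-generating function, noise level and model class are arbitrary choices made for illustration): training error keeps decreasing as the degree grows, while error on held-out data typically turns upward once the fit starts tracking noise, and the selected model is the one minimizing the held-out error.

    import numpy as np

    rng = np.random.default_rng(0)

    # Toy data: noisy observations of a smooth target (arbitrary illustrative choice)
    x = np.linspace(-1.0, 1.0, 40)
    y = np.sin(2.0 * np.pi * x) + 0.3 * rng.standard_normal(x.size)

    # Random split into training points and a held-out validation set
    idx = rng.permutation(x.size)
    train, val = idx[:30], idx[30:]

    def errors(degree):
        # Fit a polynomial of the given degree on the training split and
        # return the mean squared error on the training and validation splits.
        coeffs = np.polyfit(x[train], y[train], degree)
        def mse(i):
            return float(np.mean((np.polyval(coeffs, x[i]) - y[i]) ** 2))
        return mse(train), mse(val)

    for degree in range(1, 13):
        tr, va = errors(degree)
        print(f"degree {degree:2d}  train MSE {tr:.4f}  val MSE {va:.4f}")

    # Model selection: choose the degree with the smallest validation error;
    # the smallest training error would always favour the most complex model.
    best = min(range(1, 13), key=lambda d: errors(d)[1])
    print("selected degree:", best)

Replacing the held-out criterion by a complexity penalty added to the training error, as in MDL or structural risk minimization, is the other standard route to the same kind of decision.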

The aims of the seminar revolve around deepening our understanding of the following questions:

  • Can the different formalizations of inference be placed in a broader framework and perhaps seen as different views of a unified theory?
  • Do the recent developments shed new light on the question of induction as studied historically?
  • Are there notions of inference studied in philosophy that machine learning has overlooked?

The workshop focuses on the long-term perspective of Machine Learning and its impact on Computer Science, Statistics, Mathematics and Philosophy, rather than on the latest implementations or sophisticated technical details. Participants are encouraged to stimulate the discussion with a single slide that contains what they consider the crucial open problem, insight, or idea.

Focus Topics, Tutorials and Contributions:

Each half day will be devoted to one topic, starting with a tutorial. Attendees will then have the opportunity to contribute to discussions, give short impromptu talks, or present open problems. Additional (demand-driven) sessions may be held in the evenings. The following sessions, each opening with a one-hour tutorial, have been planned for the half days:

Foundations of Inference

  • Bayesian Inference
  • Model Averaging and PAC-Bayesian inference
  • Statistical Mechanics Approaches
  • Structural Risk Minimization
  • Density Estimation
  • Online learning
  • Open inference problems in bioinformatics
  • Regularization theory
  • Reinforcement Learning

Documentation

Each Dagstuhl Seminar and Dagstuhl Perspectives Workshop is documented in the series Dagstuhl Reports. The seminar organizers, in cooperation with the collector, prepare a report that includes contributions from the participants' talks together with a summary of the seminar.

 


Publications

Furthermore, a comprehensive peer-reviewed collection of research papers can be published in the series Dagstuhl Follow-Ups.

Dagstuhl's Impact

Please inform us when a publication resulting from your seminar has appeared. These publications are listed in the category Dagstuhl's Impact and are presented on a special shelf on the ground floor of the library.

NSF young researcher support