29.10.17 - 03.11.17, Seminar 17442

Towards Cross-Domain Performance Modeling and Prediction: IR/RecSys/NLP

This seminar description was published on our web pages before the seminar and used in the invitation to the seminar.


Information systems that manage, access, extract, and process unstructured information typically deal with vague and implicit information needs, natural language, and complex user tasks. Examples of such systems are information retrieval (IR) systems, search engines, recommender systems (RecSys), machine translation, and so forth. The discipline behind these systems differs from other areas of computer science, and from other fields of science and engineering in general, in that it lacks models that allow us to predict system performance in a specific operational context and to design systems ahead of time to achieve a desired level of effectiveness. The information systems we want to look at involve domains characterized by complex algorithms that depend on many parameters and are confronted with uncertainty both in the information to be processed and in the needs to be addressed; there, the lack of predictive models is sidestepped by massive trials of as many parameter combinations as possible.

These approaches, which rely on massive experimentation, the construction of testbeds, and heuristics, can neither scale indefinitely as the complexity of systems and tasks increases nor be applied outside the context of big Internet companies, which still have the resources to cope with them.

This Dagstuhl Perspectives Workshop will address the problem of modeling and predicting the performance of information retrieval systems, recommender systems, and natural language processing (NLP) systems. This is an important open issue common to all three neighboring fields, and it prevents both a deep scientific understanding and an effective engineering of such systems. Progress in modeling and prediction would allow us to better design such systems to achieve a desired performance under given operational conditions.

We believe that bringing together these three communities has strong potential for advancing research on predictive models of performance. Indeed, the idea of predictive modeling is not new: each discipline has articulated the need and taken steps in this direction, but these efforts have not yet succeeded.

We will challenge participants with topics relevant to the prediction of system performance, among which:

  • Characterisation of Corpora before Exploitation
  • Characterisation of IR/NLP/Rec Systems before Exploitation
  • Beyond A-B Testing
  • Estimating Performance without A-B Testing
  • Using Simulation to Predict Performance
  • Can Deep Learning replace actual user performance measurement?
  • Predicting Human Experience and Performance
  • Rich Metrics: Moving from Correctness to Usefulness
  • Performance guarantees vs. expected average performance
  • Performance-related axioms and proofs
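To make the "Using Simulation to Predict Performance" topic concrete, the following is a minimal sketch of Monte Carlo performance estimation for a ranked result list. All specifics are assumptions for illustration: a simple position-based click model (the user examines rank i with probability decaying geometrically, and clicks an examined document with probability equal to its relevance), and hypothetical per-rank relevance probabilities; none of this is prescribed by the workshop.

```python
import random

def simulate_sessions(relevance, examine_decay=0.7, n_sessions=10000, seed=42):
    """Estimate expected clicks per session by simulating user sessions
    under a simple position-based click model: rank i is examined with
    probability examine_decay**i, and an examined document is clicked
    with probability equal to its (assumed) relevance."""
    rng = random.Random(seed)
    total_clicks = 0
    for _ in range(n_sessions):
        for i, rel in enumerate(relevance):
            if rng.random() < examine_decay ** i and rng.random() < rel:
                total_clicks += 1
    return total_clicks / n_sessions

# Hypothetical per-rank relevance probabilities for a 5-item ranking.
relevance = [0.9, 0.6, 0.4, 0.2, 0.1]

estimate = simulate_sessions(relevance)
# Analytic expectation under the same model, for comparison.
analytic = sum(0.7 ** i * rel for i, rel in enumerate(relevance))
print(round(estimate, 3), round(analytic, 3))
```

In this toy setting the simulated estimate converges to the closed-form expectation; the point of simulation in practice is that realistic user models and system interactions rarely admit such a closed form, so simulated sessions stand in for costly live A-B tests.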

The result we seek, therefore, is a trans-disciplinary research agenda that draws from the combination of intense focus and the exchange of ideas. This "manifesto" will include steps that can be taken within as well as across disciplines, and we hope it can serve as a roadmap for both researchers and research funders.

Creative Commons BY 3.0 Unported license
Nicola Ferro, Norbert Fuhr, Gregory Grefenstette, and Joseph A. Konstan