https://www.dagstuhl.de/9716

April 14 – 18, 1997, Dagstuhl Seminar 9716

Evaluation of Multimedia Information Retrieval

Organizers

N. Fuhr (Dortmund), K. v. Rijsbergen (Glasgow), A. F. Smeaton (Dublin)

Information on this Dagstuhl Seminar is provided by

Dagstuhl Service Team

Documents

Dagstuhl-Seminar-Report 175

Motivation

Information retrieval (IR), and IR tasks like information filtering and categorisation, have a long tradition of implementation and empirical evaluation based on attempting to computationally replicate users’ relevance judgements. This usually involves implementing indexing and retrieval on a test collection of documents and running a set of queries with known relevance judgements against this data. Information retrieval functionality has been evaluated quantitatively by computing how well a system performs against these known relevant documents, measuring this performance in terms of precision and recall. This classical approach to evaluation assumes a test design that resembles retrieval in batch mode. Performing experiments in artificial and synthetic environments, away from real users and real user interactions, has been the dominant mode of operation in empirical information retrieval research for decades.
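
As a concrete illustration of this classical set-based evaluation, the following sketch computes precision and recall for a single query against known relevance judgements. It is a minimal, hypothetical example: the document identifiers and example sets are placeholders, not data from any particular test collection.

```python
# Minimal sketch of classical set-based IR evaluation: precision and
# recall of a query's retrieved documents against known relevance
# judgements. All document identifiers here are hypothetical.

def precision_recall(retrieved, relevant):
    """precision = |retrieved & relevant| / |retrieved|;
    recall = |retrieved & relevant| / |relevant|."""
    hits = len(retrieved & relevant)
    precision = hits / len(retrieved) if retrieved else 0.0
    recall = hits / len(relevant) if relevant else 0.0
    return precision, recall

# Known relevant documents for one query (the relevance judgements),
# and the documents a system actually retrieved for that query.
relevant = {"d1", "d3", "d7", "d9"}
retrieved = {"d1", "d2", "d3", "d4"}

p, r = precision_recall(retrieved, relevant)
print(f"precision = {p:.2f}, recall = {r:.2f}")  # precision = 0.50, recall = 0.50
```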

Recently, information retrieval has expanded its scope to include, among other things, indexing and retrieval of non-text (multimedia) documents based on external descriptors (captions) or on properties of the raw data itself. In addition, documents themselves are no longer homogeneous in size, structure, organisation or style. As computing power increases, more computationally intensive techniques have been applied, based on natural language processing, neural networks, machine learning, user modelling, intelligent and adaptive interfaces, new HCI paradigms, etc. All this means that evaluating the performance and effectiveness of an information retrieval application in terms of its precision and recall, as done traditionally and in benchmarking exercises such as TREC (the DARPA-sponsored Text Retrieval Conference, in which many of our participants take part), is becoming dated and unsuitable given the changed nature of the information itself. Additionally, retrieval is now highly interactive, and with multimedia documents, interactivity plays an even more important role.

There is a clear gap in the field if information retrieval is to remain healthy and able to cope with the change in the nature of information and of the user-system interaction. Yet the most important criterion for IR theories and IR research will always be the retrieval effectiveness of implementations, which can be validated only empirically; unless the evaluation of modern information retrieval can handle this, there is a fundamental deficiency in the field.

The purpose of the Dagstuhl Seminar on "Evaluation of Multimedia Information Retrieval" was to address the unsuitability of the traditional evaluation metrics by bringing together researchers from different but overlapping and closely related fields: empirical IR, HCI and user modelling. It is only by bringing such people together, in one place and at one time for an extended workshop, that we can hope to achieve, with consensus, the breakthrough needed to start evaluating multimedia information retrieval as it should be done.

The format of our meeting was to divide each session into a “lead presentation” followed by a set of shorter, focussed presentations. This was then followed either by a set of group meetings of 5–6 people per group or by a plenary discussion. We also had a number of “boiling pot” sessions whose themes for discussion emerged during the week. This loose format worked well for a Dagstuhl Seminar like ours, where we tended to ask more questions than we provided real answers. The abstracts of the presentations made during the week are included in this booklet. The real impact of the seminar will be felt in time.

Documentation

All Dagstuhl Seminars and Dagstuhl Perspectives Workshops are documented in the Dagstuhl Reports series. Together with the collector of the seminar, the organizers compile a report that summarizes the authors' contributions and complements them with an overall summary.


Download the overview flyer (PDF).

Dagstuhl's Impact

Please let us know if a publication arises from your seminar. Such publications are listed separately in the Dagstuhl's Impact section and displayed on the ground floor of the library.

Publications

It also remains possible to publish a comprehensive collection of peer-reviewed papers in the Dagstuhl Follow-Ups series.