Dagstuhl-Seminar 24211

Evaluation Perspectives of Recommender Systems: Driving Research and Education

(May 20 – May 24, 2024)

Permalink
Please use the following short URL to link to this page: https://www.dagstuhl.de/24211

Organizers

Contact

Dagstuhl Seminar Wiki

Shared Documents

Program
  • Upload (use the personal credentials created in DOOR to log in)

Motivation

Evaluation is an important cornerstone in the process of researching, developing, and deploying recommender systems. This Dagstuhl Seminar aims to shed light on the different and potentially diverging or contradictory perspectives on the evaluation of recommender systems. Building on the discussions and outcomes of the PERSPECTIVES workshop series held at ACM RecSys 2021-2023, the seminar will bring together academia and industry to critically reflect on the state of the evaluation of recommender systems and create a setting for development and growth.

While recommender systems research is largely applied, the evaluation of recommender systems builds on and intersects theories from information retrieval, machine learning, and human-computer interaction. Historically, the theories and evaluation approaches in these fields differ considerably, and thoroughly evaluating recommender systems requires integrating all of these perspectives. Hence, this seminar will bring together experts from these fields and serve as a vehicle for discussing and developing the state of the art and practice of evaluating recommender systems. The seminar will set the ground for developing recommender systems evaluation metrics, methods, and practices through collaborations and discussions between participants from diverse backgrounds, e.g., academic and industry researchers and industry practitioners, both senior and junior. We emphasize the importance of obtaining and keeping the big picture of a recommender system's performance in its context of use, for which it is essential to incorporate both the technical and the human element.
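As one concrete illustration of the metrics perspective mentioned above, offline accuracy measures such as precision@k score a ranked recommendation list against items the user is known to have found relevant. The following minimal Python sketch (with a hypothetical helper function and made-up item IDs, not material from the seminar) shows the idea; the user-centric perspective from human-computer interaction requires complementary methods such as user studies.

    from typing import Iterable, Sequence

    def precision_at_k(recommended: Sequence[str], relevant: Iterable[str], k: int = 10) -> float:
        """Fraction of the top-k recommended items that are relevant to the user."""
        if k <= 0:
            return 0.0
        top_k = recommended[:k]
        relevant_set = set(relevant)
        hits = sum(1 for item in top_k if item in relevant_set)
        return hits / k

    # Illustrative usage with made-up item IDs:
    recommended = ["item42", "item7", "item13", "item99", "item3"]
    relevant = {"item7", "item3", "item55"}
    print(precision_at_k(recommended, relevant, k=5))  # 0.4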

We will lay the foundation for the next generation of researchers, equipped to thoroughly evaluate and advance recommender systems.

Copyright Christine Bauer, Alan Said, and Eva Zangerle

Participants

Classification
  • Human-Computer Interaction
  • Information Retrieval
  • Machine Learning

Keywords
  • Recommender Systems
  • Evaluation
  • Information Retrieval
  • User Interaction
  • Intelligent Systems