Dagstuhl Seminar 24052

Reviewer No. 2: Old and New Problems in Peer Review

(Jan 28 – Feb 02, 2024)

Permalink
Please use the following short URL to reference this page: https://www.dagstuhl.de/24052

Organizers

Contact

Dagstuhl Reports

As part of the mandatory documentation, participants are asked to submit their talk abstracts, working group results, etc. for publication in our series Dagstuhl Reports via the Dagstuhl Reports Submission System.

  • Upload (Use personal credentials as created in DOOR to log in)

Shared Documents

Schedule

Motivation

Peer review is the best mechanism we have so far for assessing the scientific validity of new research. But this mechanism has many well-known issues, such as the differing incentives of authors and reviewers, difficulties with preserving reviewer and author anonymity, and confirmation and other cognitive biases that even researchers fall prey to. These intrinsic problems are exacerbated in interdisciplinary fields like Natural Language Processing (NLP) and Machine Learning (ML), where groups of researchers may vary so much in their methodology, terminology, and research agendas that they sometimes have trouble even recognizing each other's contributions as "research".

This Dagstuhl Seminar will cover a range of topics related to the organization of peer review in NLP and ML, including the following:

  • Improving paper-reviewer matching through processes and algorithms that take into account both topic match and reviewer interest in a given research question (see the sketch after this list)
  • Peer review vs. methodological and demographic diversity in interdisciplinary fields
  • Better practices for designing peer-review policies
  • Improving the structural incentives for reviewers
  • Use of NLP and ML for suitable automation of (parts of) the paper reviewing process
  • Peer-reviewing and research integrity

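To make the first topic above concrete, here is a minimal, purely illustrative sketch of paper-reviewer matching that blends a bag-of-words topic similarity with reviewer bids and then assigns papers greedily under a per-reviewer load cap. The weights, the greedy strategy, and all names (affinity, assign, the example papers and reviewers) are assumptions made for this sketch, not the algorithm of this seminar or of any actual conference.

```python
# Toy paper-reviewer matching: topic similarity + reviewer interest (bids).
# Everything here is a simplified, hypothetical illustration.
from collections import Counter
from itertools import product
import math


def bow(text: str) -> Counter:
    """Bag-of-words vector for a short text."""
    return Counter(text.lower().split())


def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two bag-of-words vectors."""
    dot = sum(a[w] * b[w] for w in a if w in b)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0


def affinity(paper: str, profile: str, bid: float, w_topic: float = 0.7, w_bid: float = 0.3) -> float:
    """Blend topic match (paper abstract vs. reviewer profile) with the reviewer's bid in [0, 1]."""
    return w_topic * cosine(bow(paper), bow(profile)) + w_bid * bid


def assign(papers, reviewers, bids, per_reviewer_load=2, reviews_per_paper=1):
    """Greedy assignment: repeatedly take the highest-affinity (paper, reviewer) pair
    that exceeds neither the reviewer's load cap nor the paper's review quota."""
    scored = sorted(
        ((affinity(p_txt, r_txt, bids.get((p, r), 0.0)), p, r)
         for (p, p_txt), (r, r_txt) in product(papers.items(), reviewers.items())),
        reverse=True,
    )
    load = {r: 0 for r in reviewers}
    need = {p: reviews_per_paper for p in papers}
    assignment = {p: [] for p in papers}
    for _, p, r in scored:
        if need[p] > 0 and load[r] < per_reviewer_load:
            assignment[p].append(r)
            need[p] -= 1
            load[r] += 1
    return assignment


if __name__ == "__main__":
    papers = {"P1": "neural machine translation for low resource languages",
              "P2": "a statistical analysis of bias in peer review"}
    reviewers = {"R1": "machine translation multilingual models",
                 "R2": "peer review incentives meta science"}
    bids = {("P1", "R1"): 1.0, ("P2", "R2"): 0.8}  # hypothetical reviewer bids
    print(assign(papers, reviewers, bids))
```

Real conference systems typically replace the greedy loop with an optimization-based assignment (e.g., linear programming) and use learned embeddings instead of bags of words, but the core idea of combining a topic score with an interest signal is the same.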
The seminar will serve as a point of reflection on the participants' decades of experience in organizing different kinds of peer-reviewed venues, enabling an in-depth discussion of what has been tried, what seems to work, and what does not. It will also take into account the fast-improving capabilities of NLP/ML systems. Outcomes of the seminar may include joint research publications on the methodological challenges of peer review, on NLP and ML for intelligent support of peer reviewing, and actionable proposals, informed by the participants' experience as researchers as well as in roles such as chairs, editors, conference organizers, reviewers, and authors.

Copyright Iryna Gurevych, Anna Rogers, and Nihar B. Shah

Participants

Classification
  • Artificial Intelligence
  • Computation and Language
  • Machine Learning

Keywords
  • peer review
  • diversity
  • natural language processing
  • incentives