Dagstuhl Seminar 22232

Efficient and Equitable Natural Language Processing in the Age of Deep Learning

June 6 – June 10, 2022


Permalink
Please use the following short URL to link to this page: https://www.dagstuhl.de/22232

Organizers
  • Jesse Dodge (AI2 - Seattle, US)
  • Iryna Gurevych (TU Darmstadt, DE)
  • Roy Schwartz (The Hebrew University of Jerusalem, IL)
  • Emma Strubell (Carnegie Mellon University - Pittsburgh, US)



Summary

For this seminar, we brought together a diverse group of researchers and practitioners in NLP and adjacent fields to develop actionable policies, incentives, and a joint strategy towards more efficient and equitable NLP. This Dagstuhl Seminar covered a range of related topics, which we summarize below.

Efficient NLP models

A key method for mitigating the cost and environmental concerns that motivated this seminar is to make models more efficient. We surveyed the different methods that exist for making NLP technology more efficient, discussed their tradeoffs, prioritized them, and aimed to identify new opportunities to promote efficiency in NLP. During the seminar, we drafted a survey paper summarizing multiple methods for increasing the efficiency of NLP models, which we aim to publish later this year. One such method is sketched below.
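
As a concrete illustration, the following minimal sketch (in Python, assuming PyTorch; the two-layer model is a hypothetical stand-in for a pretrained transformer, not a model discussed at the seminar) shows one widely used efficiency method, post-training dynamic quantization, which stores the weights of linear layers as 8-bit integers to shrink the memory footprint and speed up CPU inference.

import torch
import torch.nn as nn

# Hypothetical stand-in for a pretrained transformer block.
model = nn.Sequential(
    nn.Linear(768, 3072),
    nn.ReLU(),
    nn.Linear(3072, 768),
)
model.eval()

# Convert the weights of all Linear layers to int8; activations are
# quantized on the fly per batch, so no calibration data is required.
quantized = torch.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

with torch.no_grad():
    out = quantized(torch.randn(1, 768))  # same interface as the fp32 model
print(out.shape)  # torch.Size([1, 768])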

Systemic issues

We also addressed systemic issues in the field relating to the reporting of computational budgets in NLP research, and discussed how incentive structures such as the NLP Reproducibility Checklist [1] can motivate researchers throughout the field to improve their reporting. We discussed the survey responses for the reproducibility checklist used at four major NLP conferences, and we plan to release a report of this data. One reporting practice advocated in [1] is sketched below.
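
To make budget-aware reporting concrete, here is a minimal sketch (our own illustration in Python with NumPy, not the checklist's released tooling) of the expected-maximum-performance estimator proposed in [1]: given n validation scores from randomly sampled hyperparameter assignments, it estimates the best score a practitioner should expect with a budget of b trials, making reported results comparable across computational budgets. The scores below are synthetic.

import numpy as np

def expected_max_performance(scores, budget):
    """E[max of `budget` draws, with replacement, from `scores`]."""
    v = np.sort(np.asarray(scores, dtype=float))  # ascending order statistics
    n = len(v)
    # P(max of `budget` draws <= v_(i)) = (i/n)**budget; differencing this
    # CDF over the order statistics gives P(max = v_(i)).
    cdf = (np.arange(1, n + 1) / n) ** budget
    pmf = np.diff(np.concatenate(([0.0], cdf)))
    return float(np.sum(v * pmf))

rng = np.random.default_rng(0)
scores = rng.normal(0.80, 0.03, size=20)  # 20 synthetic validation accuracies
for b in (1, 5, 10, 20):
    print(f"budget {b:2d}: expected best accuracy {expected_max_performance(scores, b):.3f}")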

Equity of access

A third topic of discussion was equity of access to computational resources and to state-of-the-art NLP technologies. Prior to the seminar, we conducted a survey of different stakeholders across the NLP community. During the seminar, we analyzed and discussed the results of this survey to better understand who is most affected and how, and we developed informed strategies and policies for mitigating this inequity moving forward. We are currently working on a paper summarizing the results of this survey, which we hope to publish later this year.

Measuring efficiency and equity

All of the above endeavors require establishing the right metrics and standards to measure our current status and our progress towards efficiency and equity goals. We discussed multiple metrics and evaluation frameworks that capture the bigger picture of how different approaches compare in terms of energy efficiency: not just in the research environment but in practice, over the entire ML model lifecycle (development, training, and deployment), and under a wide range of computational budgets. A sketch of one measurement ingredient follows.
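
One practical ingredient for such metrics is direct measurement of the energy use and emissions of a run. The sketch below assumes the open-source codecarbon package and uses a toy stand-in for a real training workload; the numbers it reports are hardware- and grid-dependent estimates, not exact measurements.

import torch
import torch.nn as nn
from codecarbon import EmissionsTracker

model = nn.Linear(512, 512)                       # stand-in training workload
opt = torch.optim.SGD(model.parameters(), lr=0.1)

tracker = EmissionsTracker(project_name="nlp-efficiency-demo")
tracker.start()
for _ in range(1000):                             # toy "training" loop
    x = torch.randn(64, 512)
    loss = model(x).pow(2).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()
emissions_kg = tracker.stop()                     # estimated kg of CO2-equivalent
print(f"estimated emissions: {emissions_kg:.6f} kg CO2eq")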

References

  1. Jesse Dodge, Suchin Gururangan, Dallas Card, Roy Schwartz, Noah A. Smith: Show Your Work: Improved Reporting of Experimental Results. EMNLP/IJCNLP (1) 2019: 2185-2194
Copyright Roy Schwartz, Jesse Dodge, Iryna Gurevych, and Emma Strubell

Motivation

Since 2012, the field of artificial intelligence (AI) has reported remarkable progress on a broad range of capabilities, including object recognition, game playing, speech recognition, and machine translation. Much of this progress has been achieved by increasingly large and computationally intensive deep learning models: training costs for state-of-the-art deep learning models increased 300,000-fold between 2012 and 2018 [1]. Perhaps the epitome of this trend is the subfield of natural language processing (NLP), which over the past three years has experienced even sharper growth in model size and corresponding computational requirements in the word embedding approaches (e.g., ELMo, BERT, GPT-2, Megatron-LM, T5, and GPT-3, which at 175B dense parameters is one of the largest models ever trained) that are now the basic building blocks of nearly all NLP models. Recent studies indicate that this trend is both environmentally unfriendly and prohibitively expensive, raising barriers to participation in NLP research [2, 3]. The goal of this seminar is to mitigate these concerns and to promote equity of access in NLP. We plan to bring together a diverse group of researchers and practitioners in NLP and adjacent fields, and to develop actionable policies, incentives, and a joint strategy towards these goals.

This Dagstuhl Seminar will cover a range of topics relating to efficiency in NLP. A key method for mitigating the concerns raised above is reducing costs by making models more efficient. We will survey the different methods that exist for making NLP technology more efficient. We will discuss their tradeoffs, prioritize them, and aim to identify new opportunities to promote efficiency in NLP.

We will also address systemic issues in the field relating to the reporting of computational budgets in NLP research, and how we can use incentive structures such as the NLP Reproducibility Checklist to motivate researchers throughout the field to improve reporting. We have privileged access to the survey responses for the reproducibility checklist used at four major NLP conferences, and we will release a report of this data, in addition to planning the future direction of the checklist.

A third topic of discussion will be equity of access to computational resources and to state-of-the-art NLP technologies. Prior to the seminar, we will conduct a survey of different stakeholders across the NLP community. During the seminar, we will analyze and discuss the results of this survey to better understand who is most affected and how, and we will develop informed strategies and policies to mitigate this inequity moving forward.

All of the above endeavors require establishing the right metrics and standards to measure our current status and our progress towards efficiency and equity goals. We will devise metrics and evaluation frameworks that capture the bigger picture of how different approaches compare in terms of energy efficiency: not just in the research environment but in practice, over the entire ML model lifecycle (development, training, and deployment), and under a wide range of computational budgets.

The results of this seminar may include joint research publications, e.g., on the efficiency of NLP models; the conceptualization of an efficiency benchmark and its corresponding evaluation; and actionable policies informed by analysis of survey data.

  1. D. Amodei and D. Hernandez. 2018. AI and Compute. https://openai.com/blog/ai-and-compute/
  2. R. Schwartz, J. Dodge, N. A. Smith, and O. Etzioni. 2020. Green AI. Communications of the ACM (CACM).
  3. E. Strubell, A. Ganesh, and A. McCallum. 2019. Energy and Policy Considerations for Deep Learning in NLP. In Proc. of ACL.

Copyright Jesse Dodge, Iryna Gurevych, Roy Schwartz, and Emma Strubell

Participants
  • Yuki Arase (Osaka University, JP)
  • Niranjan Balasubramanian (Stony Brook University, US)
  • Leon Derczynski (IT University of Copenhagen, DK)
  • Jesse Dodge (AI2 - Seattle, US)
  • Jessica Forde (Brown University - Providence, US)
  • Jonathan Frankle (Harvard University - Allston, US)
  • Iryna Gurevych (TU Darmstadt, DE)
  • Michael Hassid (The Hebrew University of Jerusalem, IL)
  • Kenneth Heafield (University of Edinburgh, GB)
  • Sara Hooker (Google - Mountain View, US)
  • Alexander Koller (Universität des Saarlandes, DE)
  • Ji-Ung Lee (TU Darmstadt, DE)
  • Alexander Löser (Berliner Hochschule für Technik, DE)
  • Alexandra Sasha Luccioni (Hugging Face - Paris, FR)
  • André F. T. Martins (IST - Lisbon, PT)
  • Haritz Puerto (TU Darmstadt, DE)
  • Colin Raffel (University of North Carolina at Chapel Hill, US)
  • Nils Reimers (Hugging Face - Paris, FR)
  • Leonardo Ribeiro (TU Darmstadt, DE)
  • Anna Rogers (University of Copenhagen, DK)
  • Andreas Rücklé (Amazon - Berlin, DE)
  • Roy Schwartz (The Hebrew University of Jerusalem, IL)
  • Edwin Simpson (University of Bristol, GB)
  • Noam Slonim (IBM - Haifa, IL)
  • Noah A. Smith (University of Washington - Seattle, US)
  • Emma Strubell (Carnegie Mellon University - Pittsburgh, US)
  • Betty van Aken (Berliner Hochschule für Technik, DE)
  • Thomas Wolf (Hugging Face - Paris, FR)

Classification
  • Artificial Intelligence
  • Computation and Language
  • Machine Learning

Keywords
  • Natural language processing (NLP)
  • efficiency
  • equity
  • deep learning