https://www.dagstuhl.de/22172
April 24–27, 2022, Dagstuhl Seminar 22172
Technologies to Support Critical Thinking in an Age of Misinformation
Organizers
Andreas Dengel (DFKI – Kaiserslautern, DE)
Laurence Devillers (CNRS – Orsay, FR & Sorbonne University – Paris, FR)
Tilman Dingler (The University of Melbourne, AU)
Koichi Kise (Osaka Prefecture University, JP)
Benjamin Tag (The University of Melbourne, AU)
Contacts for this Dagstuhl Seminar
Simone Schilke for administrative matters
Andreas Dolzmann for scientific matters
Dagstuhl Reports
We ask participants to help us with the necessary documentation and to submit abstracts of their talks, results from working groups, etc. for publication in our Dagstuhl Reports series via our
Dagstuhl Reports Submission System.
Documents
List of Participants
Shared Documents
Dagstuhl Seminar Wiki
Dagstuhl Seminar Schedule [pdf]
Motivation
Misinformation and fake news abound on the Internet. Characterised as factually incorrect information that is intentionally manipulated to deceive the receiver, misinformation often challenges our ability to tell fake from truth. New technology has eased the distribution of misinformation and enabled governments, organisations, and individuals to influence public opinion. Technology, however, also offers governments and organisations new avenues for detecting and correcting false information.
The very same technologies that are used to collect large amounts of personal information and target users' cognitive vulnerabilities also offer intelligent solutions to the problem of misinformation. Pattern recognition and Natural Language Processing have made fact-checking applications and spam filters more accurate and reliable. Machine Learning, big data, and context-aware computing systems can be used to detect misinformation in situ and provide cognitive security. Today, such self-learning systems protect users and prevent misinformation from finding fertile ground. Researchers and practitioners in Human-Computer Interaction are at the forefront of designing and developing user-facing computing systems. Consequently, we bear a special responsibility for working on solutions that mitigate the problems arising from misinformation and bias-enforcing interfaces.
This Dagstuhl Seminar aims to bring together designers, developers, practitioners, and thinkers across disciplines to discuss and devise solutions in the form of technologies and applications that instil and nurture critical thinking in their users. With a focus on misinformation, we will explore users' vulnerabilities in order to discuss and design solutions that keep users safe from manipulation, i.e., provide cognitive security. Over three days, an esteemed selection of about 30 participants will engage with the problem of misinformation and rethink the incentive structures and mechanisms of social computing systems, with particular regard to news media and how people encounter and process misinformation. By looking at systems, users, and applications from an interdisciplinary perspective, we aim to produce a research agenda and blueprints for systems that provide transparency, contribute to advancing technology and media literacy, build critical thinking skills, and depolarise by design.
Motivation text licensed under Creative Commons BY 4.0
Andreas Dengel, Laurence Devillers, Tilman Dingler, Koichi Kise, and Benjamin Tag
Classification
- Human-Computer Interaction
- Other Computer Science
- Social And Information Networks
Keywords
- Cognitive Security
- Misinformation
- Bias Computing