Agenda

Expeditions in Experiential AI Series | On the Dangers of Stochastic Parrots: Can Language Models Be Too Big?
September 29

A seminar in the Institute for Experiential AI's Expeditions in Experiential AI and Distinguished Lecturer Series, organized by Dr. Ricardo Baeza-Yates

ABSTRACT

In this presentation, Bender and her co-authors take stock of the recent trend towards ever larger language models (especially for English), which the field of natural language processing has been using to extend the state of the art on a wide array of tasks as measured by leaderboards on specific benchmarks. The authors take a step back and ask: How big is too big? What are the possible risks associated with this technology and what paths are available for mitigating those risks?

SPEAKER

Emily M. Bender is an American linguist who works on multilingual grammar engineering, technology for endangered language documentation, computational semantics, and methodologies for supporting consideration of the impacts of language technology in NLP research, development, and education. She is the Howard and Frances Nostrand Endowed Professor of Linguistics at the University of Washington. Her work includes the LinGO Grammar Matrix, an open-source starter kit for the development of broad-coverage precision HPSG grammars; data statements for natural language processing, a set of practices for documenting essential information about the characteristics of datasets; and two books that make key linguistic principles accessible to NLP practitioners: Linguistic Fundamentals for Natural Language Processing: 100 Essentials from Morphology and Syntax (2013) and Linguistic Fundamentals for Natural Language Processing II: 100 Essentials from Semantics and Pragmatics (2019, with Alex Lascarides).

WHEN AND WHERE

September 29 | 1:00pm-2:00pm EDT (14:00 Santiago, Chile)

Registration: https://stochastic-parrots.splashthat.com/