Speaker: Hanna Hajishirzi, University of Washington
Title: Open Language Model (OLMo): The Science of Language Models and Language Models for Science
Date: Friday, November 17, 2023, 1:30 - 2:30 PM EST (North American Eastern Standard Time), via Zoom. On-campus attendees will gather in CS 151 to view the presentation.
Over the past few years, and especially since the deployment of ChatGPT in November 2022, neural language models with billions of parameters, trained on trillions of words, have been powering the fastest-growing computing applications in history and generating discussion and debate across society. However, AI scientists cannot study or improve these state-of-the-art models because the models' parameters, training data, code, and even documentation are not openly available. In this talk, I present our OLMo project, which aims to build strong language models and make them fully open to researchers, along with open-source code for data management, training, inference, and interaction. In particular, I describe Dolma, a 3T-token open dataset curated for training language models; Tulu, our instruction-tuned language model; and OLMo v1, a fully open 7B-parameter language model.
Bio: Hanna Hajishirzi is a Torode Family Associate Professor at UW CSE and a Senior Director at AI2. Her research spans different areas in NLP and AI, specifically understanding and advancing large language models. Her honors include the NSF CAREER Award, a Sloan Fellowship, an Allen Distinguished Investigator Award, the Intel Rising Star Award, a UIUC Alumni Award, multiple best paper and honorable mention awards, and several industry research faculty awards. Hanna received her PhD from the University of Illinois and spent a year as a postdoc at Disney Research and CMU.
Zoom Link: Subscribe to the mailing list (details above) for Zoom link/passcode notifications, or click here for the Zoom link and contact Hamed Zamani for the passcode.