Please join us for another NLP Seminar at 11:00 am in Soda 380 on Tuesday, Oct 8.

Speaker: Alexander Rush (Cornell)

Title: Revisiting Grammar Induction

Abstract:

Deep learning for NLP has become synonymous with global models trained with unlimited data. These models are incredible; however, they seem unlikely to tell us much about the way they (or language) work. Less heralded have been the ways in which deep methods have helped with inference in classical factored models. In this talk, I revisit the problem of grammar induction, an important benchmark task in NLP, using a variety of variational methods. Recent work shows that these methods greatly improve the performance of unsupervised learning. I argue that these approaches can be used in conjunction with global models to provide control in modern systems.

Biography:

Alexander Sasha Rush is an Associate Professor at Cornell Tech. His group’s research is at the intersection of natural language processing, deep learning, and structured prediction, with applications in machine translation, summarization, and text generation. He also supports open-source development, including the OpenNMT project. His work has received several paper and demo awards at major NLP and visualization conferences, an NSF CAREER Award, and faculty awards. He is currently the general chair of ICLR.