Please join us for another NLP Seminar at 4:00pm in 202 South Hall on Sept 30th.

Speaker: Jinfeng Rao (Facebook)

Title: Structure-Aware Learning and Decoding for Neural NLG in Task-Oriented Dialog

Abstract:

Generating fluent natural language responses from structured semantic representations is a critical step in task-oriented conversational systems. Previous work primarily uses Seq2Seq models over flat meaning representations (MRs), e.g., in the E2E NLG Challenge, which offer little control over the generated text. We propose a tree-structured MR that enables better discourse-level structuring and sentence-level planning, along with a constrained decoding method and a tree-to-sequence model that incorporate structural constraints into learning and decoding. Our experiments show that both approaches improve semantic correctness, and that combining them achieves the best performance.

I will also briefly discuss my recent work on bridging the gap between relevance matching and semantic matching for short-text similarity modeling.

Biography:

Jinfeng Rao is currently a research scientist at Facebook Conversational AI. Before that, he was a visiting researcher at Stanford University. He obtained his PhD from the University of Maryland, College Park, advised by Prof. Jimmy Lin. Jinfeng’s research interests lie at the intersection of natural language processing, information retrieval, and deep learning. At Facebook, he focuses on building and shipping a world-class NLG system in Assistant. He has published more than 20 papers at major NLP/ML conferences, including ACL, EMNLP, and KDD. His past research helped Comcast build its XFINITY voice search system, where his multi-task system processed billions of voice queries from more than 20 million voice remotes in 2019. His work also helped Comcast win the 69th Emmy Award (2017) for technical contributions to advancing television technology.

(Slides, available to those with a @berkeley.edu email)