Semantic parsing, the task of mapping sentences to logical-form meaning representations, has recently received significant attention. We propose a grammar induction technique for AMR semantic parsing. While previous grammar induction techniques were designed to re-learn a new parser for each target application, the recently annotated AMR Bank provides a unique opportunity to induce a single model for understanding broad-coverage newswire text and supporting a wide range of applications. We present a new model that combines CCG parsing, to recover compositional aspects of meaning, with a factor graph that models non-compositional phenomena, such as anaphoric dependencies. Our approach achieves a 66.2 Smatch F1 score on the AMR Bank, significantly outperforming the previous state of the art. This talk is an extended version of our EMNLP 2015 presentation.
Yoav Artzi is an Assistant Professor in the Department of Computer Science at Cornell University and Cornell Tech. His research interests lie at the intersection of natural language processing and machine learning. In particular, he focuses on designing latent-variable learning algorithms that recover rich representations of linguistic meaning for situated natural language understanding. He received the best paper award at EMNLP 2015, his B.Sc. summa cum laude from Tel Aviv University, and his Ph.D. from the University of Washington.