Seminars

Alona Fyshe (University of Victoria) “Corpora, Cognition and Composition” @ Hackerman Hall B17
Feb 9 @ 12:00 pm – 1:15 pm

Abstract:

Reading, understanding, and combining words to create meaningful phrases comes naturally to most people. Still, the processes that govern semantic composition in the human brain are not well understood. In this talk, I will explore semantics (word meaning) and semantic composition (combining the meanings of multiple words) using two data sources: a large text corpus and brain recordings of people reading adjective-noun phrases. Through the learning of latent representations, I will show that these two very different data sources are both consistent (they contain overlapping information) and complementary (they contain non-overlapping, but still congruent, information). These disparate data sources can be used together to further the study of semantics and semantic composition, either as grounded in the brain or, more abstractly, as represented in patterns of word usage in corpora.

Biography:

Alona Fyshe is an Assistant Professor in the Computer Science Department at the University of Victoria. She received her BSc and MSc in Computing Science from the University of Alberta and her PhD in Machine Learning from Carnegie Mellon University. She uses machine learning to leverage large amounts of text and neuroimaging data to understand how people mentally combine words to create higher-order meaning. Her work focuses on the semantics of adjective-noun phrases, and she is now exploring how composition impacts sentiment.

http://web.uvic.ca/~afyshe/

Richard Socher (MetaMind) @ Hackerman Hall B17
Feb 16 @ 12:00 pm – 1:15 pm
KyungHyun Cho (New York University) “Future (?) of Machine Translation” @ Hackerman Hall B17
Feb 23 @ 12:00 pm – 1:15 pm

Abstract:

It is quite easy to believe that the recently proposed approach to machine translation, called neural machine translation, is simply yet another approach to statistical machine translation. This belief may drive research efforts toward (incrementally) improving existing neural machine translation systems to outperform, or perform comparably to, existing variants of phrase-based systems. In this talk, I aim to convince you otherwise. I argue that neural machine translation is not here to compete against existing translation systems, but to open new opportunities in the field of machine translation. I will discuss three such opportunities: (1) sub-word-level translation, (2) larger-context translation, and (3) multilingual translation.

Biography:

Kyunghyun Cho is an assistant professor of Computer Science and Data Science at New York University (NYU). Previously, he was a postdoctoral researcher at the University of Montreal under the supervision of Prof. Yoshua Bengio, after obtaining his doctorate at Aalto University (Finland) in early 2014. Kyunghyun’s main research interests include neural networks, generative models, and their applications, especially to language understanding.
