[PAST EVENT] What Aspects of Meaning are Missing from Current Natural Language Understanding Systems?
Abstract: Natural Language Understanding (NLU) has progressed greatly in the last several years, with models surpassing human performance on widely used benchmarks such as SuperGLUE. In this talk, I highlight two aspects of meaning that are still missing from our NLU systems despite their impressive success, and propose ways to move forward. The first is systematic understanding: through a semantic parsing task, I show that systems are biased towards assigning the meaning they have already encountered during training, even when the structure of the sentence indicates otherwise. The second is understanding of non-asserted meaning: in the context of question answering, I demonstrate that current systems are unable to adequately handle questions containing failed presuppositions (backgrounded assumptions that need to be satisfied), such as "Which linguist invented the lightbulb?"
Bio: Najoung Kim is a PhD student in the Department of Cognitive Science at Johns Hopkins University, advised by Paul Smolensky and Kyle Rawlins. Her research focuses on diagnosing current natural language understanding systems in order to propose targeted solutions, leveraging insights from linguistics and cognitive science. In particular, she studies the understanding of implicit meaning and systematic generalization. Najoung's dissertation work is supported by the NSF, and she has interned at Google and IBM Research.
Host: Prof. Zhenming Liu