List of Past Events
You Shall Know a Logical Form by the Company it Keeps (talk recording available)
Dr. Katrin Erk
Tuesday, April 22, 2014, 01:00pm - 02:00pm
University of Texas at Austin, Department of Linguistics
Logic-based semantics provides precise, structured characterizations of the meaning of natural language sentences, including the handling of quantifiers, negation, and embedded propositions. It has been out of favor in computational linguistics for a while for being too deep and "brittle", but with more complex tasks like Textual Entailment it is attracting interest again. Distributional models represent the meaning of words and phrases through the contexts in which they have been observed. They offer wide-coverage lexical representations, and they can express different degrees of similarity -- both capabilities that logic-based approaches have traditionally lacked and that are sorely needed. But distributional models still have trouble capturing the structure of sentences. We combine the two frameworks by turning distributional similarity sim(A, B) into weighted inference rules A -> B and performing probabilistic inference over the resulting weighted formulas with Markov Logic Networks.
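As a rough illustration of that idea (not the speaker's actual system), the short Python sketch below converts a similarity score into a weighted first-order rule in the text format commonly used by Markov Logic Network engines. The function name and the log-odds weight mapping are assumptions made for the example; the talk does not specify how weights are derived.

    import math

    def similarity_to_mln_rule(word_a: str, word_b: str, sim: float) -> str:
        """Turn a distributional similarity score sim(A, B) in (0, 1)
        into a weighted MLN inference rule A(x) -> B(x).

        The log-odds weight mapping is an illustrative choice, not
        necessarily the one used in the system described in the talk.
        """
        assert 0.0 < sim < 1.0, "similarity must be strictly between 0 and 1"
        weight = math.log(sim / (1.0 - sim))  # log-odds of the similarity score
        return f"{weight:.3f}  {word_a}(x) -> {word_b}(x)"

    # Example: a high similarity between "car" and "vehicle" yields a
    # strongly weighted entailment rule for the MLN inference engine.
    print(similarity_to_mln_rule("car", "vehicle", 0.85))
    # 1.735  car(x) -> vehicle(x)

A hard logical rule would make the entailment absolute; attaching a finite weight lets the MLN treat it as soft evidence that can be outweighed by other formulas during inference.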
The combination of logical form and distributional semantics leads to interesting practical problems on both the logical and the distributional side. It also leads to theoretical questions. In the distributional tradition, the meaning of a word is given by its use. In logic-based semantics, the notions of truth, entailment, and reference are central. What does it mean to combine the two? I will argue that distributional models are connected to reference after all if we view distributional similarity ratings as soft meaning postulates.
To view a recording of this talk, click here (you will need a Rutgers NetID and password).