Reading Group

  1. Networks with attention
    Attention plays an important role in human life, but how can machines be endowed with such a capability? Two main motivations behind an attention mechanism are higher network capacity and speed. This talk discusses a few publications, grouped into three categories, that attempt to couple attention with deep architectures.
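    The core idea shared by such mechanisms can be sketched as a softmax-weighted average: a query is compared against keys, and the resulting weights select which values the network "attends" to. This is a minimal illustrative sketch (not taken from any of the discussed papers); the function name and toy data are assumptions.

```python
import numpy as np

def attention(query, keys, values):
    # Scaled dot-product attention: weight each value by how well
    # its key matches the query, via a softmax over similarity scores.
    scores = keys @ query / np.sqrt(query.shape[-1])
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    return weights @ values

keys = np.eye(3)
values = np.array([[1.0, 0.0], [0.0, 1.0], [0.5, 0.5]])
query = np.array([10.0, 0.0, 0.0])  # strongly matches the first key
print(attention(query, keys, values))  # close to [1.0, 0.0]
```

    Because the weights are a differentiable function of the query, such a module can be trained end-to-end inside a deep network.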

  2. Semantic parsing via paraphrasing
    The holy grail of NLP is language understanding by machines. But how should meaning be represented? Semantic parsers represent it in a pre-defined formal language that machines can execute directly (e.g. SQL), and then learn to map textual questions to formulas that retrieve the answer from a knowledge base. Although this originally required an expensive corpus of textual questions paired with their formulas, this talk discusses a very recent approach that trains a semantic parser solely from textual question-answer pairs.
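    The weak-supervision idea behind training from question-answer pairs can be sketched in a few lines: because formulas are executable against the knowledge base, a candidate formula can be scored simply by checking whether its execution yields the observed answer. This toy sketch is purely illustrative (the knowledge base, relation names, and helper functions are all hypothetical, not from the discussed paper).

```python
# Toy knowledge base: (relation, entity) -> answer (hypothetical data).
kb = {("capital_of", "France"): "Paris", ("capital_of", "Japan"): "Tokyo"}

def execute(formula):
    # A formula here is just a (relation, entity) pair, executed by lookup.
    return kb.get(formula)

def consistent_formulas(question, answer):
    # Weak supervision: enumerate candidate formulas and keep those whose
    # execution matches the observed answer and whose entity appears in
    # the question. A real parser would score candidates with a model.
    return [f for f in kb if execute(f) == answer and f[1] in question]

print(consistent_formulas("What is the capital of France?", "Paris"))
# → [('capital_of', 'France')]
```

    The surviving candidates serve as (noisy) training targets, which is how annotation of full formulas can be avoided.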