Published on Thu Dec 18 2014

Incorporating Both Distributional and Relational Semantics in Word Representations

Daniel Fried, Kevin Duh

Abstract

We investigate the hypothesis that word representations ought to incorporate both distributional and relational semantics. To this end, we employ the Alternating Direction Method of Multipliers (ADMM), which flexibly optimizes a distributional objective on raw text and a relational objective on WordNet. Preliminary results on knowledge base completion, analogy tests, and parsing show that word representations trained on both objectives can give improvements in some cases.
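The abstract describes using ADMM to jointly optimize two objectives over word representations. As a rough illustration of the ADMM machinery (not the paper's actual method), the sketch below solves a toy consensus problem: two simple quadratic objectives stand in for the distributional and relational objectives, with two copies of the variables tied together by a dual variable and a penalty parameter `rho`. All names and the quadratic objectives are hypothetical stand-ins.

```python
import numpy as np

def admm_consensus(a, b, rho=1.0, iters=100):
    """Toy ADMM: minimize f(w) + g(v) subject to w = v,
    with stand-in objectives f(w) = ||w - a||^2 / 2 and
    g(v) = ||v - b||^2 / 2 (the paper's real objectives are
    a distributional one on text and a relational one on WordNet)."""
    w = np.zeros_like(a)  # "distributional" copy of the embedding
    v = np.zeros_like(b)  # "relational" copy of the embedding
    y = np.zeros_like(a)  # dual variable enforcing w = v
    for _ in range(iters):
        # w-update: argmin_w f(w) + y.(w - v) + (rho/2)||w - v||^2
        w = (a - y + rho * v) / (1.0 + rho)
        # v-update: argmin_v g(v) - y.(w - v) + (rho/2)||w - v||^2
        v = (b + y + rho * w) / (1.0 + rho)
        # dual ascent on the consensus residual
        y = y + rho * (w - v)
    return w, v

a = np.array([1.0, 2.0])
b = np.array([3.0, 0.0])
w, v = admm_consensus(a, b)
# For these quadratics the consensus solution is the average (a + b) / 2
print(np.round(w, 3), np.round(v, 3))
```

The alternating structure is the point: each step optimizes one objective while the penalty and dual terms keep the two copies of the representation in agreement, which is what lets ADMM flexibly balance heterogeneous objectives.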