Published on Tue Nov 28 2017

Hybrid Oracle: Making Use of Ambiguity in Transition-based Chinese Dependency Parsing

Xuancheng Ren, Xu Sun

Abstract

In the training of transition-based dependency parsers, an oracle is used to predict a transition sequence for a sentence and its gold tree. However, the transition system may exhibit ambiguity; that is, there can be multiple correct transition sequences that form the gold tree. We propose to make use of this property in the training of neural dependency parsers, and present the Hybrid Oracle. The new oracle gives all the correct transitions for a parsing state, which are used in the cross-entropy loss function to provide a better supervisory signal. It is also used to generate different transition sequences for a sentence, to better explore the training data and improve the generalization ability of the parser. Evaluations show that parsers trained with the hybrid oracle outperform parsers trained with the traditional oracle in Chinese dependency parsing. We provide analysis from a linguistic view. The code is available at https://github.com/lancopku/nndep .
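The abstract's core idea, training against the set of all correct transitions rather than a single one, can be sketched as a loss function. The sketch below is one plausible formulation, not the authors' actual implementation: the model's probability mass on every oracle-approved transition is summed before taking the negative log, so the loss is zero only when all mass lands on correct transitions.

```python
import math

def hybrid_oracle_loss(scores, valid_transitions):
    """Cross-entropy against a *set* of correct transitions.

    scores: unnormalized model scores, one per candidate transition.
    valid_transitions: indices of all transitions the hybrid oracle
    marks as correct for the current parsing state.
    Illustrative sketch only; names and formulation are assumptions.
    """
    # Numerically stable softmax over transition scores.
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    probs = [e / total for e in exps]
    # Sum the probability mass assigned to all correct transitions,
    # then take the negative log of that combined mass.
    p_correct = sum(probs[i] for i in valid_transitions)
    return -math.log(p_correct)

# With two of three transitions correct, the loss penalizes only the
# mass on the single wrong transition.
loss = hybrid_oracle_loss([2.0, 2.0, -1.0], [0, 1])
```

A traditional oracle corresponds to `valid_transitions` always being a singleton, which reduces this to ordinary cross-entropy; the hybrid oracle simply widens the target set when the transition system is ambiguous.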

Wed Mar 20 2019
NLP
Left-to-Right Dependency Parsing with Pointer Networks
We propose a novel transition-based algorithm that straightforwardly parses sentences from left to right. We use the pointer network framework that, given a word, can directly point to a position from the sentence. This results in a quadratic non-projective parser that
Tue Aug 28 2018
NLP
Universal Dependency Parsing with a General Transition-Based DAG Parser
TUPA is a general neural transition-based DAG parser. TUPA was designed for parsing UCCA, a cross-linguistic semantic annotation scheme. By converting UD trees and graphs to a unified DAG format, we train TUPA almost without modification.
Wed Jan 22 2020
Artificial Intelligence
Transition-Based Dependency Parsing using Perceptron Learner
Syntactic parsing using dependency structures has become a standard technique. We tackle transition-based dependency parsing using a Perceptron Learner. Our proposed model, which adds more relevant features, outperforms a baseline arc-standard parser.
Mon Jul 18 2016
NLP
Dependency Language Models for Transition-based Dependency Parsing
We present an approach to improve the accuracy of a strong transition-based dependency parser by exploiting dependency language models. We gain a large absolute improvement of one point (UAS) on Chinese and 0.5 points on English.
Fri Apr 22 2016
Neural Networks
Dependency Parsing with LSTMs: An Empirical Evaluation
We propose a transition-based dependency parser using Recurrent Neural Networks with Long Short-Term Memory (LSTM) units. This extends the feedforward neural network parser of Chen and Manning (2014) and enables modelling of sequences of shift/reduce transition decisions.
Wed Aug 30 2017
NLP
Fast(er) Exact Decoding and Global Training for Transition-Based Dependency Parsing via a Minimal Feature Set
We present a minimal feature set for transition-based dependency parsing. We plug our feature set into the dynamic-programming framework of Huang and Sagae (2010) and Kuhlmann et al. (2011) to produce the first implementation of O(n^3) exact