Published on Tue Jan 29 2019

Universal Dependency Parsing from Scratch

Peng Qi, Timothy Dozat, Yuhao Zhang, Christopher D. Manning

This paper describes Stanford's system at the CoNLL 2018 UD Shared Task. The system takes raw text as input, and performs all tasks required by the shared task.

Abstract

This paper describes Stanford's system at the CoNLL 2018 UD Shared Task. We introduce a complete neural pipeline system that takes raw text as input, and performs all tasks required by the shared task, ranging from tokenization and sentence segmentation, to POS tagging and dependency parsing. Our single system submission achieved very competitive performance on big treebanks. Moreover, after fixing an unfortunate bug, our corrected system would have placed 2nd, 1st, and 3rd on the official evaluation metrics LAS, MLAS, and BLEX, and would have outperformed all submitted systems on low-resource treebank categories on all metrics by a large margin. We further show the effectiveness of different model components through extensive ablation studies.
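The shared task the abstract describes is built around the CoNLL-U format: each token is a tab-separated row whose columns include the word form, universal POS tag, head index, and dependency relation. As a minimal sketch of that format (the sentence below is a hand-written illustration, not output from the Stanford system):

```python
def parse_conllu(text):
    """Parse one CoNLL-U sentence into (id, form, upos, head, deprel) tuples."""
    rows = []
    for line in text.strip().splitlines():
        if line.startswith("#"):
            continue  # skip sentence-level comment lines like "# text = ..."
        cols = line.split("\t")
        # CoNLL-U columns: ID FORM LEMMA UPOS XPOS FEATS HEAD DEPREL DEPS MISC
        rows.append((int(cols[0]), cols[1], cols[3], int(cols[6]), cols[7]))
    return rows

# A tiny hand-annotated example; HEAD = 0 marks the root of the tree.
sentence = """\
# text = Parsers read text .
1\tParsers\tparser\tNOUN\t_\t_\t2\tnsubj\t_\t_
2\tread\tread\tVERB\t_\t_\t0\troot\t_\t_
3\ttext\ttext\tNOUN\t_\t_\t2\tobj\t_\t_
4\t.\t.\tPUNCT\t_\t_\t2\tpunct\t_\t_
"""

for tok_id, form, upos, head, deprel in parse_conllu(sentence):
    print(tok_id, form, upos, head, deprel)
```

A full pipeline like the one in the paper produces exactly such rows from raw text, with tokenization and sentence segmentation deciding the rows, the tagger filling the UPOS column, and the parser filling HEAD and DEPREL.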

Thu Sep 06 2018
NLP
82 Treebanks, 34 Models: Universal Dependency Parsing with Multi-Treebank Models
We present the Uppsala system for the CoNLL 2018 Shared Task on universal dependency parsing. On the official test run, we ranked 7th of 27 teams for the LAS and MLAS metrics. Our system obtained the best scores overall for word segmentation, universal POS tagging, …
Wed Mar 15 2017
NLP
SyntaxNet Models for the CoNLL 2017 Shared Task
We describe a baseline dependency parsing system for the CoNLL 2017 Shared Task. This system, which we call "ParseySaurus," uses the DRAGNN framework. It combines transition-based recurrent parsing and tagging with character-based word representations.
Mon Sep 12 2016
NLP
Read, Tag, and Parse All at Once, or Fully-neural Dependency Parsing
We present a dependency parser implemented as a single deep neural network. It reads orthographic representations of words and directly generates dependencies and their labels. Unlike typical approaches to parsing, the model doesn't require part-of-speech (POS) tagging.
Mon Mar 23 2015
NLP
Yara Parser: A Fast and Accurate Dependency Parser
Dependency parsers are among the most crucial tools in natural language processing. Yara can parse about 4,000 sentences per second in greedy mode (1 beam). When optimizing for accuracy (using 64 beams and Brown cluster features), it can parse about 45 sentences per second.
Mon Mar 16 2020
NLP
Stanza: A Python Natural Language Processing Toolkit for Many Human Languages
Wed Sep 02 2020
Artificial Intelligence
A Practical Chinese Dependency Parser Based on A Large-scale Dataset
Dependency parsing is a longstanding natural language processing task. Recently, neural-network-based dependency parsing has achieved significant progress. Baidu Chinese Treebank (DuCTB) consists of about one million annotated sentences.