Published on Tue Mar 14 2017

Encoding Sentences with Graph Convolutional Networks for Semantic Role Labeling

Diego Marcheggiani, Ivan Titov

Abstract

Semantic role labeling (SRL) is the task of identifying the predicate-argument structure of a sentence. It is typically regarded as an important step in the standard NLP pipeline. As the semantic representations are closely related to syntactic ones, we exploit syntactic information in our model. We propose a version of graph convolutional networks (GCNs), a recent class of neural networks operating on graphs, suited to model syntactic dependency graphs. GCNs over syntactic dependency trees are used as sentence encoders, producing latent feature representations of words in a sentence. We observe that GCN layers are complementary to LSTM ones: when we stack both GCN and LSTM layers, we obtain a substantial improvement over an already state-of-the-art LSTM SRL model, resulting in the best reported scores on the standard benchmark (CoNLL-2009) both for Chinese and English.
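To make the stacked encoder described above concrete, below is a minimal sketch of one syntactic GCN layer over a dependency graph, written in PyTorch (a framework assumed here; the abstract does not prescribe one). This is not the authors' released implementation: edge labels are omitted, the edge gates are simplified to depend only on the source word, and all names (SyntacticGCNLayer, adj_in, adj_out) are illustrative.

import torch
import torch.nn as nn

class SyntacticGCNLayer(nn.Module):
    """One graph-convolution step over a dependency tree.

    Each word aggregates gated messages from its syntactic head and
    dependents, plus a self-loop, followed by a ReLU. This is a
    simplified sketch: edge labels and label-specific biases are left out.
    """

    def __init__(self, dim):
        super().__init__()
        self.w_in = nn.Linear(dim, dim)    # messages along head -> dependent arcs
        self.w_out = nn.Linear(dim, dim)   # messages along dependent -> head arcs
        self.w_self = nn.Linear(dim, dim)  # self-loop transformation
        self.gate_in = nn.Linear(dim, 1)   # scalar gate per message
        self.gate_out = nn.Linear(dim, 1)
        self.gate_self = nn.Linear(dim, 1)

    def forward(self, h, adj_in, adj_out):
        # h: (batch, seq_len, dim) word states, e.g. the output of a BiLSTM
        # adj_in[b, v, u] = 1 if there is a dependency arc u -> v
        # adj_out is the same adjacency seen in the reverse direction
        msg_in = torch.sigmoid(self.gate_in(h)) * self.w_in(h)
        msg_out = torch.sigmoid(self.gate_out(h)) * self.w_out(h)
        msg_self = torch.sigmoid(self.gate_self(h)) * self.w_self(h)
        # Sum each word's incoming messages over the dependency graph.
        agg = adj_in @ msg_in + adj_out @ msg_out + msg_self
        return torch.relu(agg)

# Toy usage: batch of 2 sentences, 5 words each, 8-dimensional states.
layer = SyntacticGCNLayer(dim=8)
h = torch.randn(2, 5, 8)
adj_in = torch.zeros(2, 5, 5)        # fill with 1s from a dependency parse
adj_out = adj_in.transpose(1, 2)
out = layer(h, adj_in, adj_out)      # (2, 5, 8)

Stacking one or two such layers on top of a BiLSTM encoder mirrors the complementarity noted in the abstract: the LSTM captures sequential context, while the GCN layer injects information from a word's syntactic neighbourhood.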

Mon Apr 23 2018
NLP
Exploiting Semantics in Neural Machine Translation with Graph Convolutional Networks
We are the first to incorporate information about predicate-argument structure of source sentences into neural machine translation. We use Graph Convolutional Networks (GCNs) to inject a semantic bias into sentence encoders.
Sat Apr 22 2017
NLP
Deep Multitask Learning for Semantic Dependency Parsing
We present a deep neural architecture that parses sentences into three semantic dependency graph formalisms. By using efficient, nearly arc-factored inference and a bidirectional LSTM, our system is able to significantly improve the state of the art.
Sat Sep 21 2019
Machine Learning
Graph Convolutions over Constituent Trees for Syntax-Aware Semantic Role Labeling
Most semantic-role formalisms are built upon constituent syntax and only syntactic constituents can be labeled as arguments. All the recent work on syntax-aware SRL relies on dependency representations of syntax. In contrast, we show how graph convolutional networks (GCNs) can be used
Sat Sep 12 2020
NLP
Syntax Role for Neural Semantic Role Labeling
Semantic role labeling (SRL) is dedicated to recognizing the semantic predicate-argument structure of a sentence. Previous studies have shown that syntactic information can make remarkable contributions to SRL performance. This paper intends to quantify the importance of syntactic information for neural SRL in the deep learning
Sat Apr 15 2017
NLP
Graph Convolutional Encoders for Syntax-aware Neural Machine Translation
We present a simple and effective approach to incorporating syntactic structure into neural attention-based encoder-decoder models for machine translation. We rely on graph-convolutional networks (GCNs), a recent class of neural networks developed for modeling graph-structured data. GCNs use
Thu Nov 26 2020
Machine Learning
Encoding Syntactic Constituency Paths for Frame-Semantic Parsing with Graph Convolutional Networks
We study the problem of integrating syntactic information from constituency trees into a neural model in Frame-semantic parsing sub-tasks. We use a Graph Convolutional Network to learn specific representations of constituents, such that each constituent is profiled as the production grammar rule it corresponds