Published on Fri Mar 08 2019

Towards Time-Aware Distant Supervision for Relation Extraction

Tianwen Jiang, Sendong Zhao, Jing Liu, Jin-Ge Yao, Ming Liu, Bing Qin, Ting Liu, Chin-Yew Lin

Time-DS is composed of a time series, instance-popularity, and two strategies. Instance-popularity encodes the strong relevance between time and true relation mentions. The two strategies, hard filtering and curriculum learning, are both ways to exploit instance-popularity for better relation extraction.

Abstract

Distant supervision for relation extraction suffers heavily from the wrong-labeling problem. To alleviate this issue in timestamped news data, we take a new factor, time, into consideration and propose a novel time-aware distant supervision framework (Time-DS). Time-DS is composed of a time series, instance-popularity, and two strategies. Instance-popularity encodes the strong relevance between time and true relation mentions, and therefore serves as an effective clue for reducing the noise generated by distant-supervision labeling. The two strategies, hard filtering and curriculum learning, are both ways to exploit instance-popularity for better relation extraction within Time-DS. Curriculum learning is the more sophisticated and flexible of the two: it uses instance-popularity to mitigate the effects of noise and thus yields better relation extraction performance. Experiments on our collected multi-source news corpus show that Time-DS achieves significant improvements for relation extraction.
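
The abstract leaves the two strategies at a high level; the following is a minimal Python sketch of how they could operate on an instance-popularity score. Everything here is an illustrative assumption rather than the paper's formulation: the score is taken to be the per-entity-pair, per-time-bucket mention frequency, and the field names `pair` and `time` and the 0.1 threshold are hypothetical.

```python
from collections import Counter

# Sketch under assumed definitions: an instance is a dict like
# {"pair": ("e1", "e2"), "time": "2017-05", "sentence": "..."}.

def popularity_series(instances):
    """Hypothetical instance-popularity: for each (entity pair, time bucket),
    the fraction of that pair's mentions falling in that bucket."""
    counts = Counter((inst["pair"], inst["time"]) for inst in instances)
    totals = Counter(inst["pair"] for inst in instances)
    return {key: counts[key] / totals[key[0]] for key in counts}

def score(inst, series):
    """Popularity score of a single distantly-labeled instance."""
    return series[(inst["pair"], inst["time"])]

def hard_filter(instances, threshold=0.1):
    """Strategy 1 (hard filtering): drop instances whose popularity
    falls below a threshold, treating them as likely noise."""
    series = popularity_series(instances)
    return [i for i in instances if score(i, series) >= threshold]

def curriculum(instances):
    """Strategy 2 (curriculum learning): order training data from most
    to least popular, so the extractor sees cleaner instances first."""
    series = popularity_series(instances)
    return sorted(instances, key=lambda i: score(i, series), reverse=True)
```

The intuition both strategies share is the same: mentions of an entity pair that cluster around the time the relation was actually newsworthy are more likely to express the true relation, so low-popularity instances are either discarded outright (hard filtering) or deferred to later training stages (curriculum learning).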

Thu May 11 2017
NLP
Learning with Noise: Enhance Distantly Supervised Relation Extraction with Dynamic Transition Matrix
Distant supervision significantly reduces human efforts in building training data for many classification tasks. While promising, this technique often introduces noise to the generated training data, which can severely affect the model performance. In this paper, we take a deep look at the application of distant supervision in relation extraction.
Tue Mar 10 2020
Machine Learning
Hybrid Attention-Based Transformer Block Model for Distant Supervision Relation Extraction
The Transformer block is used as the sentence encoder to capture syntactic information of sentences. Then, a more concise sentence-level attention mechanism is adopted to constitute the bag representation. Experimental results show that the proposed approach can outperform the state-of-the-art algorithms.
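
For readers unfamiliar with sentence-level attention over a bag, here is a minimal numpy sketch of the general mechanism (selective attention in the style of Lin et al., 2016); the relation query vector and the shapes are illustrative assumptions, not this paper's exact parameterization.

```python
import numpy as np

def bag_representation(sentence_vecs, relation_query):
    """Weight sentence encodings by their affinity to a relation query
    and sum them into a single bag vector."""
    logits = sentence_vecs @ relation_query       # (n_sentences,)
    weights = np.exp(logits - logits.max())
    weights /= weights.sum()                      # softmax attention weights
    return weights @ sentence_vecs                # (hidden_dim,)

# Example: a bag of 3 sentences, each already encoded to a 4-d vector
# (here by the assumed Transformer-block sentence encoder).
bag = np.random.randn(3, 4)
query = np.random.randn(4)
print(bag_representation(bag, query))
```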
Sat Oct 24 2020
Machine Learning
Effective Distant Supervision for Temporal Relation Extraction
A principal barrier to training temporal relation extraction models is the lack of varied, high-quality examples. We present a method of automatically collecting distantly-supervised examples of temporal relations. We scrape and automatically label event pairs where the temporal relations are made explicit.
Wed Oct 21 2020
NLP
KnowDis: Knowledge Enhanced Data Augmentation for Event Causality Detection via Distant Supervision
Modern models of event causality detection (ECD) are mainly based on supervised learning from small hand-labeled corpora. We investigate a data augmentation framework for ECD, dubbed Knowledge Enhanced Distant Data Augmentation (KnowDis).
Sun Nov 08 2020
NLP
Denoising Relation Extraction from Document-level Distant Supervision
Distant supervision (DS) has been widely used to generate auto-labeled data for sentence-level relation extraction (RE). We propose a novel pre-trained model for document-level RE (DocRE), which denoises document-level DS data via multiple pre-training tasks.
Fri May 21 2021
NLP
Revisiting the Negative Data of Distantly Supervised Relation Extraction
Distant supervision automatically generates plenty of training samples for relation extraction. However, it also incurs two major problems: noisy labels and imbalanced training data. Previous works focus more on reducing wrongly labeled relations (false positives), while few explore the missing relations caused by incompleteness.