Published on Tue Sep 08 2020

Simple is Better! Lightweight Data Augmentation for Low Resource Slot Filling and Intent Classification

Samuel Louvan, Bernardo Magnini

Abstract

Neural-based models have achieved outstanding performance on slot filling and intent classification, when fairly large in-domain training data are available. However, as new domains are frequently added, creating sizeable data is expensive. We show that lightweight augmentation, a set of augmentation methods involving word span and sentence level operations, alleviates data scarcity problems. Our experiments on limited data settings show that lightweight augmentation yields significant performance improvement on slot filling on the ATIS and SNIPS datasets, and achieves competitive performance with respect to more complex, state-of-the-art, augmentation approaches. Furthermore, lightweight augmentation is also beneficial when combined with pre-trained LM-based models, as it improves BERT-based joint intent and slot filling models.
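The abstract does not spell out the augmentation operations, but a common word-span-level operation for slot filling is slot-value substitution: swapping a labeled span for another value of the same slot type while keeping the BIO tags consistent. The sketch below is an illustration of that general idea, not the paper's released method; the function name, the toy slot lexicon, and the ATIS-style example are all hypothetical.

```python
import random

def slot_substitute(tokens, tags, slot_values, rng=None):
    """Replace each slot span with an alternative value of the same slot type.

    tokens, tags: parallel lists in BIO format, e.g.
        tokens = ["flights", "to", "boston"], tags = ["O", "O", "B-toloc"]
    slot_values: dict mapping slot type -> list of alternative values,
        where each value is itself a list of tokens.
    """
    rng = rng or random.Random()
    out_tokens, out_tags = [], []
    i = 0
    while i < len(tokens):
        tag = tags[i]
        if tag.startswith("B-"):
            slot = tag[2:]
            # Collect the full span of this slot (B- followed by I- tags).
            j = i + 1
            while j < len(tokens) and tags[j] == "I-" + slot:
                j += 1
            # Pick a replacement value; fall back to the original span
            # if this slot type has no alternatives in the lexicon.
            replacement = rng.choice(slot_values.get(slot, [tokens[i:j]]))
            out_tokens.extend(replacement)
            out_tags.extend(["B-" + slot] + ["I-" + slot] * (len(replacement) - 1))
            i = j
        else:
            out_tokens.append(tokens[i])
            out_tags.append(tag)
            i += 1
    return out_tokens, out_tags

# Hypothetical ATIS-style utterance and a toy slot lexicon.
tokens = ["show", "flights", "to", "boston"]
tags = ["O", "O", "O", "B-toloc"]
values = {"toloc": [["new", "york"], ["denver"]]}
aug_tokens, aug_tags = slot_substitute(tokens, tags, values, random.Random(0))
```

Because substitution preserves the tag structure, each generated utterance remains a valid training example; in low-resource settings this multiplies the effective number of slot annotations without new labeling effort.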

Thu Apr 30 2020
NLP
Enriched Pre-trained Transformers for Joint Slot Filling and Intent Detection
Detecting the user's intent and finding the corresponding slots among the words are important tasks in natural language understanding. Recently, advances in pre-trained language models such as ELMo and BERT have revolutionized the field.
Thu Feb 28 2019
NLP
BERT for Joint Intent Classification and Slot Filling
Intent classification and slot filling are two essential tasks for natural language understanding. They often suffer from small-scale human-labeled training data, resulting in poor generalization capability. Recently a new language representation model, BERT, has been developed.
Mon Apr 05 2021
NLP
Intent Detection and Slot Filling for Vietnamese
Intent detection and slot filling are important tasks in spoken and natural language understanding. Vietnamese is a low-resource language in these research topics. We present the first public intent detection and slot filling dataset for Vietnamese. We also propose a joint model for intent detection and slot filling.
Thu Aug 26 2021
Artificial Intelligence
SLIM: Explicit Slot-Intent Mapping with BERT for Joint Multi-Intent Detection and Slot Filling
Utterance-level intent detection and token-level slot filling are two key tasks for natural language understanding (NLU). Most approaches assume that only a single intent exists in an utterance. In this paper, we propose a multi-intent NLU framework, called SLIM.
Tue Sep 06 2016
NLP
Attention-Based Recurrent Neural Network Models for Joint Intent Detection and Slot Filling
Attention-based encoder-decoder neural network models have recently shown promising results in machine translation and speech recognition. We propose an attention-based neural network model for joint intent detection and slot filling, both of which are critical steps for many speech understanding systems.
Wed Jan 20 2021
NLP
A survey of joint intent detection and slot-filling models in natural language understanding
Joint models for intent classification and slot filling have achieved state-of-the-art performance. This article is a compilation of past work in natural language understanding. We describe trends, approaches, issues, data sets and evaluation metrics.