Published on Mon Apr 05 2021

Intent Detection and Slot Filling for Vietnamese

Mai Hoang Dao, Thinh Hung Truong, Dat Quoc Nguyen

Intent detection and slot filling are important tasks in spoken and natural language understanding, but Vietnamese is a low-resource language for both. We present the first public intent detection and slot filling dataset for Vietnamese. We also propose a joint model for intent detection and slot filling.

Abstract

Intent detection and slot filling are important tasks in spoken and natural language understanding. However, Vietnamese is a low-resource language for these tasks. In this paper, we present the first public intent detection and slot filling dataset for Vietnamese. We also propose a joint model for intent detection and slot filling that extends the recent state-of-the-art JointBERT+CRF model with an intent-slot attention layer, explicitly incorporating intent context information into slot filling via a "soft" intent label embedding. Experimental results on our Vietnamese dataset show that the proposed model significantly outperforms JointBERT+CRF. We publicly release our dataset and the implementation of our model at: https://github.com/VinAIResearch/JointIDSF
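The "soft" intent label embedding idea can be sketched in a few lines of PyTorch. This is a minimal illustration, not the released JointIDSF implementation (see the GitHub link for that): the class name, layer sizes, and the plain concatenation used to fuse intent context into token features are all assumptions, and the CRF layer from JointBERT+CRF is omitted for brevity.

import torch
import torch.nn as nn


class IntentSlotAttentionSketch(nn.Module):
    """Fuses a "soft" intent label embedding into token features for slot filling."""

    def __init__(self, hidden_size=768, num_intents=28, num_slots=10):  # illustrative sizes
        super().__init__()
        self.intent_head = nn.Linear(hidden_size, num_intents)
        # One learnable embedding vector per intent label.
        self.intent_label_emb = nn.Embedding(num_intents, hidden_size)
        self.slot_head = nn.Linear(2 * hidden_size, num_slots)

    def forward(self, token_reps, cls_rep):
        # token_reps: (batch, seq_len, hidden) contextual token vectors from the encoder
        # cls_rep:    (batch, hidden) pooled sentence vector, e.g. [CLS]
        intent_logits = self.intent_head(cls_rep)                    # (batch, num_intents)
        intent_probs = intent_logits.softmax(dim=-1)                 # "soft" intent label weights
        # Soft intent label embedding: a probability-weighted mix of label embeddings,
        # so slot filling sees intent context without a hard (possibly wrong) decision.
        intent_ctx = intent_probs @ self.intent_label_emb.weight     # (batch, hidden)
        intent_ctx = intent_ctx.unsqueeze(1).expand_as(token_reps)   # broadcast over tokens
        slot_logits = self.slot_head(torch.cat([token_reps, intent_ctx], dim=-1))
        return intent_logits, slot_logits                            # (batch, seq_len, num_slots)


# Toy usage with random features standing in for encoder outputs:
model = IntentSlotAttentionSketch()
tokens, cls = torch.randn(2, 16, 768), torch.randn(2, 768)
intent_logits, slot_logits = model(tokens, cls)

In the full model, an attention layer rather than this plain concatenation performs the fusion, and a CRF sits on top of the slot logits, as in JointBERT+CRF.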

Sun Jun 30 2019
Artificial Intelligence
A Novel Bi-directional Interrelated Model for Joint Intent Detection and Slot Filling
A spoken language understanding (SLU) system includes two main tasks, slot filling and intent detection. Joint models for the two tasks are becoming a trend in SLU, but existing joint models do not establish bi-directional interrelated connections between intents and slots.
Thu Apr 30 2020
NLP
Enriched Pre-trained Transformers for Joint Slot Filling and Intent Detection
Detecting the user's intent and finding the corresponding slots among the words are important tasks in natural language understanding. Recently, advances in pre-trained language models such as ELMo and BERT have revolutionized the field.
Thu Feb 28 2019
NLP
BERT for Joint Intent Classification and Slot Filling
Intent classification and slot filling are two essential tasks for natural language understanding. They often suffer from small-scale human-labeled training data, resulting in poor generalization capability. Recently a new language representation model, BERT, has been developed.
Thu Aug 26 2021
Artificial Intelligence
SLIM: Explicit Slot-Intent Mapping with BERT for Joint Multi-Intent Detection and Slot Filling
Utterance-level intent detection and token-level slot filling are two key tasks for natural language understanding (NLU). Most approaches assume that only a single intent exists in an utterance. In this paper, we propose a multi-intent NLU framework, called SLIM.
Thu Oct 08 2020
NLP
A Co-Interactive Transformer for Joint Slot Filling and Intent Detection
Intent detection and slot filling are two main tasks for building a spoken language understanding system. Previous studies either model the two tasks separately or only consider the single information flow from intent to slot. In this paper, we propose a co-interactive module to consider the cross-impact between the two tasks.
Fri Jul 05 2019
Machine Learning
Multi-lingual Intent Detection and Slot Filling in a Joint BERT-based Model
Intent Detection and Slot Filling are two pillar tasks in Spoken Natural Language Understanding. Common approaches adopt joint Deep Learning architectures in attention-based recurrent frameworks. We aim at exploiting the success of "recurrence-less" models for these tasks.
Thu Oct 11 2018
NLP
BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding
BERT is designed to pre-train deep bidirectional representations from unlabeled text. It can be fine-tuned with just one additional output layer to create state-of-the-art models for a wide range of tasks.
Thu Apr 08 2021
NLP
COVID-19 Named Entity Recognition for Vietnamese
The current COVID-19 pandemic has led to the creation of many corpora. Most of these corpora are exclusively for English. As the pandemic is a global problem, it is worth creating datasets for languages other than English.
Tue Nov 05 2019
NLP
Unsupervised Cross-lingual Representation Learning at Scale
This paper shows that pretraining multilingual language models at scale leads to significant performance gains for a wide range of cross-lingual transfer tasks. We train a Transformer-based masked language model on one hundred languages, using more than two terabytes of filtered CommonCrawl data.
Mon Oct 05 2020
Artificial Intelligence
A Pilot Study of Text-to-SQL Semantic Parsing for Vietnamese
Semantic parsing is an important NLP task. Vietnamese is a low-resource language in this research area. We present the first public large-scale Text-to-SQL semantic parsing dataset for Vietnamese.
Fri Jul 26 2019
NLP
RoBERTa: A Robustly Optimized BERT Pretraining Approach
We present a replication study of BERT pretraining. We find that BERT was significantly undertrained. Our best model achieves state-of-the-art results on GLUE, RACE and SQuAD. We release our models and code.