Published on Fri May 21 2021

Semantic Representation for Dialogue Modeling

Xuefeng Bai, Yulong Chen, Linfeng Song, Yue Zhang

Abstract

Although neural models have achieved competitive results in dialogue systems, they have shown limited ability to represent core semantics, for example ignoring important entities. To this end, we exploit Abstract Meaning Representation (AMR) to help dialogue modeling. Compared with the textual input, AMR explicitly provides core semantic knowledge and reduces data sparsity. We develop an algorithm to construct dialogue-level AMR graphs from sentence-level AMRs and explore two ways to incorporate AMRs into dialogue systems. Experimental results on both dialogue understanding and response generation tasks show the superiority of our model. To our knowledge, we are the first to incorporate a formal semantic representation into neural dialogue modeling.
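
The dialogue-level graph construction can be pictured concretely. Below is a minimal sketch of one way to merge per-utterance AMR graphs under a shared root, assuming networkx graphs whose root concept is stored under graph["root"]; the paper's actual algorithm additionally adds speaker and cross-utterance relation edges, which are omitted here.

import networkx as nx

def build_dialogue_amr(utterance_amrs):
    """Merge sentence-level AMRs into one dialogue-level graph (sketch)."""
    dialogue = nx.DiGraph()
    dialogue.add_node("dialogue-root")  # hypothetical global root node
    for i, amr in enumerate(utterance_amrs):
        # Prefix node ids so identical concepts from different utterances stay distinct.
        prefixed = nx.relabel_nodes(amr, {n: f"u{i}:{n}" for n in amr.nodes})
        dialogue = nx.compose(dialogue, prefixed)
        # Link the global root to each utterance's AMR root.
        dialogue.add_edge("dialogue-root", f"u{i}:{amr.graph['root']}", label=":utterance")
    return dialogue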

Related Papers

Wed Oct 02 2019
NLP
Hierarchical Multi-Task Natural Language Understanding for Cross-domain Conversational AI: HERMIT NLU
We present a new neural architecture for wide-coverage Natural Language Understanding in Spoken Dialogue Systems. The architecture is a hierarchy of self-attention mechanisms and BiLSTM encoders followed by CRF tagging layers.
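
As a rough illustration of that hierarchy, here is a minimal PyTorch sketch with a BiLSTM encoder followed by self-attention; a linear emission layer stands in for the CRF tagging layer (a real CRF decoder is omitted for brevity), and all sizes are illustrative.

import torch

class HermitLikeTagger(torch.nn.Module):
    def __init__(self, vocab_size, hidden=128, num_tags=10):
        super().__init__()
        self.embed = torch.nn.Embedding(vocab_size, hidden)
        self.bilstm = torch.nn.LSTM(hidden, hidden // 2,
                                    bidirectional=True, batch_first=True)
        self.attn = torch.nn.MultiheadAttention(hidden, num_heads=4,
                                                batch_first=True)
        self.emit = torch.nn.Linear(hidden, num_tags)  # a CRF would decode over these scores

    def forward(self, token_ids):
        h, _ = self.bilstm(self.embed(token_ids))  # contextual BiLSTM states
        a, _ = self.attn(h, h, h)                  # self-attention over those states
        return self.emit(a)                        # per-token tag scores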
Thu Sep 17 2020
Artificial Intelligence
Structured Attention for Unsupervised Dialogue Structure Induction
Sat May 12 2018
NLP
Coarse-to-Fine Decoding for Neural Semantic Parsing
Semantic parsing aims at mapping natural language utterances into structured representations. We propose a structure-aware neural architecture which decomposes the semantic parsing process into two stages.
Mon Apr 24 2017
NLP
Learning Symmetric Collaborative Dialogue Agents with Dynamic Knowledge Graph Embeddings
We study a symmetric collaborative dialogue setting in which two agents must strategically communicate to achieve a common goal. The open-ended dialogue state in this setting poses new challenges for existing dialogue systems. We propose a neural model with dynamic knowledge graph embeddings that evolve as the dialogue progresses.
Mon May 10 2021
NLP
Recent Advances in Deep Learning Based Dialogue Systems: A Systematic Survey
Dialogue systems are a popular topic in Natural Language Processing (NLP). In this survey, we focus on deep learning-based dialogue systems. We comprehensively review state-of-the-art research outcomes in dialogue systems and analyze them from two angles: model type and system type.
Tue Apr 21 2020
NLP
DIET: Lightweight Language Understanding for Dialogue Systems
Large-scale pre-trained language models have shown impressive results on language understanding benchmarks like GLUE and SuperGLUE. We introduce the Dual Intent and Entity Transformer (DIET) architecture, and study the effectiveness of different pre-trained representations on intent and entity prediction.
Mon Jun 12 2017
NLP
Attention Is All You Need
The dominant sequence transduction models are based on complex recurrent or convolutional neural networks in an encoder-decoder configuration. We propose a new simple network architecture, the Transformer, based solely on attention mechanisms. Experiments on two machine translation tasks show these models to be superior in quality while being more parallelizable and requiring significantly less time to train.
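
The mechanism behind "based solely on attention mechanisms" is scaled dot-product attention, Attention(Q, K, V) = softmax(QK^T / sqrt(d_k)) V. A minimal numpy sketch for 2-D inputs:

import numpy as np

def scaled_dot_product_attention(Q, K, V):
    # Q, K: (seq_len, d_k); V: (seq_len, d_v)
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                 # query-key similarities
    scores -= scores.max(axis=-1, keepdims=True)    # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax
    return weights @ V                              # weighted sum of values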
Thu Oct 11 2018
NLP
BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding
BERT is designed to pre-train deep bidirectional representations from unlabeled text. It can be fine-tuned with just one additional output layer to create state-of-the-art models for a wide range of tasks.
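
The "one additional output layer" recipe looks roughly like the sketch below, using the Hugging Face transformers API rather than the paper's original code; the checkpoint name and linear head are illustrative.

import torch
from transformers import BertModel

class BertClassifier(torch.nn.Module):
    """BERT plus a single linear output layer, as the abstract describes."""
    def __init__(self, num_labels):
        super().__init__()
        self.bert = BertModel.from_pretrained("bert-base-uncased")
        self.head = torch.nn.Linear(self.bert.config.hidden_size, num_labels)

    def forward(self, input_ids, attention_mask):
        out = self.bert(input_ids=input_ids, attention_mask=attention_mask)
        return self.head(out.pooler_output)  # pooled [CLS] representation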
Mon Sep 01 2014
NLP
Neural Machine Translation by Jointly Learning to Align and Translate
Neural machine translation is a recently proposed approach to machine translation. Unlike traditional statistical machine translation, neural machine translation aims at building a single neural network that can be tuned to maximize translation performance.
Thu Jul 21 2016
Machine Learning
Layer Normalization
Training state-of-the-art, deep neural networks is computationally expensive. One way to reduce the training time is to normalize the activities of the neurons. A recently introduced technique called batch normalization uses the distribution of the summed input to a neuron over a mini-batch.
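
In contrast to the batch statistics described above, layer normalization computes its statistics over the features of each individual example, so it does not depend on the mini-batch. A minimal numpy sketch (gamma and beta are the learned gain and bias):

import numpy as np

def layer_norm(x, gamma, beta, eps=1e-5):
    # x: (batch, features); statistics are per example, not per mini-batch.
    mean = x.mean(axis=-1, keepdims=True)
    var = x.var(axis=-1, keepdims=True)
    return gamma * (x - mean) / np.sqrt(var + eps) + beta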
Wed Dec 30 2020
NLP
Improving BERT with Syntax-aware Local Attention
Pre-trained Transformer-based neural language models, such as BERT, have achieved remarkable results on varieties of NLP tasks. We propose a syntax-aware local attention, where the attention scopes are restrained based on the distances in the syntactic structure.
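
One plausible reading of "attention scopes restrained based on distances in the syntactic structure" is an additive mask that blocks token pairs beyond a distance threshold in the dependency tree; the distance matrix and threshold below are illustrative, not necessarily the paper's exact scheme.

import numpy as np

def local_attention_scores(scores, tree_distance, max_distance=2):
    # scores: raw attention logits; tree_distance[i, j]: dependency-tree
    # distance between tokens i and j. Pairs beyond the threshold are
    # masked out so they receive ~zero weight after softmax.
    mask = np.where(tree_distance <= max_distance, 0.0, -np.inf)
    return scores + mask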
Fri Jun 19 2015
NLP
A Neural Conversational Model
Conversational modeling is an important task in natural language understanding and machine intelligence. Previous approaches are often restricted to specific domains and require hand-crafted rules. We present a simple approach which uses the recently proposed sequence to sequence framework.