Published on Mon Sep 02 2019

Modeling Long-Range Context for Concurrent Dialogue Acts Recognition

Yue Yu, Siyao Peng, Grace Hui Yang

In dialogues, an utterance is a chain of consecutive sentences. Multiple dialogue acts (DA) for one utterance breed complex dependencies across turns. DA recognition challenges a model's predictive power over long utterances and complex DA context.

Abstract

In dialogues, an utterance is a chain of consecutive sentences produced by one speaker, ranging from a short sentence to a thousand-word post. When studying dialogues at the utterance level, it is not uncommon for an utterance to serve multiple functions. For instance, "Thank you. It works great." expresses both gratitude and positive feedback in the same utterance. Multiple dialogue acts (DA) for one utterance breed complex dependencies across dialogue turns. DA recognition therefore challenges a model's predictive power over long utterances and complex DA context. We term this problem Concurrent Dialogue Acts (CDA) recognition. Previous work on DA recognition either assumes one DA per utterance or fails to capture the sequential nature of dialogues. In this paper, we present an adapted Convolutional Recurrent Neural Network (CRNN) which models the interactions between utterances across long-range context. Our model significantly outperforms existing work on CDA recognition on a tech forum dataset.
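The architecture described above (a CNN utterance encoder feeding an RNN over the dialogue, with independent per-act sigmoid outputs for the multi-label CDA setting) can be illustrated with a minimal toy sketch. This is not the authors' implementation; all sizes and weights below are hypothetical and randomly initialized purely to show the data flow, assuming pre-computed word embeddings as input.

```python
import numpy as np

rng = np.random.default_rng(0)

EMB, FILTERS, HIDDEN, N_ACTS = 16, 8, 12, 5  # toy sizes (assumptions)

# Hypothetical parameters; a real model would learn these by training.
W_conv = rng.standard_normal((3 * EMB, FILTERS)) * 0.1       # width-3 conv filters
W_cell = rng.standard_normal((FILTERS + HIDDEN, HIDDEN)) * 0.1  # simple RNN cell
W_out = rng.standard_normal((HIDDEN, N_ACTS)) * 0.1          # multi-label head

def encode_utterance(word_vecs):
    """CNN encoder: width-3 convolution over word embeddings + max-over-time pooling."""
    T = len(word_vecs)
    windows = [np.concatenate(word_vecs[i:i + 3]) for i in range(T - 2)]
    feats = np.maximum(0.0, np.stack(windows) @ W_conv)  # ReLU activations
    return feats.max(axis=0)  # one fixed-size vector per utterance

def recognize_dialogue(utterances):
    """RNN over utterance vectors; the hidden state carries long-range context.
    A sigmoid per dialogue act allows several acts to fire for one utterance."""
    h = np.zeros(HIDDEN)
    probs = []
    for u in utterances:
        v = encode_utterance(u)
        h = np.tanh(np.concatenate([v, h]) @ W_cell)
        probs.append(1.0 / (1.0 + np.exp(-(h @ W_out))))  # independent DA probabilities
    return probs

# A toy 3-utterance dialogue with 6, 4, and 9 words respectively.
dialogue = [[rng.standard_normal(EMB) for _ in range(n)] for n in (6, 4, 9)]
preds = recognize_dialogue(dialogue)
```

The key contrast with single-label DA models is the output layer: a softmax would force exactly one act per utterance, while the per-act sigmoids let an utterance like "Thank you. It works great." score high on both gratitude and positive feedback.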

Mon Oct 22 2018
NLP
A Dual-Attention Hierarchical Recurrent Neural Network for Dialogue Act Classification
Recognising dialogue acts (DA) is important for many natural language processing tasks such as dialogue generation and intention recognition. We propose a dual-attention hierarchical recurrent neural network for DA classification.
Thu Apr 04 2019
NLP
Dialogue Act Classification with Context-Aware Self-Attention
We leverage the effectiveness of a context-aware self-attention mechanism coupled with a hierarchical recurrent neural network. We show a significant improvement over state-of-the-art results on the Switchboard Corpus. We also investigate the impact of different utterance-level representation learning methods.
Fri Aug 30 2019
NLP
Modeling Multi-Action Policy for Task-Oriented Dialogues
Longer conversations lead to more errors, so the system needs to be more robust in handling them. A novel policy model is proposed based on a recurrent cell called gated Continue-Act-Slots.
Thu May 28 2020
NLP
Contextual Dialogue Act Classification for Open-Domain Conversational Agents
Classifying the general intent of the user utterance in a conversation, also known as Dialogue Act (DA), is a key step in Natural Language Understanding. DA classification has been extensively studied in human-human conversations, but it has not been sufficiently explored for open-domain automated conversational agents.
Sat Mar 12 2016
Neural Networks
Sequential Short-Text Classification with Recurrent and Convolutional Neural Networks
Recent approaches based on artificial neural networks have shown promising results for short-text classification. Most existing ANN-based systems do not leverage the preceding short texts when classifying a subsequent one. In this work, we present a model based on recurrent neural networks and convolutional neural networks that incorporates the preceding short texts.
Sat Jun 15 2013
NLP
Recurrent Convolutional Neural Networks for Discourse Compositionality
We introduce a sentence model and a discourse model corresponding to the two levels of compositionality. The sentence model adopts convolution as the central operation for composing semantic vectors. The discourse model is based on a recurrent neural network that is conditioned in a novel way on the current sentence and the current speaker.