Published on Tue Apr 21 2020

Keyphrase Generation with Cross-Document Attention

Shizhe Diao, Yan Song, Tong Zhang

Keyphrase generation aims to produce a set of phrases summarizing the essentials of a given document. In this paper, we propose CDKGen, a Transformer-based keyphrase generator. We also adopt a copy mechanism that enhances the model by selecting appropriate words from documents.

Abstract

Keyphrase generation aims to produce a set of phrases summarizing the essentials of a given document. Conventional methods normally apply an encoder-decoder architecture to generate the output keyphrases for an input document; because they are designed to focus only on the current document, they inevitably omit crucial corpus-level information carried by other similar documents, i.e., cross-document dependencies and latent topics. In this paper, we propose CDKGen, a Transformer-based keyphrase generator, which extends the Transformer to global attention with cross-document attention networks that incorporate available documents as references, so as to generate better keyphrases with the guidance of topic information. On top of the proposed Transformer + cross-document attention architecture, we also adopt a copy mechanism that enhances our model by selecting appropriate words from documents to deal with out-of-vocabulary words in keyphrases. Experimental results on five benchmark datasets illustrate the validity and effectiveness of our model, which achieves state-of-the-art performance on all datasets. Further analyses confirm that the proposed model is able to generate keyphrases consistent with references while maintaining sufficient diversity. The code of CDKGen is available at https://github.com/SVAIGBA/CDKGen.
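The two ingredients named in the abstract, attention over reference documents alongside the current one, and a copy mechanism mixing generation with copying, can be sketched roughly as follows. This is a minimal NumPy illustration with invented toy dimensions, random "encodings", and a fixed generate-vs-copy gate `p_gen`; the actual CDKGen model uses learned Transformer layers and a learned gate.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(q, k, v):
    # scaled dot-product attention; returns context vectors and weights
    w = softmax(q @ k.T / np.sqrt(k.shape[-1]))
    return w @ v, w

rng = np.random.default_rng(0)
d = 4
doc  = rng.normal(size=(5, d))   # encoded tokens of the current document
refs = rng.normal(size=(8, d))   # encoded tokens from similar reference documents
dec  = rng.normal(size=(3, d))   # decoder states for 3 output positions

# the decoder attends to its own document...
ctx_doc, attn_doc = attention(dec, doc, doc)
# ...and, via cross-document attention, to the reference documents
ctx_ref, _ = attention(dec, refs, refs)
ctx = ctx_doc + ctx_ref          # simple additive fusion for this sketch

# copy mechanism: mix a vocabulary distribution with the attention
# distribution scattered onto the source-token ids
vocab_size = 10
src_ids = np.array([1, 3, 5, 3, 7])             # ids of the 5 document tokens
p_vocab = softmax(rng.normal(size=(3, vocab_size)))
p_gen = 0.7                                     # generate-vs-copy gate (learned in practice)
p_copy = np.zeros((3, vocab_size))
for t in range(3):
    np.add.at(p_copy[t], src_ids, attn_doc[t])  # duplicate ids accumulate mass
p_final = p_gen * p_vocab + (1 - p_gen) * p_copy
```

Because both `p_vocab` and each row of `p_copy` sum to one, the mixture `p_final` is itself a valid distribution, and out-of-vocabulary source words can receive probability through the copy term alone.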

Sun Apr 23 2017
NLP
Deep Keyphrase Generation
Keyphrases provide highly condensed information that can be effectively used for understanding, organizing, and retrieving text content. We propose a generative model for keyphrase prediction with an encoder-decoder framework. The model attempts to capture the deep semantic meaning of the text with a deep learning method.
Tue Aug 04 2020
NLP
Select, Extract and Generate: Neural Keyphrase Generation with Layer-wise Coverage Attention
SEG-Net uses Transformer, a self-attentive architecture, as the basic building block. The experimental results on seven keyphrase generation benchmarks from scientific and web documents demonstrate that SEG-Net outperforms the state-of-the-art neural generative methods.
Sun May 19 2019
NLP
DivGraphPointer: A Graph Pointer Network for Extracting Diverse Keyphrases
Keyphrase extraction from documents is useful to a variety of applications. This paper presents an end-to-end method for extracting a set of diversified keyphrases from a document.
Thu May 06 2021
NLP
SGG: Learning to Select, Guide, and Generate for Keyphrase Generation
The Select-Guide-Generate (SGG) approach is proposed to handle present and absent keyphrase generation separately with different mechanisms. SGG is a hierarchical neural network consisting of a pointing-based selector at the low layer and a selection-guided generator at the high layer.
Mon Jun 10 2019
NLP
Topic-Aware Neural Keyphrase Generation for Social Media Language
A huge volume of user-generated content is produced daily on social media. We propose a sequence-to-sequence (seq2seq) based neural keyphrase generation framework. Our model, being topic-aware, allows joint modeling of corpus-level latent topic representations.
Tue Sep 17 2019
Machine Learning
BSDAR: Beam Search Decoding with Attention Reward in Neural Keyphrase Generation
This study mainly investigates two decoding problems in neural keyphrase generation. We introduce an extension of beam search inference based on word-level and n-gram-level attention scores. Results show that our proposed solution can overcome the algorithm's bias toward shorter and nearly identical sequences.
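The idea of rewarding attention during decoding can be illustrated with a toy beam search in which each one-token expansion's log-probability is augmented by a word-level attention score. The probability table, reward values, and weight `alpha` below are all invented for this sketch and do not come from the paper.

```python
import math

# toy next-token log-probabilities (history-independent, for illustration only)
logp = {"a": math.log(0.5), "b": math.log(0.3), "c": math.log(0.2)}
# hypothetical word-level attention scores toward the source document
attn_reward = {"a": 0.0, "b": 0.9, "c": 0.1}
alpha = 1.0  # assumed reward weight

def expand(beams, k=2):
    # score every one-token extension, then keep the top-k hypotheses
    cand = [(seq + [w], s + logp[w] + alpha * attn_reward[w])
            for seq, s in beams for w in logp]
    return sorted(cand, key=lambda c: -c[1])[:k]

beams = [([], 0.0)]
for _ in range(3):
    beams = expand(beams)
best_seq, best_score = beams[0]
```

Without the reward term, plain beam search would repeat the locally most probable token "a"; the attention reward steers the beam toward words strongly attended in the source ("b"), which is the kind of bias correction the abstract describes.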