Published on Tue Dec 08 2020

Facts2Story: Controlling Text Generation by Key Facts

Eyal Orbach, Yoav Goldberg

Abstract

Recent advancements in self-attention neural network architectures have raised the bar for open-ended text generation. Yet, while current methods can produce coherent text several hundred words long, attaining control over the content being generated -- and evaluating it -- remain open questions. We propose a controlled generation task based on expanding a sequence of facts, expressed in natural language, into a longer narrative. We introduce human-based evaluation metrics for this task, as well as a method for deriving a large training dataset. We evaluate three methods on this task, each based on fine-tuning pre-trained models. We show that while auto-regressive, unidirectional language models such as GPT2 produce better fluency, they struggle to adhere to the requested facts. We propose a plan-and-cloze model (using fine-tuned XLNet) that produces competitive fluency while adhering to the requested content.
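The plan-and-cloze approach relies on XLNet's ability to fill gaps between fixed spans of text. As a hedged illustration only (not the authors' released implementation), the sketch below shows cloze-style infilling with a pre-trained XLNet via the HuggingFace transformers library; the example sentence, the single-token mask, and the model choice (xlnet-base-cased) are illustrative assumptions.

    # Minimal sketch of XLNet cloze infilling: keep surrounding text fixed and
    # predict the token at the <mask> position. Assumes the HuggingFace
    # `transformers` and `torch` packages; not the paper's actual pipeline.
    import torch
    from transformers import XLNetLMHeadModel, XLNetTokenizer

    tokenizer = XLNetTokenizer.from_pretrained("xlnet-base-cased")
    model = XLNetLMHeadModel.from_pretrained("xlnet-base-cased")

    # The given fact text stays verbatim; <mask> marks the slot to fill.
    text = "The detective found the letter <mask> and called the police."
    input_ids = tokenizer.encode(text, return_tensors="pt")
    mask_pos = (input_ids[0] == tokenizer.mask_token_id).nonzero(as_tuple=True)[0].item()

    seq_len = input_ids.shape[1]
    # perm_mask[0, j, k] = 1 means token k is hidden when predicting position j:
    # hide the masked slot from every prediction, including its own.
    perm_mask = torch.zeros(1, seq_len, seq_len)
    perm_mask[0, :, mask_pos] = 1.0

    # target_mapping selects which position's logits the model should return.
    target_mapping = torch.zeros(1, 1, seq_len)
    target_mapping[0, 0, mask_pos] = 1.0

    with torch.no_grad():
        outputs = model(input_ids, perm_mask=perm_mask, target_mapping=target_mapping)
    predicted_id = outputs.logits[0, 0].argmax().item()
    print(tokenizer.decode([predicted_id]))

In this sketch only one token is infilled; a plan-and-cloze generator would instead reserve longer gaps between the planned facts and fill them iteratively.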

Sat Jan 02 2021
NLP
On-the-Fly Attention Modularization for Neural Generation
Neural text generation still suffers from degeneration: generated text is repetitive, generic, self-inconsistent, and lacking commonsense. Our findings motivate on-the-fly attention modularization, a simple but effective method for injecting inductive biases into attention computation.
Sun May 13 2018
NLP
Hierarchical Neural Story Generation
We collect a large dataset of 300K human-written stories paired with writing prompts from an online forum. Our model first generates a premise, and then transforms it into a passage of text. Human judges prefer stories generated by our approach to those from a strong non-hierarchical model.
Wed Oct 14 2020
NLP
Positioning yourself in the maze of Neural Text Generation: A Task-Agnostic Survey
Neural text generation has metamorphosed into several critical natural language applications, ranging from text completion to free-form narrative generation. In this context, we present an abstraction of the imperative techniques with respect to learning paradigms, pretraining, modeling approaches, decoding, and the key outstanding challenges.
Sat Apr 06 2019
Artificial Intelligence
Step-by-Step: Separating Planning from Realization in Neural Data-to-Text Generation
Data-to-text generation can be conceptually divided into two parts: ordering and structuring the information (planning) and generating fluent language describing the information. Modern neural generation systems conflate these two steps into a single end-to-end differentiable system.
Wed Jun 13 2018
Machine Learning
Generating Sentences Using a Dynamic Canvas
Attentive Unsupervised Text (W)riter (AUTR) is a word level generative model for natural language. It uses a recurrent neural network with a dynamic attention and canvas memory mechanism to construct sentences.
Thu Jun 10 2021
NLP
AGGGEN: Ordering and Aggregating while Generating
AGGGEN is a data-to-text model that introduces two explicit sentence-planning stages, performing sentence planning at the same time as it generates text and learning latent alignments between the input representation and the target text.