Published on Mon Apr 15 2019

Latent Code and Text-based Generative Adversarial Networks for Soft-text Generation

Md. Akmal Haidar, Mehdi Rezagholizadeh, Alan Do-Omri, Ahmad Rashid

Soft-GAN is a novel approach that exploits the GAN setup for text generation. Autoencoders can provide a continuous representation of sentences. We also propose hybrid latent-code and text-based GAN (LATEXT-GAN) approaches.

Abstract

Text generation with generative adversarial networks (GANs) can be divided into the text-based and code-based categories according to the type of signals used for discrimination. In this work, we introduce a novel text-based approach called Soft-GAN to effectively exploit GAN setup for text generation. We demonstrate how autoencoders (AEs) can be used for providing a continuous representation of sentences, which we will refer to as soft-text. This soft representation will be used in GAN discrimination to synthesize similar soft-texts. We also propose hybrid latent code and text-based GAN (LATEXT-GAN) approaches with one or more discriminators, in which a combination of the latent code and the soft-text is used for GAN discriminations. We perform a number of subjective and objective experiments on two well-known datasets (SNLI and Image COCO) to validate our techniques. We discuss the results using several evaluation metrics and show that the proposed techniques outperform the traditional GAN-based text-generation methods.
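The core idea in the abstract is that a discrete sentence (one-hot tokens) is replaced by a continuous "soft-text": a per-position distribution over the vocabulary produced by the autoencoder, which gives the GAN discriminator a differentiable signal. Below is a minimal NumPy sketch of that representation; the shapes and names (`soft_text`, `decoder_logits`) are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def softmax(logits, axis=-1):
    # Numerically stable softmax over the vocabulary axis.
    z = logits - logits.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

rng = np.random.default_rng(0)
vocab_size, seq_len = 10, 5  # toy sizes, for illustration only

# Discrete text: one-hot rows (what a purely text-based GAN discriminates on).
token_ids = rng.integers(0, vocab_size, size=seq_len)
one_hot = np.eye(vocab_size)[token_ids]

# Soft-text: decoder logits pushed through a softmax -- a dense, smooth
# distribution per position that still sums to 1, like the one-hot rows do.
decoder_logits = rng.normal(size=(seq_len, vocab_size))
soft_text = softmax(decoder_logits)

# Both representations are valid probability rows, but soft_text is
# everywhere positive, so gradients can flow through it to a generator.
assert np.allclose(one_hot.sum(axis=1), 1.0)
assert np.allclose(soft_text.sum(axis=1), 1.0)
assert (soft_text > 0).all()
```

In a full Soft-GAN setup the discriminator would receive such `(seq_len, vocab_size)` soft matrices from both the autoencoder (real sentences) and the generator (synthetic ones); the sketch above only shows the representation itself.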

Sun Sep 24 2017
Artificial Intelligence
Long Text Generation via Adversarial Training with Leaked Information
LeakGAN is highly effective in long text generation and also improves performance in short text generation scenarios. Without any supervision, LeakGAN is able to implicitly learn sentence structures only through the interaction between Manager and Worker.
Tue Apr 23 2019
NLP
TextKD-GAN: Text Generation using Knowledge Distillation and Generative Adversarial Networks
Text generation is of particular interest in many NLP applications such as machine translation, language modeling, and text summarization. We demonstrate how autoencoders (AEs) can be used for providing a continuous representation of sentences. We then train the generator to synthesize similar smooth representations.
Wed Sep 01 2021
NLP
OptAGAN: Entropy-based finetuning on text VAE-GAN
The Optimus and GANs combination avoids the troublesome application of GANs to the discrete domain of text. We finetune using reinforcement learning by exploiting the structure of GPT-2 and by adding entropy-based rewards to balance quality and diversity.
Mon Jun 12 2017
Machine Learning
Adversarial Feature Matching for Text Generation
The Generative Adversarial Network (GAN) has achieved great success in generating realistic synthetic data. We propose a framework for generating realistic text via adversarial training.
Fri Sep 28 2018
Artificial Intelligence
SALSA-TEXT : self attentive latent space based adversarial text generation
Adversarial latent code-based text generation has recently gained a lot of attention due to its promising results. In this paper, we take a step to fortify the architectures used in these setups, specifically AAE and ARAE.
Tue Apr 07 2020
Machine Learning
TextGAIL: Generative Adversarial Imitation Learning for Text Generation
Generative Adversarial Networks (GANs) for text generation have recently received many criticisms, as they perform worse than their MLE counterparts. We propose a generative adversarial imitation learning framework for text generation that uses large pre-trained language models.