Long and Diverse Text Generation with Planning-based Hierarchical Variational Model
Introduction
Existing neural methods for data-to-text generation still struggle to produce long and diverse texts: they fail to model input data dynamically during generation, to capture inter-sentence coherence, or to generate diversified expressions. To address these issues, we propose a Planning-based Hierarchical Variational Model (PHVM). Our model first plans a sequence of groups (each group is a subset of input items to be covered by a sentence) and then realizes each sentence conditioned on the planning result and the previously generated context, thereby decomposing long text generation into a series of dependent sentence generation sub-tasks. To model expression diversity, we devise a hierarchical latent structure in which a global planning latent variable captures the diversity of reasonable plans and a sequence of local latent variables controls sentence realization.
This repository provides a TensorFlow implementation of our work.
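The following is a minimal, runnable Python sketch of the two-level generation process described above. It is not the repository's actual code: the function names, the trivial chunking planner, and the template realizer are hypothetical placeholders for the learned neural components.

```python
# Conceptual sketch of PHVM's plan-then-realize generation loop.
# All components here are hypothetical stand-ins for learned modules.
import numpy as np

LATENT_DIM = 16  # illustrative latent dimensionality


def sample_plan_latent(rng, dim=LATENT_DIM):
    # Stand-in for sampling the global planning latent variable z_p
    # from a prior conditioned on the input data.
    return rng.standard_normal(dim)


def plan_groups(input_items, z_plan):
    # Stand-in for the planner: partitions the input items into an
    # ordered sequence of groups, one group per sentence. Here we
    # simply chunk items in pairs as a placeholder.
    return [input_items[i:i + 2] for i in range(0, len(input_items), 2)]


def realize_sentence(group, z_local, context):
    # Stand-in for the sentence decoder: generates one sentence
    # conditioned on its group, a local latent variable, and the
    # previously generated context.
    return "Sentence covering " + ", ".join(group) + "."


def generate(input_items, seed=0):
    rng = np.random.RandomState(seed)
    z_plan = sample_plan_latent(rng)               # global planning latent
    groups = plan_groups(input_items, z_plan)      # high-level plan
    sentences = []
    for group in groups:
        z_local = rng.standard_normal(LATENT_DIM)  # local latent per sentence
        sentences.append(realize_sentence(group, z_local, sentences))
    return " ".join(sentences)


if __name__ == "__main__":
    items = ["material: cotton", "style: casual", "color: white", "fit: slim"]
    print(generate(items))
```

Sampling a different global latent yields a different plan (and hence a different text), while the per-sentence local latents vary the wording of each realized sentence.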
Requirements
- Python 3.6
- NumPy
- TensorFlow 1.4.0
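The dependencies can typically be installed with pip, for example (a setup sketch, not tested against this repository; TensorFlow 1.4.0 requires an older Python 3.6 environment, so adjust as needed):

```
pip install numpy tensorflow==1.4.0
```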
Quick Start
Citation
Our paper is available at https://arxiv.org/abs/1908.06605v2.
Please kindly cite our paper if you find the paper or the code helpful.