Author : Paolo Tirotta
Affiliation : University of Bologna
Country : Italy
Category : Computer Science & Information Technology
Volume, Issue, Month, Year : 11, 23, December, 2021
Abstract :
Transfer learning through large pre-trained models has changed the landscape of current applications in natural language processing (NLP). Recently, Optimus, a variational autoencoder (VAE) that combines two pre-trained models, BERT and GPT-2, was released, and its combination with generative adversarial networks (GANs) has been shown to produce novel, yet very human-looking text. The Optimus and GAN combination avoids the troublesome application of GANs to the discrete domain of text, and prevents the exposure bias of standard maximum-likelihood methods. We combine the training of GANs in the latent space,