Deep Learning for Text Style Transfer: A Survey

Authors

  • Di Jin, MIT (work completed during his PhD)
  • Zhijing Jin, Max Planck Institute for Intelligent Systems, Tübingen, Germany
  • Zhiting Hu, UC San Diego
  • Olga Vechtomova, University of Waterloo
  • Rada Mihalcea, University of Michigan

Abstract

Text style transfer is a core task in natural language generation (NLG), which aims to control certain attributes of the generated text, such as politeness, emotion, humor, and many others. It has a long history in the field of natural language processing (NLP), but it has recently gained significant attention thanks to the promising performance of deep learning models. In this paper, we present a systematic survey of the research on neural text style transfer. We have collected, summarized, and discussed more than 70 representative articles published since the first neural text style transfer work in 2017. Overall, we cover the task formulation, existing datasets and metrics for model development and evaluation, and all methods developed over the last several years. We find that existing methods are, in essence, built on combinations of several loss functions, with each function serving a distinct goal. This unified perspective can shed light on the design of new methods. We conclude our survey with a discussion of open issues that need to be resolved for better future development.
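The abstract's observation that existing methods combine several loss functions, each serving one goal, can be illustrated with a minimal sketch. The function and weight names below are hypothetical and do not correspond to any single paper surveyed; the pattern is a weighted sum of per-goal terms (e.g., style control, content preservation, fluency):

```python
# Hypothetical sketch of the pattern the survey identifies: a text style
# transfer objective as a weighted sum of loss terms, each enforcing one
# goal. Names and weights are illustrative, not from a specific method.

def total_loss(style_loss: float, content_loss: float, fluency_loss: float,
               w_style: float = 1.0, w_content: float = 1.0,
               w_fluency: float = 0.5) -> float:
    """Combine per-goal loss terms into a single training objective."""
    return (w_style * style_loss
            + w_content * content_loss
            + w_fluency * fluency_loss)

# Example: with default weights, the fluency term contributes half as
# much per unit of loss as the style and content terms.
loss = total_loss(style_loss=1.0, content_loss=1.0, fluency_loss=2.0)
```

In practice each term is itself produced by a sub-model (e.g., a style classifier for the style term, a reconstruction decoder for the content term), and the weights trade off attribute control against meaning preservation.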

Published

2024-12-05

Section

Survey article