Controlling the behavior of language models (LMs) without re-training is a major open problem in natural language generation.
We propose Diffusion-LM, a non-autoregressive language model based on continuous diffusion, a substantial departure from the current paradigm of discrete autoregressive generation.
Diffusion-LM iteratively denoises a sequence of Gaussian vectors into word vectors, yielding a sequence of intermediate latent variables.
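The iterative denoising procedure can be sketched as follows. This is a minimal toy illustration, not the paper's actual sampler: `predict_x0` stands in for a trained denoising network, and the interpolation schedule is an assumed placeholder.

```python
import numpy as np

def denoise_step(x_t, t, predict_x0):
    """One simplified reverse-diffusion step: interpolate the noisy latent
    toward the model's estimate of the clean word vectors (no noise re-injection)."""
    x0_hat = predict_x0(x_t, t)   # estimate of the clean embeddings
    alpha = t / (t + 1)           # toy schedule (assumption, not the paper's)
    return alpha * x_t + (1 - alpha) * x0_hat

def sample(seq_len, dim, steps, predict_x0, rng):
    """Start from pure Gaussian noise and denoise iteratively,
    collecting every intermediate latent variable along the way."""
    x = rng.standard_normal((seq_len, dim))
    latents = [x]
    for t in range(steps, 0, -1):
        x = denoise_step(x, t, predict_x0)
        latents.append(x)
    return x, latents

rng = np.random.default_rng(0)
# Stand-in denoiser that always predicts the zero vector; a real model
# would predict word embeddings conditioned on the noisy input.
predict_x0 = lambda x, t: np.zeros_like(x)
words, latents = sample(seq_len=8, dim=16, steps=50, predict_x0=predict_x0, rng=rng)
```

The list of intermediate latents is what makes the model hierarchical: each step is a progressively less noisy version of the final word-vector sequence.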
The continuous, hierarchical nature of these intermediate variables enables a simple gradient-based algorithm to perform complex, controllable generation tasks.
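Because the intermediate latents are continuous, control can be imposed by taking gradient steps on them. A minimal sketch of this idea, assuming a toy quadratic constraint whose gradient is analytic (standing in for the gradient of a trained classifier's loss):

```python
import numpy as np

def control_grad(x, target):
    """Gradient of the toy constraint ||x - target||^2 with respect to x;
    a real controller would differentiate a classifier through the latent."""
    return 2.0 * (x - target)

def guided_refine(x, target, steps, step_size):
    """Repeatedly nudge the continuous latent down the control gradient,
    so the final sample satisfies the constraint."""
    for _ in range(steps):
        x = x - step_size * control_grad(x, target)
    return x

rng = np.random.default_rng(1)
target = np.ones((4, 8))              # hypothetical control target in latent space
x = rng.standard_normal((4, 8))       # a noisy intermediate latent
before = np.linalg.norm(x - target)
x = guided_refine(x, target, steps=100, step_size=0.05)
after = np.linalg.norm(x - target)
```

Such gradient steps are only possible because the latents are continuous; discrete autoregressive tokens admit no comparable update.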
We demonstrate successful control of Diffusion-LM on six challenging fine-grained control tasks.