SequenceMatch: Imitation Learning for Autoregressive Sequence Modelling with Backtracking

2023/06/21
This article was written by an AI 🤖. The original article can be found here. If you want to learn more about how this works, check out our repo.

Sequence generation is a common task across many domains, and autoregressive models trained by maximum-likelihood estimation (MLE) can achieve high likelihood when predicting the next observation. However, this objective does not necessarily match the downstream use case of generating high-quality sequences: MLE provides no guidance on the model's behavior out of distribution (OOD), so small errors compound during autoregressive generation.
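For reference, the standard MLE objective factorizes over next-token predictions (standard notation, not copied from the paper):

$$
\max_\theta \; \mathbb{E}_{x \sim p_{\text{data}}} \left[ \sum_{t=1}^{T} \log p_\theta\left(x_t \mid x_{<t}\right) \right]
$$

Training always conditions on ground-truth prefixes $x_{<t}$, but at generation time the model conditions on its own samples $\hat{x}_{<t}$; once an early sample drifts off-distribution, nothing in the objective constrains the predictions that follow.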

To address this compounding-error problem, Chris Cundy and Stefano Ermon propose SequenceMatch, a framework that formulates sequence generation as an imitation learning (IL) problem. This makes it possible to minimize various divergences between the distribution of sequences generated by an autoregressive model and the distribution of sequences in a dataset, including divergences that place weight on OOD generated sequences. The IL framework also naturally accommodates backtracking.
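At a high level, the IL view swaps next-token likelihood for a divergence between distributions over whole generations. As a hedged sketch of the general form, with $\rho_{\text{data}}$ and $\rho_{\pi_\theta}$ denoting the data and model occupancy measures and $D_f$ an $f$-divergence (notation assumed here, not taken from the paper):

$$
\min_\theta \; D_f\!\left(\rho_{\pi_\theta} \,\big\|\, \rho_{\text{data}}\right)
$$

MLE is equivalent to minimizing the forward KL divergence $D_{\mathrm{KL}}(p_{\text{data}} \,\|\, p_\theta)$, which assigns no penalty to probability mass the model places on sequences outside the data distribution; other choices of $D_f$ can penalize exactly that mass, which is what makes the framework useful for the OOD setting.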

SequenceMatch's backtracking mechanism introduces a <Backspace> token that deletes the most recently generated token, enabling the model to revise its previous outputs and improve the quality of the generated sequences. The authors show that their approach outperforms the MLE objective at generating high-quality sequences, especially in OOD scenarios.
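As a minimal sketch of what such a decoding loop might look like, assuming a trained policy exposed as a `sample_next_token` function (the names and token ids below are hypothetical, not the paper's implementation):

```python
import random

# Hypothetical token ids for this sketch; stand-ins, not the paper's vocabulary.
BACKSPACE = -1  # the <Backspace> action: delete the last generated token
EOS = -2        # end-of-sequence token

def sample_next_token(prefix):
    """Stand-in for a trained SequenceMatch policy; here we sample
    uniformly at random just to keep the sketch runnable."""
    return random.choice([1, 2, 3, BACKSPACE, EOS])

def generate_with_backtracking(max_steps=50):
    """Autoregressive decoding where sampling <Backspace> removes the
    most recent token instead of appending a new one."""
    seq = []
    for _ in range(max_steps):
        tok = sample_next_token(seq)
        if tok == EOS:
            break
        if tok == BACKSPACE:
            if seq:
                seq.pop()   # revise: drop the last token
            continue
        seq.append(tok)     # ordinary autoregressive step
    return seq

if __name__ == "__main__":
    print(generate_with_backtracking())
```

The key design point is that <Backspace> is just another action in the policy's action space, so the IL objective can teach the model when deleting a token brings a partial sequence back toward the data distribution.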

For developers interested in autoregressive sequence modeling, SequenceMatch offers a promising framework for generating high-quality sequences. The backtracking mechanism is a distinctive way to improve generation quality, and divergences that place weight on OOD generated sequences are particularly relevant for real-world applications. Further details can be found in the paper "SequenceMatch: Imitation Learning for Autoregressive Sequence Modelling with Backtracking".