
BART


Description:

This is an implementation of the BART model from the paper "BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension" (Lewis et al., 2019). Because the model includes a language modeling head, it can be used for text generation.
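As a minimal sketch, a trained BART archive can be loaded for generation through AllenNLP's Predictor API. The archive path and input text below are placeholders, and using the "seq2seq" predictor name is an assumption based on the predictor registered in allennlp-models' generation module:

```python
# A minimal sketch, not an official example from this project page.
from allennlp.predictors import Predictor
import allennlp_models.generation  # noqa: F401 -- registers the Bart model and "seq2seq" predictor

# Load a trained model archive (hypothetical path).
predictor = Predictor.from_path("/path/to/bart-model.tar.gz", predictor_name="seq2seq")

# Generate text from a source sequence.
output = predictor.predict(source="AllenNLP is an open-source NLP research library.")

# Output keys depend on the model configuration; seq2seq models in
# allennlp-models typically include the predicted tokens.
print(output)
```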

This model is maintained by the AllenNLP team and its contributors in the allennlp-models repository at https://github.com/allenai/allennlp-models.