BART
Implemented By:
- Tobias Rohde (tobiasr@allenai.org), Allen Institute for Artificial Intelligence
- Dirk Groeneveld (dirkg@allenai.org), Allen Institute for Artificial Intelligence
- Pete Walsh (petew@allenai.org), Allen Institute for Artificial Intelligence
Description:
This is an implementation of the BART model from the paper "BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension". The model includes a language modeling head, so it can be used for text generation tasks such as summarization; a minimal usage sketch is given below.
This model is maintained by the AllenNLP team and its contributors in the allennlp-models repository at https://github.com/allenai/allennlp-models.
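As a rough illustration, the sketch below loads a trained BART archive through AllenNLP's Predictor API and generates text. The archive path model.tar.gz, the input sentence, and the "predicted_tokens" output key are assumptions for illustration, not part of this model card; the "seq2seq" predictor name is the one registered in allennlp-models.

```python
# A minimal sketch (not from the model card): generating text from a
# trained AllenNLP BART archive via the Predictor API.
from allennlp.predictors import Predictor
import allennlp_models.generation  # noqa: F401 -- registers the Bart model and predictor

# "model.tar.gz" is a placeholder for an archive produced by your own
# training run (or a published AllenNLP BART archive).
predictor = Predictor.from_path("model.tar.gz", predictor_name="seq2seq")

# The "seq2seq" predictor takes a single "source" string; the
# "predicted_tokens" output key is an assumption based on typical
# AllenNLP seq2seq output dictionaries.
output = predictor.predict(source="The quick brown fox jumps over the lazy dog.")
print(output["predicted_tokens"])
```

Because the Bart model registers itself on import, the `import allennlp_models.generation` line is required even though the name is otherwise unused.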
Related Papers:
- BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension (Lewis et al., 2019), https://arxiv.org/abs/1910.13461
Tags:
- generation
- transformers
- summarization
AllenNLP Version: >=1.0
Languages: en
Datasets:
Submitted On: Mar 31, 2021