Hi Tomer,
My partner Andi and I would like to present the following paper:
"BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension"
Article description:
BART is pre-trained by corrupting text with noising transformations and learning to reconstruct the original text; the resulting model achieves strong results on natural language generation, comprehension, and translation tasks.
Article year of publication: 2019
Article source: https://arxiv.org/abs/1910.13461
GitHub Implementation link: https://github.com/facebookresearch/fairseq/tree/main/examples/bart
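To give a feel for the noising idea in the description above, here is a minimal sketch of one of BART's corruption schemes, text infilling, where a span of tokens is replaced by a single mask token and the model must reconstruct the original. The function name and the fixed span position/length are illustrative assumptions, not the paper's actual span sampling (which draws span lengths from a Poisson distribution).

```python
def text_infill(tokens, start, length, mask="<mask>"):
    """Hypothetical sketch: replace tokens[start:start+length] with a
    single mask token, as in BART-style text infilling."""
    return tokens[:start] + [mask] + tokens[start + length:]

original = ["BART", "is", "a", "denoising", "autoencoder"]
noised = text_infill(original, 2, 2)
# noised == ["BART", "is", "<mask>", "autoencoder"]
```

The model would then be trained to map `noised` back to `original`, which forces it to predict both the content and the length of the missing span.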
Hi On and Andi, approved.