This is a brief summary of the paper BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension (Lewis et al., ACL 2020), written to study and organize what I read.

The following is the material for my paper seminar on BART, which I prepared myself.

It consists of two presentations: 1) a detailed version and 2) a short version.

Below is a video of the authors' presentation.

I hope this helps anyone who wants to understand what BART is and what pre-training means in the natural language processing field.

BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension (Lewis et al., ACL 2020)

For a detailed analysis of the experiments, see BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension (Lewis et al., ACL 2020).

References