BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension
This is a brief summary of the paper BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension (Lewis et al., ACL 2020), written to study and organize what I read.