BERT- Pre-training of Deep Bidirectional Transformers for Language Understanding
This is a brief summary of the paper "BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding" (Devlin et al., NAACL 2019), written to help me study and organize what I read.