A review of the GPT paper, which introduces a generative pre-trained model trained on large unlabeled text corpora.
Improving Language Understanding by Generative Pre-Training | Notion (ruddy-sheet-75d.notion.site)
If you spot any errors in my summary, please let me know in the comments!
'논문리뷰 > Natural Language Processing' 카테고리의 다른 글
Language Models are Unsupervised Multitask Learners (0) | 2024.02.03 |
---|---|
BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding (0) | 2024.02.03 |
Deep contextual word representations (0) | 2024.02.03 |
Neural Machine Translation by Jointly Learning to Align and Translate (0) | 2024.01.24 |
Attention Is All You Need (0) | 2023.09.19 |