
[Slideshare] GPT models

Danbi Cho 2021. 6. 3. 12:51

GPT-1: Improving language understanding by generative pre-training

GPT-2: Language models are unsupervised multitask learners

GPT-3: Language models are few-shot learners

 

Slideshare: https://www.slideshare.net/DanbiCho2/gpt-models

 

I summarized the GPT models in this slide and compared GPT-1, GPT-2, and GPT-3. GPT stands for Generative Pre-Training of a language model and was implemented base…
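The three papers above differ mainly in how the pretrained model is applied to a downstream task: GPT-1 fine-tunes the model per task, while GPT-3 keeps the model fixed and packs a task description plus a few worked examples into the prompt. A minimal sketch of that GPT-3-style few-shot prompt construction (the helper name and example strings are hypothetical, only for illustration):

```python
def build_few_shot_prompt(task_description, examples, query):
    """Concatenate a task description, K worked examples, and a new
    query into one prompt, in the style of GPT-3 few-shot prompting."""
    lines = [task_description]
    for source, target in examples:
        lines.append(f"Q: {source}\nA: {target}")
    # The model is expected to continue the text after the final "A:".
    lines.append(f"Q: {query}\nA:")
    return "\n\n".join(lines)

prompt = build_few_shot_prompt(
    "Translate English to French.",
    [("cheese", "fromage"), ("dog", "chien")],  # K=2 shots
    "house",
)
print(prompt)
```

With K=0 examples this reduces to the zero-shot setting, and with one example to the one-shot setting discussed in the GPT-3 paper.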

 

References:

GPT-1: https://www.cs.ubc.ca/~amuham01/LING530/papers/radford2018improving.pdf

GPT-2: https://d4mucfpksywv.cloudfront.net/better-language-models/language_models_are_unsupervised_multitask_learners.pdf

GPT-3: https://arxiv.org/pdf/2005.14165.pdf

 

https://greeksharifa.github.io/nlp(natural%20language%20processing)%20/%20rnns/2019/08/28/OpenAI-GPT-2-Language-Models-are-Unsupervised-Multitask-Learners/ 

 

 


https://greeksharifa.github.io/nlp(natural%20language%20processing)%20/%20rnns/2020/08/14/OpenAI-GPT-3-Language-Models-are-Few-Shot-Learners/ 

 


 
