[Slideshare] ELECTRA: Pre-Training Text Encoders As Discriminators Rather Than Generators
Danbi Cho · 2021. 5. 4. 11:26

Title: ELECTRA: Pre-training Text Encoders as Discriminators Rather Than Generators
Authors: Clark, K., Luong, M.T., Le, Q.V., and Manning, C.D.
Published: arXiv preprint arXiv:2003.10555, 2020.
Paper: arxiv.org/abs/2003.10555
Slideshare: www.slideshare.net/DanbiCho2/electrapretraining-text-encoders-as-discriminators-rather-than-generators
The presentation explains the ELECTRA model. ELECTRA stands for 'Efficiently Learning an Encoder that Classifies Token Replacements Accurately': rather than reconstructing masked tokens the way BERT-style masked language models do, ELECTRA pre-trains the text encoder as a discriminator that classifies, for each input token, whether it is the original token or a replacement.
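To make the 'classifies token replacements' idea concrete, here is a minimal sketch of replaced-token detection at inference time. It assumes the Hugging Face transformers library and the google/electra-small-discriminator checkpoint, neither of which appears in the original post: the discriminator emits one logit per input token, positive when it judges that token to have been replaced.

```python
import torch
from transformers import ElectraForPreTraining, ElectraTokenizerFast

# Assumed checkpoint: the small ELECTRA discriminator released by Google.
tokenizer = ElectraTokenizerFast.from_pretrained("google/electra-small-discriminator")
model = ElectraForPreTraining.from_pretrained("google/electra-small-discriminator")

original = "the chef cooked the meal"
corrupted = "the chef ate the meal"  # "cooked" swapped for a plausible replacement

inputs = tokenizer(corrupted, return_tensors="pt")
with torch.no_grad():
    # One logit per token: > 0 means "replaced", <= 0 means "original".
    logits = model(**inputs).logits.squeeze(0)

tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0].tolist())
for token, logit in zip(tokens, logits):
    label = "replaced" if logit > 0 else "original"
    print(f"{token:>10s}  {label}")
```

Because this binary decision is made for every input token rather than only a masked subset, the discriminator gets a training signal from the whole sequence; this per-token objective is the 'discriminators rather than generators' pre-training that the paper's title refers to.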