
AI
Natural Language Processing
GPT-3: Language Models are Few-Shot Learners
- Frank
- June 2, 2020
How far can you go with ONLY language modeling? Can a large enough language model perform NLP tasks out of the box? OpenAI takes on these and other questions by training a transformer an order of magnitude larger than anything built before, and the results are astounding. Yannic Kilcher […]
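The "few-shot" setting the paper studies means conditioning the model on a handful of solved examples in its context window rather than fine-tuning its weights. A minimal sketch of that prompt format, using a hypothetical `build_few_shot_prompt` helper (not part of any OpenAI API) and the translation examples only as placeholders:

```python
def build_few_shot_prompt(task_description, examples, query):
    """Format a few-shot prompt: a task description, K solved
    examples, then the unsolved query the model should complete."""
    lines = [task_description, ""]
    for source, target in examples:
        lines.append(f"Input: {source}")
        lines.append(f"Output: {target}")
        lines.append("")
    # The model is expected to continue the text after the final "Output:".
    lines.append(f"Input: {query}")
    lines.append("Output:")
    return "\n".join(lines)

prompt = build_few_shot_prompt(
    "Translate English to French.",
    [("cheese", "fromage"), ("sea otter", "loutre de mer")],
    "plush giraffe",
)
print(prompt)
```

No gradient updates happen here; the "learning" is entirely in-context, which is what lets a single frozen model attempt many NLP tasks out of the box.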
AI
Generative AI
Making the Mona Lisa Come to Life with AI
Two Minute Papers shows off the fascinating work in the paper “Few-Shot Adversarial Learning of Realistic Neural Talking Head Models.” Could this be the next evolution of GANs? It certainly will empower a whole new wave of deep fakes. What a time to be alive, indeed!
AI
Generative AI
AI Image Translation
Two Minute Papers examines the paper “Few-Shot Unsupervised Image-to-Image Translation.” Try a demo of the technology at https://nvlabs.github.io/FUNIT/petswap.html