
Is This the Worst AI Ever?
- Frank
- June 13, 2022
- 4chan
- 4chan ai
- 4chan bot
- 4chan pol
- 4chan pol bot
- AI
- ai bias
- artificial intelligence
- Arxiv
- Deep Learning
- eleuther ai
- explained
- gpt 3
- gpt 4
- gpt 4chan
- gpt j
- gpt-3
- gpt-3 truthful
- gpt-4
- gpt-4chan
- gpt-j
- gpt-j-6b
- gpt4
- gpt4chan
- is ai truthful
- language model evaluation
- Machine Learning
- Natural Language Processing
- Neural Networks
- Paper
- seychelle
- seychelle bot
- seychelles
- truthful qa
- truthfulqa
- truthfulqa dataset
- Turing Test
GPT-4chan was trained on over 3 years of posts from 4chan’s “politically incorrect” (/pol/) board. (And no, this is not GPT-4.) You can imagine what it learned. Maybe we need to be better people, so that our future AI overlords have better behavior to model.
Read More
Ask the AI: Linux or Windows?
- Frank
- June 13, 2022
- AI
- Fun
- gpt-3
Dave asks GPT-3 to tell us stories about epic battles between Linux and Windows users, and we also ask it to write prime sieve code in multiple languages.
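For reference, a prime sieve like the one Dave asks for might look like this minimal sieve of Eratosthenes in Python (my own sketch, not the code GPT-3 actually produced in the video):

```python
def primes_up_to(n):
    """Sieve of Eratosthenes: return all primes <= n."""
    if n < 2:
        return []
    sieve = [True] * (n + 1)
    sieve[0] = sieve[1] = False
    for p in range(2, int(n ** 0.5) + 1):
        if sieve[p]:
            # Mark every multiple of p, starting at p*p, as composite.
            sieve[p * p :: p] = [False] * len(sieve[p * p :: p])
    return [i for i, is_prime in enumerate(sieve) if is_prime]

print(primes_up_to(30))  # [2, 3, 5, 7, 11, 13, 17, 19, 23, 29]
```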
Read More
DeepMind’s New AI Thinks It Is A Genius! 🤖
Two Minute Papers checks out DeepMind’s Gopher paper, “Scaling Language Models: Methods, Analysis & Insights from Training Gopher.”
Read More
OpenAI’s New AI Writes A Letter To Humanity
Two Minute Papers explores GPT-3’s Edit and Insert capabilities.
Read More
Is OpenAI’s AI As Smart As A University Student? 🤖
Two Minute Papers examines the following papers:
- Grade school math: https://openai.com/blog/grade-school-math/
- University level math: https://arxiv.org/abs/2112.15594
- Olympiad: https://openai.com/blog/formal-math/
Read More
AlphaCode Explained: AI Code Generation
- Frank
- February 15, 2022
- AI
- AI AlphaCode
- AI Codex
- Alpha Code
- AlphaCode
- AlphaCode AI
- artificial intelligence
- attention
- billion
- billion parameter
- codex
- codex ai
- Deep Mind
- DeepMind
- explained
- Explanation
- gopher explained
- Gopher Model
- GPT-2
- gpt-3
- gpt-4
- language modeling
- Large Language Model
- logic reasoning
- Machine Learning
- ML
- Natural Language Processing
- NLP
- OpenAI
- OpenAI Codex
- Retro
- self attention
- Self-Attention
- Self-Supervised Learning
- sota
- State of the Art
- Text Generation
- transformer
AlphaCode is DeepMind’s new massive language model for generating code. It is similar to OpenAI Codex, except that the paper provides a bit more analysis. The field of NLP within AI and ML has exploded, with more papers coming out all the time. Hopefully this video can help you understand how AlphaCode works […]
Read More
GPT-3 Crash Course Part 1
- Frank
- January 20, 2022
- Descript
- gpt-3
- OpenAI
This is the first video in the GPT-3 Crash Course, with a brief intro to and history of OpenAI.
Read More
Challenges of Training Large-Scale Neural Networks
Here’s an interesting look at the unique challenges that present themselves when training large neural networks.
Read More
[ML News] New ImageNet SOTA | Uber’s H3 hexagonal coordinate system | New text-image-pair dataset
Yannic provides the latest news in machine learning in this video. Time stamps:
- 0:00 – Intro
- 0:20 – TruthfulQA benchmark shines new light on GPT-3
- 2:00 – LAION-400M image-text-pair dataset
- 4:10 – GoogleAI’s EfficientNetV2 and CoAtNet
- 6:15 – Uber’s H3: A hexagonal coordinate system
- 7:40 – AWS NeurIPS 2021 DeepRacer Challenge
- 8:15 – Helpful Libraries […]
Read More