
AlphaCode Explained: AI Code Generation
- Frank
- February 15, 2022
- AI
- AI AlphaCode
- AI Codex
- Alpha Code
- AlphaCode
- AlphaCode AI
- artificial intelligence
- attention
- billion
- billion parameter
- codex
- codex ai
- Deep Mind
- DeepMind
- explained
- Explanation
- gopher explained
- Gopher Model
- GPT-2
- gpt-3
- gpt-4
- language modeling
- Large Language Model
- logic reasoning
- Machine Learning
- ML
- Natural Language Processing
- NLP
- OpenAI
- OpenAI Codex
- Retro
- self attention
- Self-Attention
- Self-Supervised Learning
- sota
- State of the Art
- Text Generation
- transformer
AlphaCode is DeepMind’s new massive language model for generating code. It is similar to OpenAI Codex, except that the paper provides a bit more analysis. The field of NLP within AI and ML has exploded, with more papers appearing all the time. Hopefully this video can help you understand how AlphaCode works […]
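At a high level, the paper’s recipe is: sample a huge number of candidate programs from the model, keep only those that pass the example tests given in the problem statement, then cluster the survivors and submit a handful. Below is a toy sketch of the sample-and-filter loop; the candidate pool is a hypothetical stand-in for the real transformer sampler, which draws up to roughly a million candidates per problem.

```python
import random

# Hypothetical stand-in for AlphaCode's transformer sampler.
CANDIDATE_POOL = [
    "def solve(x): return x + 1",
    "def solve(x): return x * 2",
    "def solve(x): return x - 1",
]

def sample_programs(n):
    """Pretend to sample n candidate programs from the language model."""
    return [random.choice(CANDIDATE_POOL) for _ in range(n)]

def passes_example_tests(src, examples):
    """Run a candidate against the problem's example input/output pairs."""
    scope = {}
    try:
        exec(src, scope)
        return all(scope["solve"](inp) == out for inp, out in examples)
    except Exception:
        return False

examples = [(1, 2), (5, 6)]  # example tests imply solve(x) == x + 1
candidates = sample_programs(100)
survivors = [s for s in candidates if passes_example_tests(s, examples)]
# AlphaCode then clusters survivors by behaviour on generated inputs
# and submits one program per large cluster.
print(f"{len(survivors)}/{len(candidates)} candidates pass the examples")
```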
Read More
Geometric Deep Learning: The Erlangen Programme of ML
- Frank
- July 6, 2021
- AI
- artificial intelligence
- Cancer
- CNN
- Computer Graphics
- Computer Vision
- Convolutional Neural Networks
- Deep Learning
- drug design
- equivariance
- erlangen program
- geometric deep learning
- Geometry
- GNN
- graph learning
- graph neural networks
- group theory
- hyperfoods
- immunotherapy
- invariance
- Machine Learning
- manifold learning
- Neural Network
- positional encoding
- Proteins
- symmetry
- transformer
- Transformers
The ICLR 2021 Keynote “Geometric Deep Learning: The Erlangen Programme of ML” by Michael Bronstein is presented below.
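The unifying idea of the talk is symmetry: layers should respect the transformations that leave the data’s structure intact, so a graph layer, for example, should be equivariant to node permutations. Here is a minimal NumPy sketch of one message-passing layer, with an assertion checking that equivariance; the shapes and weights are illustrative, not from the talk.

```python
import numpy as np

def gnn_layer(X, A, W_self, W_neigh):
    """One message-passing layer: each node aggregates its neighbours
    with a permutation-invariant sum, the symmetry principle at the
    heart of geometric deep learning."""
    messages = A @ X @ W_neigh       # sum over neighbours (A is adjacency)
    return np.tanh(X @ W_self + messages)

rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))          # 4 nodes, 8 features each
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)
W_self, W_neigh = rng.normal(size=(8, 8)), rng.normal(size=(8, 8))

# Permuting the nodes permutes the output the same way (equivariance):
P = np.eye(4)[[2, 0, 3, 1]]
assert np.allclose(P @ gnn_layer(X, A, W_self, W_neigh),
                   gnn_layer(P @ X, P @ A @ P.T, W_self, W_neigh))
```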
Read More
GLOM: How to represent part-whole hierarchies in a neural network (Geoff Hinton’s Paper Explained)
- Frank
- March 3, 2021
- AI
- artificial intelligence
- Arxiv
- attention mechanism
- Capsule Networks
- capsule networks explained
- column
- Computer Vision
- consensus algorithm
- Deep Learning
- deep learning tutorial
- explained
- Geoff Hinton
- geoff hinton capsule networks
- geoff hinton neural networks
- Geoffrey Hinton
- geoffrey hinton deep learning
- geoffrey hinton glom
- glom model
- Google AI
- Google Brain
- hinton glom
- introduction to deep learning
- Machine Learning
- Neural Networks
- Schmidhuber
- transformer
Yannic Kilcher covers a paper where Geoffrey Hinton describes GLOM, a Computer Vision model that combines transformers, neural fields, contrastive learning, capsule networks, denoising autoencoders and RNNs. GLOM decomposes an image into a parse tree of objects and their parts. However, unlike previous systems, the parse tree is constructed dynamically and differently for each input, […]
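One ingredient is easy to sketch: GLOM’s same-level consensus, where per-location embedding “columns” repeatedly average with similar neighbours until “islands of agreement” emerge, each island standing for one node of the parse tree. The following is a heavily simplified NumPy sketch; the full model also mixes in bottom-up and top-down signals, which are omitted here.

```python
import numpy as np

def glom_consensus_step(cols, temp=1.0):
    """One simplified same-level GLOM-style update: each column's
    embedding moves toward an attention-weighted average of all
    columns, so similar columns drift into 'islands of agreement'."""
    sims = cols @ cols.T / temp                       # pairwise similarity
    sims -= sims.max(axis=1, keepdims=True)           # numerical stability
    attn = np.exp(sims) / np.exp(sims).sum(axis=1, keepdims=True)
    return attn @ cols                                # weighted average

rng = np.random.default_rng(1)
cols = rng.normal(size=(6, 16))                       # 6 image locations
for _ in range(10):
    cols = glom_consensus_step(cols)
# After several iterations, similar columns collapse onto shared
# vectors representing the same part in the parse tree.
```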
Read More
Transformers for Image Recognition at Scale
- Frank
- October 6, 2020
- AI
- andrej karpathy
- anonymous
- artificial intelligence
- Arxiv
- attention is all you need
- attention mechanism
- beyer
- big transfer
- bit
- CNN
- Convolutional Neural Network
- Data Science
- Deep Learning
- explained
- Google Brain
- google research
- iclr
- iclr 2021
- karpathy
- Machine Learning
- Neural Networks
- Paper
- peer review
- review
- TPU
- tpu v3
- transformer
- transformer computer vision
- transformer images
- under submission
- vaswani
- vision transformer
- visual transformer
- vit
Yannic Kilcher explains why transformers are ruining convolutions. This paper, under review at ICLR, shows that given enough data, a standard Transformer can outperform Convolutional Neural Networks on image recognition tasks, which are classically where CNNs excel. In the video, he explains the architecture of the Vision Transformer (ViT) and the reason why it works […]
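The key idea is that ViT treats an image as a sequence: it cuts the image into fixed-size patches, flattens each one, and linearly projects the patches into token embeddings for a standard Transformer. A minimal NumPy sketch of that patch-embedding step follows; the 512-dimensional projection is an illustrative choice, not the paper’s.

```python
import numpy as np

def patchify(image, patch=16):
    """Split an image into non-overlapping patches and flatten each,
    the first step of the Vision Transformer."""
    H, W, C = image.shape
    rows = image.reshape(H // patch, patch, W // patch, patch, C)
    return rows.transpose(0, 2, 1, 3, 4).reshape(-1, patch * patch * C)

rng = np.random.default_rng(2)
image = rng.normal(size=(224, 224, 3))
tokens = patchify(image)                        # (196, 768) patch tokens
E = rng.normal(size=(768, 512))                 # learned linear projection
pos = rng.normal(size=(tokens.shape[0], 512))   # positional embeddings
x = tokens @ E + pos                            # sequence fed to a Transformer
print(x.shape)                                  # (196, 512)
```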
Read More
14 Cool Apps Built on OpenAI’s GPT-3 API
- Frank
- July 29, 2020
- gpt-3
- OpenAI
- transformer
Bakz T. Future shows off 14 cool applications built on top of OpenAI’s GPT-3 (Generative Pre-trained Transformer) API (currently in private beta).
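Almost every app in the video reduces to the same pattern: send a natural-language prompt to the completion endpoint and post-process the text that comes back. Here is a minimal sketch using the beta-era Python SDK; the API key is a placeholder, and the modern SDK has since changed this interface.

```python
import openai  # pip install openai (beta-era SDK)

openai.api_key = "sk-..."  # placeholder; use your private-beta key

# One prompt in, one text completion out: the pattern behind all 14 apps.
response = openai.Completion.create(
    engine="davinci",        # the largest GPT-3 model
    prompt="Translate to French: Hello, world!\n",
    max_tokens=32,
    temperature=0.7,
)
print(response["choices"][0]["text"])
```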
Read More