
AI
Large Language Models
Research
Scaling Transformer to 1M tokens and beyond with RMT (Paper Explained)
Yannic Kilcher explains this paper, which promises to scale transformers to 1 million tokens and beyond. We take a look at the technique behind it, the Recurrent Memory Transformer, and examine its strengths and weaknesses.
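For a rough sense of the idea, here is a minimal sketch of segment-level recurrence with memory tokens, the mechanism the Recurrent Memory Transformer is built on. This is not the paper's or the video's code; the class name, dimensions, and segment sizes are illustrative assumptions.

```python
# Minimal sketch of the Recurrent Memory Transformer idea (illustrative, not the paper's code):
# split a long input into segments, prepend learned memory tokens to each segment,
# and carry the updated memory states forward to the next segment.
import torch
import torch.nn as nn

class RMTSketch(nn.Module):
    def __init__(self, dim=64, num_mem_tokens=4, num_layers=2, num_heads=4):
        super().__init__()
        # Learned initial memory tokens (hypothetical sizes for the sketch)
        self.memory = nn.Parameter(torch.randn(num_mem_tokens, dim) * 0.02)
        layer = nn.TransformerEncoderLayer(d_model=dim, nhead=num_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=num_layers)
        self.num_mem_tokens = num_mem_tokens

    def forward(self, segments):
        # segments: list of tensors, each of shape (batch, seg_len, dim)
        batch = segments[0].shape[0]
        mem = self.memory.unsqueeze(0).expand(batch, -1, -1)  # initial memory state
        outputs = []
        for seg in segments:
            x = torch.cat([mem, seg], dim=1)       # prepend memory tokens to the segment
            y = self.encoder(x)                    # attention only within [memory | segment]
            mem = y[:, :self.num_mem_tokens]       # updated memory carried to the next segment
            outputs.append(y[:, self.num_mem_tokens:])
        return torch.cat(outputs, dim=1), mem

# Usage: a 1,024-token sequence processed as 8 segments of 128 tokens each.
model = RMTSketch()
segments = [torch.randn(2, 128, 64) for _ in range(8)]
out, final_mem = model(segments)
print(out.shape, final_mem.shape)  # torch.Size([2, 1024, 64]) torch.Size([2, 4, 64])
```

Attention stays local to one segment plus a handful of memory tokens, which is what keeps the cost manageable as the total sequence length grows.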
Read More
AI
Deep Learning
Deep Learning New Frontiers
- Frank
- March 29, 2020
MIT Introduction to Deep Learning 6.S191: Lecture 6 with Ava Soleimany. Subscribe to stay up to date with new deep learning lectures at MIT, or follow us @MITDeepLearning on Twitter and Instagram to stay fully-connected!!
Lecture Outline
- 0:00 – Introduction
- 0:58 – Course logistics
- 3:59 – Upcoming guest lectures
- 5:35 – Deep learning and expressivity
[…]
Read More