
Generative AI
TransGAN: Two Transformers Can Make One Strong GAN
- Frank
- February 19, 2021
- AI
- artificial intelligence
- Arxiv
- attention is all you need
- attention mechanism
- attention neural networks
- Deep Learning
- deep learning explained
- generative adversarial network
- local attention
- Machine Learning
- machine learning explained
- multihead attention
- Neural Networks
- paper explained
- pixelshuffle
- self attention
- superresolution
- transformer gan
- transformer gans
- transformer generative adversarial network
- transformer generator
- transgan
- vision transformer
Generative Adversarial Networks (GANs) hold the state of the art in image generation. However, while the rest of computer vision is slowly being taken over by transformers and other attention-based architectures, all working GANs to date contain some form of convolutional layer. This paper changes that and builds TransGAN, the first GAN where both the generator […]
AI
Machine Learning
NFNets: High-Performance Large-Scale Image Recognition Without Normalization
- Frank
- February 15, 2021
- AI
- artificial intelligence
- Arxiv
- batch norm
- batch normalization
- batchnorm
- best imagenet model
- best neural network
- Deep Learning
- deep learning code
- deep learning tutorial
- Deep Mind
- DeepMind
- distributed training
- explained
- gradient clipping
- imagenet
- JAX
- layer normalization
- Machine Learning
- machine learning explained
- machine learning tutorial
- mean shift
- ml code
- Neural Networks
- nfnet
- nfnets
- nfnets code
- nfresnet
- normalizer-free
- Paper
- weight standardization
Yannic Kilcher explains NFNets in this video.