
Generative AI
TransGAN: Two Transformers Can Make One Strong GAN
- Frank
- February 19, 2021
- AI
- artificial intelligence
- Arxiv
- attention is all you need
- attention mechanism
- attention neural networks
- Deep Learning
- deep learning explained
- generative adversarial network
- local attention
- Machine Learning
- machine learning explained
- multihead attention
- Neural Networks
- paper explained
- pixelshuffle
- self attention
- superresolution
- transformer gan
- transformer gans
- transformer generative adversarial network
- transformer generator
- transgan
- vision transformer
Generative Adversarial Networks (GANs) hold the state of the art in image generation. However, while the rest of computer vision is slowly being taken over by transformers and other attention-based architectures, all working GANs to date contain some form of convolutional layer. This paper changes that and builds TransGAN, the first GAN where both the generator […]
Read More