
AI
Research
Explaining the Paper: Hopfield Networks is All You Need
- August 10, 2020
- AI
- artificial intelligence
- Arxiv
- attention
- attention is all you need
- Bert
- binary
- continuous
- Deep Learning
- energy function
- error
- explained
- exponential
- gru
- hochreiter
- hopfield
- hopfield network
- key
- lse
- LSTM
- Machine Learning
- metastable
- Neural Networks
- Paper
- pattern
- query
- retrieval
- RNN
- routing
- Schmidhuber
- separation
- store
- transformer
- update rule
- value
Yannic Kilcher explains the paper “Hopfield Networks is All You Need.” Hopfield Networks are one of the classic models of biological memory networks. This paper generalizes modern Hopfield Networks to continuous states and shows that the corresponding update rule is equivalent to the attention mechanism used in modern Transformers. It further analyzes a pre-trained BERT […]
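The claimed equivalence can be sketched in a few lines of NumPy: the continuous Hopfield update retrieves a pattern as `softmax(beta * X @ xi)` weights over the stored patterns `X`, which is exactly attention with query `xi` and keys/values `X`. The shapes and `beta` below are illustrative assumptions, not values from the paper.

```python
import numpy as np

def softmax(z):
    z = z - z.max()  # numerical stability
    e = np.exp(z)
    return e / e.sum()

rng = np.random.default_rng(0)
N, d = 5, 8                        # N stored patterns of dimension d (arbitrary)
X = rng.normal(size=(N, d))        # rows are stored patterns (keys = values)
xi = rng.normal(size=d)            # state / query pattern
beta = 1.0                         # inverse temperature (illustrative)

# Modern Hopfield update rule: xi_new = X^T softmax(beta * X xi)
xi_new = X.T @ softmax(beta * (X @ xi))

# Same computation phrased as Transformer attention with Q = xi, K = V = X
attn_weights = softmax(beta * (X @ xi))
attention_out = attn_weights @ X

assert np.allclose(xi_new, attention_out)  # identical result
```

With large `beta` the softmax sharpens and the update snaps to a single stored pattern (classic retrieval); smaller `beta` yields the soft mixtures (metastable states) the paper discusses.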
Read More
AI
Machine Learning
Object-Centric Learning with Slot Attention
- July 1, 2020
- AI
- artificial intelligence
- Arxiv
- attention
- attention mechanism
- capsules
- clevr
- CNN
- Convolutional Neural Network
- Deep Learning
- detr
- disentanglement
- embeddings
- encoder
- ethz
- explained
- gru
- LSTM
- Machine Learning
- Neural Networks
- objects
- Paper
- permutation invariant
- render
- routing
- set
- slots
- tetris
- transformer
- vision
- weight sharing
Visual scenes are often composed of sets of independent objects. Yet current vision models make no assumptions about the nature of the pictures they look at. Yannic Kilcher explores a paper on object-centric learning. By imposing an objectness prior, this paper introduces a module that is able to recognize permutation-invariant sets of objects from pixels in […]
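The core mechanism can be sketched as follows: slots attend to input features, but the softmax is taken over the slot axis, so slots compete for inputs rather than inputs competing for attention. This is a minimal NumPy sketch under assumed shapes; the actual model adds learned projections, a GRU update, and layer norms.

```python
import numpy as np

def softmax(z, axis):
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

rng = np.random.default_rng(0)
n_inputs, n_slots, d = 6, 3, 4
inputs = rng.normal(size=(n_inputs, d))  # encoded image features (e.g. CNN output)
slots = rng.normal(size=(n_slots, d))    # slots start as random samples

for _ in range(3):  # iterative refinement
    # attention logits between every slot (query) and input feature (key)
    logits = slots @ inputs.T / np.sqrt(d)          # (n_slots, n_inputs)
    # softmax over SLOTS: each input is divided among competing slots
    attn = softmax(logits, axis=0)
    # weighted mean of inputs assigned to each slot
    slots = (attn / attn.sum(axis=1, keepdims=True)) @ inputs
    # (the real model feeds this update through a GRU instead of assignment)
```

Because the slots are initialized from random samples and updated symmetrically, no slot is tied to a fixed object identity, which is what makes the resulting set of object representations permutation invariant.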
Read More