Modern Hopfield Networks

Activity: Talk or presentation, Invited talk, science-to-science

Description

We propose a new paradigm for deep learning by equipping each layer of a deep learning architecture with modern Hopfield networks. The new paradigm is a powerful concept comprising functionalities such as pooling, memory, and attention for each layer. Associative memories date back to the 1960s and 1970s and became popular through Hopfield networks in 1982. Recently, we have seen a renaissance of Hopfield networks, the modern Hopfield networks, with tremendously increased storage capacity and extremely fast convergence. We generalize modern Hopfield networks with exponential storage capacity to continuous patterns. Their update rule ensures global convergence to local energy minima, and they converge in one update step with exponentially low error. Surprisingly, the transformer attention mechanism is equal to the update rule of our new modern Hopfield network with continuous states. The new modern Hopfield network can be integrated into deep learning architectures as a layer to allow the storage of and access to raw input data, intermediate results, or learned prototypes. These Hopfield layers enable new ways of deep learning beyond fully connected, convolutional, or recurrent networks, and provide pooling, memory, association, and attention mechanisms. We demonstrate the broad applicability of Hopfield layers across various domains. Hopfield layers improved the state of the art on three out of four considered multiple instance learning problems as well as on immune repertoire classification with several hundred thousand instances. On the UCI benchmark collections of small classification tasks, where deep learning methods typically struggle, Hopfield layers yielded a new state of the art when compared to different machine learning methods. Finally, Hopfield layers achieved state-of-the-art results on two drug design datasets.
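The update rule mentioned in the abstract can be illustrated with a minimal NumPy sketch: a query pattern is updated as xi_new = X softmax(beta * X^T xi), where the columns of X are the stored patterns. This is the same softmax-weighted combination that appears in transformer attention. The function names, the dimensions, and the choice of beta below are illustrative assumptions, not code from the talk.

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax over a 1-D score vector.
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def hopfield_update(X, xi, beta=1.0):
    """One update step of a continuous modern Hopfield network.

    X    : (d, N) matrix whose columns are the stored patterns.
    xi   : (d,) query (state) vector.
    beta : inverse temperature; larger values sharpen retrieval.
    Returns X @ softmax(beta * X^T xi), the attention-like retrieval.
    """
    return X @ softmax(beta * (X.T @ xi))

# Store three random patterns as columns; query with a noisy copy of pattern 0.
rng = np.random.default_rng(0)
X = rng.standard_normal((16, 3))
xi = X[:, 0] + 0.1 * rng.standard_normal(16)
retrieved = hopfield_update(X, xi, beta=8.0)
```

With a sufficiently large beta, the softmax concentrates on the best-matching stored pattern, so a single update step retrieves (approximately) pattern 0, which mirrors the one-step convergence with exponentially low error claimed in the abstract.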
Period: 30 Jul 2021
Event title: International Conference on Machine Vision and Machine Learning (MVML21)
Event type: Conference
Location: Austria

Fields of science

  • 101031 Approximation theory
  • 102 Computer Sciences
  • 305901 Computer-aided diagnosis and therapy
  • 102033 Data mining
  • 102032 Computational intelligence
  • 101029 Mathematical statistics
  • 102013 Human-computer interaction
  • 305905 Medical informatics
  • 101028 Mathematical modelling
  • 101027 Dynamical systems
  • 101004 Biomathematics
  • 101026 Time series analysis
  • 202017 Embedded systems
  • 101024 Probability theory
  • 305907 Medical statistics
  • 102019 Machine learning
  • 202037 Signal processing
  • 102018 Artificial neural networks
  • 103029 Statistical physics
  • 202036 Sensor systems
  • 202035 Robotics
  • 106005 Bioinformatics
  • 106007 Biostatistics
  • 101019 Stochastics
  • 101018 Statistics
  • 101017 Game theory
  • 101016 Optimisation
  • 102001 Artificial intelligence
  • 101015 Operations research
  • 102004 Bioinformatics
  • 101014 Numerical mathematics
  • 102003 Image processing

JKU Focus areas

  • Digital Transformation