Multiplicative couplings facilitate rapid learning and information gating in recurrent neural networks


Abstract

The mammalian forebrain is the seat of higher cognition and has architectural parallels to modern machine learning systems: the cortex resembles recurrent neural networks (RNNs), while the thalamus resembles feedforward neural networks (FNNs). How these architectural features endow the forebrain with its learning capacity is unknown. Here, inspired by empirical thalamocortical findings, we develop a multiplicative coupling mechanism between RNN and FNN architectures that combines their computational strengths and enhances learning. The multiplicative interaction imposes a Hebbian weight amplification on synaptic-neuronal coupling, enabling context-dependent gating and rapid switching. We demonstrate that multiplicative feedback-driven synaptic plasticity yields 2- to 100-fold speedups in supervised, reinforcement, and unsupervised learning settings, boosting the memory capacity, robustness, and generalization of RNNs. We further demonstrate the efficacy and biological plausibility of multiplicative gating in modeling multiregional circuits, including a prefrontal cortex-mediodorsal thalamus network for context-dependent decision making, a cortico-thalamo-cortical network for working memory and attention, and an entorhinal cortex-hippocampus network for visuospatial navigation and sequence replay. Taken together, our results show how neuroscience-inspired multiplicative couplings enable multi-plastic attractor dynamics and computation in recurrent neural circuits.
