Content-addressable memory
Associative property
Associative learning
Cognitive science
Neuroscience
Psychology
Computer science
Artificial intelligence
Cognitive psychology
Artificial neural network
Mathematics
Pure mathematics
Authors
Dmitry Krotov, J. J. Hopfield
Source
Journal: Cornell University - arXiv
Date: 2020-08-16
Citations: 55
Identifier
DOI: 10.48550/arXiv.2008.06996
Abstract
Dense Associative Memories or modern Hopfield networks permit storage and reliable retrieval of an exponentially large (in the dimension of feature space) number of memories. At the same time, their naive implementation is non-biological, since it seemingly requires the existence of many-body synaptic junctions between the neurons. We show that these models are effective descriptions of a more microscopic (written in terms of biological degrees of freedom) theory that has additional (hidden) neurons and only requires two-body interactions between them. For this reason our proposed microscopic theory is a valid model of large associative memory with a degree of biological plausibility. The dynamics of our network and its reduced dimensional equivalent both minimize energy (Lyapunov) functions. When certain dynamical variables (hidden neurons) are integrated out from our microscopic theory, one can recover many of the models that were previously discussed in the literature, e.g. the model presented in "Hopfield Networks is All You Need" paper. We also provide an alternative derivation of the energy function and the update rule proposed in the aforementioned paper and clarify the relationships between various models of this class.
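The abstract states that, once the hidden neurons are integrated out, the reduced dynamics coincide with models such as the one in the "Hopfield Networks is All You Need" paper, whose retrieval step replaces the state vector with a softmax-weighted combination of the stored patterns. The following minimal sketch (not the authors' code; the function names, the inverse temperature beta, and the synthetic patterns are illustrative assumptions) shows that softmax-style retrieval dynamics in plain NumPy.

# Minimal sketch of softmax-based associative retrieval, as described in the
# models this paper relates to. All names and parameter values are illustrative.
import numpy as np

def softmax(x):
    x = x - x.max()          # subtract max for numerical stability
    e = np.exp(x)
    return e / e.sum()

def retrieve(patterns, query, beta=8.0, steps=10):
    # patterns: (num_memories, dim) array of stored memories
    # query: (dim,) noisy probe; returns the retrieved memory estimate
    v = query.copy()
    for _ in range(steps):
        # similarity of the current state to every stored pattern
        scores = beta * patterns @ v
        # update: convex combination of memories weighted by softmax(scores)
        v = softmax(scores) @ patterns
    return v

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    memories = np.sign(rng.standard_normal((16, 64)))    # 16 binary patterns
    probe = memories[3] + 0.5 * rng.standard_normal(64)  # corrupted version
    out = retrieve(memories, probe)
    print("overlap with target:", np.dot(out, memories[3]) / 64)

Under a sufficiently large beta, each update sharpens the weighting toward the single closest stored pattern, which is one way to picture the exponential storage capacity and the energy-decreasing (Lyapunov) dynamics discussed in the abstract.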