Artificial neural network
Nonlinear system
Computer science
Hierarchy
Models of neural computation
Series (stratigraphy)
Feedforward
Synapse
Synaptic weight
Artificial intelligence
Computation
Volterra series
Algorithm
Neuroscience
Physics
Engineering
Control engineering
Paleontology
Biology
Quantum mechanics
Economics
Market economy
Authors
Wolfgang Maass,Eduardo D. Sontag
Identifier
DOI: 10.1162/089976600300015123
Abstract
Experimental data show that biological synapses behave quite differently from the symbolic synapses in all common artificial neural network models. Biological synapses are dynamic; their "weight" changes on a short timescale by several hundred percent, depending on the past input to the synapse. In this article we address the question of how this inherent synaptic dynamics (which should not be confused with long-term learning) affects the computational power of a neural network. In particular, we analyze computations on temporal and spatiotemporal patterns, and we give a complete mathematical characterization of all filters that can be approximated by feedforward neural networks with dynamic synapses. It turns out that even with just a single hidden layer, such networks can approximate a very rich class of nonlinear filters: all filters that can be characterized by Volterra series. This result is robust with regard to various changes in the model for synaptic dynamics. Our characterization result provides, for all nonlinear filters that are approximable by Volterra series, a new complexity hierarchy related to the cost of implementing such filters in neural systems.
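The two ingredients of the abstract, Volterra series filters and synapses whose effective "weight" depends on recent input, can be illustrated with a small numerical sketch. The Python code below is a hypothetical illustration and not the authors' construction: the functions volterra_filter and dynamic_synapse_output, the kernel values h1 and h2, the resource-depression rule, and all time constants are assumptions chosen here purely for demonstration.

# Minimal sketch (assumptions only, not the paper's model): a discrete-time,
# second-order truncated Volterra filter, and a toy "dynamic synapse" whose
# effective weight depends on the recent history of its input.
import numpy as np

def volterra_filter(x, h1, h2):
    """Truncated Volterra series:
    y[t] = sum_i h1[i] x[t-i] + sum_{i,j} h2[i,j] x[t-i] x[t-j]."""
    m = len(h1)
    y = np.zeros(len(x))
    for t in range(len(x)):
        # Window of past inputs x[t], x[t-1], ..., x[t-m+1], zero-padded at the start.
        window = np.array([x[t - i] if t - i >= 0 else 0.0 for i in range(m)])
        y[t] = h1 @ window + window @ h2 @ window
    return y

def dynamic_synapse_output(x, w_max=1.0, tau=5.0, use_fraction=0.3):
    """Toy short-term depression: a resource variable r scales the weight,
    is consumed by each input, and recovers with time constant tau, so the
    effective weight w_max * r depends on past input to the synapse."""
    r = 1.0                          # available synaptic resources (fraction)
    out = np.zeros(len(x))
    for t, xt in enumerate(x):
        out[t] = w_max * r * xt      # effective weight = w_max * r
        r -= use_fraction * r * xt   # each input consumes resources
        r += (1.0 - r) / tau         # resources recover toward 1
    return out

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = rng.random(50)
    h1 = np.array([0.5, 0.3, 0.1])      # hypothetical first-order kernel
    h2 = 0.05 * np.ones((3, 3))         # hypothetical second-order kernel
    print(volterra_filter(x, h1, h2)[:5])
    print(dynamic_synapse_output(x)[:5])

Under these toy assumptions, both functions map an input sequence to an output sequence whose value at time t depends nonlinearly on the recent input history, which is the kind of filtering behavior the paper's characterization result addresses.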