Computer science
Artificial neural network
Artificial intelligence
Turing
Computation
Noise (video)
Recurrent neural network
Authors
Emmett Redd,Tayo Obafemi-Ajayi
Identifier
DOI:10.1007/978-3-030-86380-7_38
Abstract
Noise and stochasticity can be beneficial to the performance of neural networks. Recent studies show that optimized-magnitude, noise-enhanced digital recurrent neural networks are consistent with super-Turing operation. This occurred regardless of whether the noise was implemented with true random or sufficiently long pseudo-random number time series. This paper extends prior work by providing additional insight into the degrading effect of shortened, repeating pseudo-noise sequences on super-Turing operation. Shortening the repeat length of the noise resulted in fewer chaotic time series, as measured by autocorrelation-detected repetitions in the output. Similar rates of chaos inhibition across shortened noise repeat lengths hint at an unknown, underlying commonality in noise-induced chaos among different maps, noise magnitudes, and pseudo-noise functions. Repeat lengths in the chaos-failed outputs were predominantly integer multiples of the noise repeat lengths. Noise repeat lengths only marginally shorter than the output sequences cause the noise-enhanced digital recurrent neural networks to repeat and, thereby, fail to be consistent with chaos and super-Turing computation. This implies that noise sequences used to improve neural network operation should be at least as long as any sequence the network produces.
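The failure mode described above can be illustrated with a toy model that is not the paper's setup: a logistic map driven by additive pseudo-noise that repeats with period L, computed at finite precision to mimic a digital network. Because the quantized state together with the noise phase forms a finite deterministic system, the output must eventually repeat, and a repeat detected at a matching noise phase has a length that is an integer multiple of L, mirroring the observation about chaos-failed outputs. All parameter values (map parameter, precision, noise values) are hypothetical.

```python
def repeat_length(noise, precision=3, x0=0.123, r=3.9, max_steps=200_000):
    """Iterate x -> r*x*(1-x) + noise[t % L] at fixed precision until the
    (state, noise-phase) pair recurs; return the cycle length, i.e. the
    repeat length of the output sequence."""
    L = len(noise)
    x = round(x0, precision)
    seen = {}                      # (x, phase) -> first time step observed
    for t in range(max_steps):
        key = (x, t % L)
        if key in seen:
            return t - seen[key]   # phase matches, so L divides this length
        seen[key] = t
        x = r * x * (1.0 - x) + noise[t % L]
        # Clip to [0, 1] and quantize to mimic finite digital precision.
        x = round(min(max(x, 0.0), 1.0), precision)
    return None

# A short repeating pseudo-noise sequence (period L = 5, hypothetical values).
noise = [0.0013, -0.0021, 0.0007, -0.0016, 0.0025]
cycle = repeat_length(noise)
print(cycle)
```

With only about 1001 quantized states times 5 noise phases, a recurrence is guaranteed well within the step budget; lengthening `noise` or raising `precision` delays the repeat, which is the sketch's analogue of the paper's recommendation that noise sequences be at least as long as any output sequence.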