Recent experiments have revealed a hierarchy of time scales in the visual cortex, where different stages of the visual system process information at different time scales. Recurrent neural networks are ideal models for gaining insight into how information is processed by such a hierarchy of time scales, and they have become widely used to model temporal dynamics in both machine learning and computational neuroscience. However, when such models are derived as discrete-time approximations of the firing rate of a population of neurons, the time constants of the neuronal processes are generally ignored. Learning these time constants could inform us about the time scales underlying temporal processes in the brain and enhance the expressive capacity of the network. To investigate the potential of adaptive time constants, we compare the standard approximation to a more lenient one that accounts for the time scales at which processes unfold. We show that such a model performs better at predicting simulated neural data and allows recovery of the time scales at which the underlying processes unfold. A hierarchy of time scales emerges when the model adapts to data with multiple underlying time scales, underscoring the importance of such a hierarchy for processing complex temporal information.
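The distinction the abstract draws between the standard and the "more lenient" approximation can be made concrete with a small sketch; the notation below is my own generic choice (r for the population rate, W and W_in for recurrent and input weights, tau for the time constant), not necessarily the paper's. A continuous-time firing-rate model

    \tau \frac{d\mathbf{r}}{dt} = -\mathbf{r} + f\!\left(W \mathbf{r} + W_{\mathrm{in}} \mathbf{x}\right)

discretized with the Euler method using step \Delta t and \alpha = \Delta t / \tau gives the leaky update

    \mathbf{r}_{t+1} = (1 - \alpha)\,\mathbf{r}_t + \alpha\, f\!\left(W \mathbf{r}_t + W_{\mathrm{in}} \mathbf{x}_t\right).

The standard RNN update corresponds to the implicit choice \alpha = 1 (i.e. \Delta t = \tau), which makes the time constant drop out of the dynamics; keeping \tau (equivalently \alpha), possibly per unit, as a learnable parameter yields a leaky update in which the time scales of the underlying processes can be recovered from data.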

