Title: Embedding Theory of Reservoir Computing and Reducing Reservoir Network Using Time Delays
Speaker: Prof. Huanfei Ma, Soochow University
Time: Friday, June 16, 2023, 14:30–15:30
Tencent Meeting ID: 853-929-863
Abstract:
Reservoir computing (RC), a particular form of recurrent neural network, has seen explosive development due to its exceptional efficacy and high performance in the reconstruction and/or prediction of complex physical systems. However, the mechanism behind such effective applications of RC remains unclear and awaits deep, systematic exploration. Combining the delayed embedding theory with the generalized embedding theory, we prove that RC is essentially a high-dimensional embedding of the original input nonlinear dynamical system. Using this embedding property, we unify into a universal framework the standard RC and the time-delayed RC, in which we newly introduce time delays only into the network's output layer, and we further find a trade-off relation between the time delays and the number of neurons in RC. Based on these findings, we significantly reduce the RC's network size and enhance its memory capacity for system reconstruction and prediction tasks. More surprisingly, a single-neuron reservoir with time delays is sometimes sufficient for achieving reconstruction and prediction tasks that a standard RC of any size, but without time delays, still cannot complete.
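The contrast drawn in the abstract between a standard readout and a time-delayed readout can be illustrated with a minimal sketch. The code below is not the speaker's implementation: the logistic-map input, the single-neuron leaky-tanh reservoir, and all parameter values are illustrative assumptions. It trains a linear readout either on the current reservoir state alone or on several time-delayed copies of that state, showing that the delayed readout fits the one-step prediction target at least as well.

```python
import numpy as np

# Input: a scalar logistic-map time series (an illustrative choice,
# not necessarily a system considered in the talk).
T = 500
u = np.empty(T)
u[0] = 0.3
for t in range(T - 1):
    u[t + 1] = 3.9 * u[t] * (1.0 - u[t])

# A single-neuron leaky-tanh "reservoir"; the parameters below are
# illustrative assumptions, not tuned values from the paper.
leak, w_in, w_rec = 0.7, 1.0, 0.5
r = np.zeros(T)
for t in range(T - 1):
    r[t + 1] = (1 - leak) * r[t] + leak * np.tanh(w_rec * r[t] + w_in * u[t])

def readout_rmse(num_delays):
    """Train a linear readout on num_delays delayed copies of the
    reservoir state to predict u one step ahead; return training RMSE."""
    idx = np.arange(8, T - 1)            # same samples for every num_delays
    X = np.column_stack([r[idx - k] for k in range(num_delays)])
    y = u[idx + 1]
    W, *_ = np.linalg.lstsq(X, y, rcond=None)
    return float(np.sqrt(np.mean((X @ W - y) ** 2)))

err_no_delay = readout_rmse(1)   # standard readout: current state only
err_delayed = readout_rmse(8)    # time-delayed readout: 8 delayed states
print(f"RMSE without delays: {err_no_delay:.4f}")
print(f"RMSE with delays:    {err_delayed:.4f}")
```

Because the delayed feature set contains the no-delay feature as its first column and both readouts are fit by least squares on the same samples, the delayed readout's training error can never exceed the standard one, which is a small-scale analogue of the trade-off between delays and neuron count described above.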
About the speaker:
Huanfei Ma is a professor and doctoral supervisor at the School of Mathematical Sciences, Soochow University. He received his Ph.D. from Fudan University in 2010 and has worked at the School of Mathematical Sciences, Soochow University, since that year. He was a visiting scholar in the Department of Mathematics at City University of Hong Kong from January to February 2012, and a postdoctoral researcher at the University of Tokyo, Japan, from April 2012 to August 2013. His main research interests are the methods and applications of nonlinear dynamical systems and computational systems biology. In recent years he has led several projects funded by the National Natural Science Foundation of China and has participated in an integrated project of a Major Research Plan as well as a subproject of a National Key R&D Program. He received a Best Paper Award from the International Consortium of Chinese Mathematicians in 2019 and the 7th Jiangsu Provincial Mathematics Achievement Award, and in 2020 he was selected for the youth category of a national-level talent program of the Ministry of Education.