The Curse of Memory in Recurrent Neural Networks
Organizer
Speaker
孙泽钜
Time
December 27, 2022, 21:00–21:30
Venue
Online
Abstract
In this talk, we will discuss the paper "On the Curse of Memory in Recurrent Neural Networks: Approximation and Optimization Analysis" by Z. Li et al. The authors consider the setting of using continuous-time linear RNNs to learn from data generated by linear functional relationships. The paper shows that when the target relationship has long-term memory, a large number of neurons is required to approximate it well; this phenomenon is called the "curse of memory". The analysis serves as a basic step towards a concrete mathematical understanding of deep learning with recurrent architectures.
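As a rough sketch of the setting (the notation below is an illustrative assumption, not quoted verbatim from the paper): the target output is a linear functional of the input history, defined through a memory kernel \(\rho\), and it is approximated by a continuous-time linear RNN with hidden state \(h_t \in \mathbb{R}^m\):
\[
y_t = \int_0^{\infty} \rho(s)^\top x_{t-s}\, ds,
\qquad
\frac{d h_t}{dt} = W h_t + U x_t,
\qquad
\hat{y}_t = c^\top h_t .
\]
In this picture, the "curse of memory" refers to the observation that when the kernel \(\rho\) decays slowly (long-term memory in the target), the hidden dimension \(m\) needed to achieve a given approximation accuracy can grow rapidly.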