Today we conclude our ICLR 2021 coverage joined by Konstantin Rusch, a PhD student at ETH Zurich.
In our conversation with Konstantin, we explore his recent papers on coRNN and UnICORNN, which introduce novel recurrent neural network architectures for learning long-time dependencies. We discuss the inspiration he drew from neuroscience in tackling this problem, how the architectures' performance compares to LSTMs and other networks proven effective on this task, and Konstantin's future research goals.