Recurrent Neural Networks: Recent Advances and the Future
January 13, 2017, 11 AM

302-309-1

 

Abstract:

 

Although the recent resurgence of Recurrent Neural Networks (RNNs) has brought remarkable advances in various sequence modeling problems, RNNs still lack some key abilities needed to model more challenging yet important natural phenomena. In this talk, as a step toward such advances, I introduce a new RNN architecture called the Hierarchical Multiscale Recurrent Neural Network (HM-RNN). In an HM-RNN, each layer of a multi-layered RNN learns a different time-scale, adapting to the inputs provided by the layer below. I argue for the advantages of HM-RNNs in terms of computational efficiency, capturing longer-term dependencies, and discovering latent structures inherent in sequences. I conclude the talk with a discussion of the key challenges that lie ahead.
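
To make the multiscale idea concrete, below is a minimal Python sketch of the core mechanism: an upper recurrent layer that updates only when the lower layer signals a segment boundary, so each layer effectively ticks at its own time-scale. All names, the scalar boundary detector w_z, and the hard-threshold heuristic are illustrative assumptions, not the published HM-RNN's exact equations.

# Minimal sketch (assumed, not the authors' implementation) of a
# two-layer multiscale RNN with boundary-gated upper-layer updates.
import numpy as np

rng = np.random.default_rng(0)

def rnn_step(W, U, b, h, x):
    """One plain tanh-RNN update: h' = tanh(W x + U h + b)."""
    return np.tanh(W @ x + U @ h + b)

input_dim, hidden_dim, T = 8, 16, 20

# Lower-layer parameters, plus a scalar boundary detector on its state.
W1 = rng.normal(0, 0.1, (hidden_dim, input_dim))
U1 = rng.normal(0, 0.1, (hidden_dim, hidden_dim))
b1 = np.zeros(hidden_dim)
w_z = rng.normal(0, 0.1, hidden_dim)  # boundary detector weights (assumed)

# Upper-layer parameters; it consumes the lower layer's state as input.
W2 = rng.normal(0, 0.1, (hidden_dim, hidden_dim))
U2 = rng.normal(0, 0.1, (hidden_dim, hidden_dim))
b2 = np.zeros(hidden_dim)

h1 = np.zeros(hidden_dim)
h2 = np.zeros(hidden_dim)

xs = rng.normal(size=(T, input_dim))
for t, x in enumerate(xs):
    h1 = rnn_step(W1, U1, b1, h1, x)
    z = float(1 / (1 + np.exp(-w_z @ h1)) > 0.5)  # hard boundary decision
    if z == 1.0:
        # UPDATE: the upper layer ticks only at detected boundaries,
        # so it naturally operates at a coarser time-scale.
        h2 = rnn_step(W2, U2, b2, h2, h1)
        # FLUSH: the lower layer restarts to begin a new segment.
        h1 = np.zeros(hidden_dim)
    # else COPY: h2 carries over unchanged at this step.
    print(f"t={t:2d} boundary={int(z)}")

Because the upper layer performs work only at boundary steps, computation is saved on the remaining steps, and its shorter effective path through time helps it carry longer-term dependencies; the learned boundary positions themselves expose latent structure in the sequence.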