Skip RNN: learning to skip state updates in recurrent neural networks

Author
Campos, V.; Jou, B.; Giro, X.; Torres, J.; Chang, S.
Activity type
Conference paper presentation
Edition name
6th International Conference on Learning Representations
Edition year
2018
Presentation date
2018-05-03
Proceedings
Sixth International Conference on Learning Representations: Monday April 30-Thursday May 03, 2018, Vancouver Convention Center, Vancouver: [proceedings]
First page
1
Last page
17
Funding project
Computación de altas prestaciones VII
Procesado de señales multimodales y aprendizaje automático en grafos.
Repository
http://hdl.handle.net/2117/118098
URL
https://iclr.cc/Conferences/2018/Schedule?type=Poster
Abstract
Recurrent Neural Networks (RNNs) continue to show outstanding performance in sequence modeling tasks. However, training RNNs on long sequences often faces challenges such as slow inference, vanishing gradients and difficulty in capturing long-term dependencies. In backpropagation through time settings, these issues are tightly coupled with the large, sequential computational graph resulting from unfolding the RNN in time. We introduce the Skip RNN model, which extends existing RNN models by learning to skip state updates, thereby shortening the effective size of the computational graph.
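The abstract describes a conditional state-update mechanism: at each timestep a learned binary gate decides whether the RNN updates its state or simply copies the previous one. Below is a minimal PyTorch sketch of that idea, assuming a GRU base cell and a straight-through gradient estimator for the binary gate; the names (SkipGRUCell, BinaryGate, update_prob) are illustrative and not taken from the authors' released code.

```python
# Minimal sketch of a skip-update RNN cell. Assumption: GRU base cell;
# names here are illustrative, not the paper's reference implementation.
import torch
import torch.nn as nn


class BinaryGate(torch.autograd.Function):
    """Round the update probability to {0, 1}; pass gradients straight through."""

    @staticmethod
    def forward(ctx, p):
        return (p > 0.5).float()

    @staticmethod
    def backward(ctx, grad_output):
        return grad_output  # straight-through estimator


class SkipGRUCell(nn.Module):
    def __init__(self, input_size, hidden_size):
        super().__init__()
        self.cell = nn.GRUCell(input_size, hidden_size)
        self.update_prob = nn.Linear(hidden_size, 1)  # emits the update-probability increment

    def forward(self, x_t, h_prev, u_prev):
        u_t = BinaryGate.apply(u_prev)                 # binary gate in {0, 1}
        h_new = self.cell(x_t, h_prev)                 # candidate state update
        h_t = u_t * h_new + (1.0 - u_t) * h_prev       # skip: copy previous state when gate is 0
        delta = torch.sigmoid(self.update_prob(h_t))
        # If the state was updated, restart the accumulated probability;
        # otherwise keep accumulating it until it crosses the threshold.
        u_next = u_t * delta + (1.0 - u_t) * (u_prev + torch.minimum(delta, 1.0 - u_prev))
        return h_t, u_next


# Usage: unroll over a sequence, starting with u_0 = 1 so the first step updates.
cell = SkipGRUCell(input_size=8, hidden_size=16)
x = torch.randn(5, 3, 8)        # (time, batch, features)
h = torch.zeros(3, 16)
u = torch.ones(3, 1)
for t in range(x.size(0)):
    h, u = cell(x[t], h, u)
```

Because the copied steps contribute no new computation, the effective computational graph through time becomes shorter whenever the gate stays at zero, which is the source of the inference savings the abstract mentions.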
Citation
Campos, V., Jou, B., Giro, X., Torres, J., Chang, S. Skip RNN: learning to skip state updates in recurrent neural networks. In: International Conference on Learning Representations. "Sixth International Conference on Learning Representations: Monday April 30-Thursday May 03, 2018, Vancouver Convention Center, Vancouver: [proceedings]". 2018, p. 1-17.
Keywords
conditional computation, dynamic learning, recurrent neural networks
Research group
CAP - Grup de Computació d'Altes Prestacions
GPI - Grup de Processament d'Imatge i Vídeo
IDEAI-UPC Intelligent Data Science and Artificial Intelligence
