Comparing fixed and adaptive computation time for recurrent neural networks

Author
Fojo, D.; Campos, V.; Giro, X.
Type of activity
Presentation of work at congresses
Name of edition
6th International Conference on Learning Representations
Date of publication
2018
Presentation date
2018-05-01
Book of congress proceedings
Sixth International Conference on Learning Representations: Monday April 30-Thursday May 03, 2018, Vancouver Convention Center, Vancouver: [proceedings]
First page
1
Last page
8
Project funding
High performance computing VII
Multimodal Signal Processing and Machine Learning on Graphs
Repository
http://hdl.handle.net/2117/118497
URL
https://iclr.cc/Conferences/2018/Schedule?type=Workshop
Abstract
Deep networks commonly perform better than shallow ones, but allocating the proper amount of computation for each particular input sample remains an open problem. This issue is particularly challenging in sequential tasks, where the required complexity may vary for different tokens in the input sequence. Adaptive Computation Time (ACT) was proposed as a method for dynamically adapting the computation at each step for Recurrent Neural Networks (RNN). ACT introduces two main modifications to the r...
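As a rough illustration of the halting mechanism summarized in the abstract, the sketch below follows the general ACT formulation (Graves, 2016): the recurrent cell ponders each input token for a variable number of internal updates and stops once the accumulated halting probability exceeds 1 - eps, with the final state taken as the halting-probability-weighted mean of the intermediate states. The names rnn_cell, halting_unit, eps, and max_ponder are illustrative placeholders, not code from the paper.

import numpy as np

def act_step(x, state, rnn_cell, halting_unit, eps=0.01, max_ponder=10):
    # One ACT input step: ponder with a variable number of internal updates,
    # halting once the accumulated probability exceeds 1 - eps.
    # `rnn_cell` and `halting_unit` are placeholder callables (assumptions).
    states, probs = [], []
    cumulative = 0.0
    for n in range(max_ponder):
        # Flag the first ponder step so the cell can tell it apart.
        first = 1.0 if n == 0 else 0.0
        state = rnn_cell(np.append(x, first), state)
        h = float(halting_unit(state))  # sigmoid output in (0, 1)
        if cumulative + h >= 1.0 - eps or n == max_ponder - 1:
            probs.append(1.0 - cumulative)  # remainder probability
            states.append(state)
            break
        probs.append(h)
        states.append(state)
        cumulative += h
    # Final state: halting-probability-weighted mean of the ponder states.
    return sum(p * s for p, s in zip(probs, states))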
Citation
Fojo, D., Campos, V., Giro, X. Comparing fixed and adaptive computation time for recurrent neural networks. A: International Conference on Learning Representations. "Sixth International Conference on Learning Representations: Monday April 30-Thursday May 03, 2018, Vancouver Convention Center, Vancouver: [proceedings]". 2018, p. 1-8.
Keywords
ACT, RNN, adaptive computation time, neural network, recurrent neural network, variable computation
Group of research
GPI - Image and Video Processing Group
IDEAI-UPC - Intelligent Data Science and Artificial Intelligence Research Center

Participants

  • Fojo, Daniel (author and speaker)
  • Campos Camúñez, Victor (author and speaker)
  • Giro Nieto, Xavier (author and speaker)