Weighted contrastive divergence

Author
Romero, E.; Mazzanti, F.; Delgado, J.; Buchaca, D.
Type of activity
Journal article
Journal
Neural networks
Date of publication
2019-06
Volume
114
First page
147
Last page
156
DOI
10.1016/j.neunet.2018.09.013
Project funding
Computational Intelligence for Knowledge Discovery from G Protein-Coupled Receptors
Management and Analysis of Complex DATA
Strongly and Weakly Interacting Ultracold Quantum Matter
Repository
http://hdl.handle.net/2117/133368
URL
https://www.sciencedirect.com/science/article/pii/S0893608018302752
Abstract
Learning algorithms for energy-based Boltzmann architectures that rely on gradient descent are in general computationally prohibitive, typically due to the exponential number of terms involved in computing the partition function. One therefore has to resort to approximation schemes for the evaluation of the gradient. This is the case of Restricted Boltzmann Machines (RBM) and their learning algorithm, Contrastive Divergence (CD). It is well known that CD has a number of shortcomings, and its appr...
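As background for the abstract, the standard CD learning rule it refers to can be sketched as follows. This is a minimal CD-1 update for a binary RBM in NumPy, plain Contrastive Divergence as commonly implemented; it is not the weighted variant the article proposes, and all names here (`cd1_update`, `W`, `b`, `c`) are illustrative, not taken from the paper.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd1_update(v0, W, b, c, lr=0.1, rng=None):
    """One CD-1 gradient step for a binary RBM.

    v0: (batch, n_visible) data batch; W: (n_visible, n_hidden) weights;
    b: visible bias; c: hidden bias. Returns the updated parameters.
    Sketch of standard CD only, not the paper's weighted CD.
    """
    rng = rng or np.random.default_rng(0)
    # positive phase: hidden probabilities and samples given the data
    ph0 = sigmoid(v0 @ W + c)
    h0 = (rng.random(ph0.shape) < ph0).astype(float)
    # one Gibbs step: reconstruct visibles, then recompute hidden probabilities
    pv1 = sigmoid(h0 @ W.T + b)
    v1 = (rng.random(pv1.shape) < pv1).astype(float)
    ph1 = sigmoid(v1 @ W + c)
    # CD-1 approximation of the log-likelihood gradient
    n = v0.shape[0]
    W += lr * (v0.T @ ph0 - v1.T @ ph1) / n
    b += lr * (v0 - v1).mean(axis=0)
    c += lr * (ph0 - ph1).mean(axis=0)
    return W, b, c
```

The single Gibbs step replaces the intractable expectation under the model distribution (the term involving the partition function) with a statistic of a one-step reconstruction, which is precisely the approximation whose shortcomings the abstract mentions.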
Citation
Romero, E. [et al.]. Weighted contrastive divergence. "Neural networks", June 2019, vol. 114, p. 147-156.
Keywords
Contrastive divergence, Neural networks, Restricted Boltzmann machine
Group of research
IDEAI-UPC - Intelligent Data Science and Artificial Intelligence Research Center
LARCA - Laboratory of Relational Algorithmics, Complexity and Learnability
SIMCON - First-principles approaches to condensed matter physics: quantum effects and complexity
SOCO - Soft Computing