Comparing error minimized extreme learning machines and support vector sequential feed-forward neural networks

Author
Romero, E.; Alquezar, R.
Activity type
Journal article
Journal
Neural networks
Publication date
2012-01
Volume
25
Issue
1
First page
122
Last page
129
DOI
https://doi.org/10.1016/j.neunet.2011.08.005
Repository
http://hdl.handle.net/2117/16872
URL
http://www.ncbi.nlm.nih.gov/pubmed/21959130
Abstract
Recently, error minimized extreme learning machines (EM-ELMs) have been proposed as a simple and efficient approach to build single-hidden-layer feed-forward networks (SLFNs) sequentially. They add random hidden nodes one by one (or group by group) and update the output weights incrementally to minimize the sum-of-squares error in the training set. Other very similar methods that also construct SLFNs sequentially had been reported earlier with the main difference that their hidden-layer weights ...
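To make the construction scheme described in the abstract concrete, the sketch below shows an EM-ELM-style loop in Python/NumPy: random hidden nodes are appended one at a time and the output weights are refit by least squares on the training set. This is only an illustrative sketch, not the authors' implementation; the function name and stopping criterion are assumptions, and the published EM-ELM updates the output weights incrementally rather than refitting from scratch at each step.

```python
import numpy as np

def em_elm_sketch(X, y, max_nodes=50, tol=1e-3, seed=None):
    """Illustrative EM-ELM-style construction of a single-hidden-layer
    feed-forward network (SLFN).

    Note: the published method updates the output weights incrementally
    (via a recursive pseudoinverse update); this sketch simply refits
    them from scratch after every added node, which gives the same
    least-squares solution but is slower.
    """
    rng = np.random.default_rng(seed)
    n_samples, n_features = X.shape
    W = np.empty((0, n_features))   # hidden-layer input weights (random, never retrained)
    b = np.empty(0)                 # hidden-layer biases (random, never retrained)
    beta = None                     # output weights (fitted by least squares)

    for _ in range(max_nodes):
        # Add one random hidden node.
        W = np.vstack([W, rng.standard_normal((1, n_features))])
        b = np.append(b, rng.standard_normal())

        # Hidden-layer output matrix H (sigmoid activation).
        H = 1.0 / (1.0 + np.exp(-(X @ W.T + b)))

        # Refit output weights to minimize the sum-of-squares training error.
        beta, *_ = np.linalg.lstsq(H, y, rcond=None)

        # Stop once the training error is small enough.
        if np.mean((H @ beta - y) ** 2) < tol:
            break

    return W, b, beta
```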
Citation
Romero, E.; Alquezar, R. Comparing error minimized extreme learning machines and support vector sequential feed-forward neural networks. "Neural networks", January 2012, vol. 25, no. 1, pp. 122-129.
Keywords
Error Minimized Extreme Learning Machines, Sequential Approximations, Support Vector Sequential Feed-forward Neural Networks
Research group
IDEAI-UPC Intelligent Data Science and Artificial Intelligence
SOCO - Soft Computing

Participants