Different ways of training NNs
So far, I have seen...
Genetic Algorithms
Backpropagation
What other techniques are there for training NNs? Any keywords I should be googling for?
Thanks
Well, all the backpropagation variants, like backprop + momentum, Quickprop...
And then maybe backpercolation, cascade correlation, SOM, LVQ, counterpropagation, Jordan/Elman nets, Hopfield nets, Boltzmann machines, ART nets, the neocognitron...
Maybe you should tell us what type of neural network you want to use, and for what purpose, etc.
[edited by - as31415rin on September 10, 2003 5:17:54 AM]
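For anyone wondering what "backprop + momentum" means in practice: the only change from plain gradient descent is that each weight keeps a velocity that accumulates past gradients, which damps oscillation in steep ravines of the error surface. A minimal sketch in Python (my own illustration, not from this thread; the toy loss ||w||² stands in for a real network's error, and `momentum_step` is a name I made up):

```python
import numpy as np

def momentum_step(w, grad, velocity, lr=0.1, mu=0.9):
    """One momentum update: the velocity blends the new gradient with a
    decaying sum of previous gradients, then the weights follow it."""
    velocity = mu * velocity - lr * grad
    return w + velocity, velocity

# Demo: minimize f(w) = ||w||^2, a stand-in for a network's loss surface.
w = np.array([3.0, -2.0])
v = np.zeros_like(w)
for _ in range(500):
    grad = 2 * w               # analytic gradient of ||w||^2
    w, v = momentum_step(w, grad, v)
```

In a real net, `grad` would come from backpropagation; the update rule itself is unchanged.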
Resilient Propagation, or RProp, is truly awesome...
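The idea behind Rprop, for the curious: each weight gets its own step size, grown when the gradient keeps the same sign and shrunk when it flips; the gradient's magnitude is never used, only its sign. A minimal sketch of the Rprop- variant in Python (my own illustration, not from this thread; `rprop_step` and the constants 1.2/0.5 follow the commonly cited defaults, and the toy loss ||w||² stands in for a network's error):

```python
import numpy as np

def rprop_step(w, grad, prev_grad, step,
               eta_plus=1.2, eta_minus=0.5, step_max=50.0, step_min=1e-6):
    """One Rprop- update: per-weight step sizes adapt from the sign of
    the gradient alone."""
    same = grad * prev_grad
    # Same sign as last time: grow the step. Sign flipped: shrink it.
    step = np.where(same > 0, np.minimum(step * eta_plus, step_max), step)
    step = np.where(same < 0, np.maximum(step * eta_minus, step_min), step)
    # On a sign flip, skip the move this iteration (Rprop- simplification).
    move = np.where(same < 0, 0.0, -np.sign(grad) * step)
    prev_grad = np.where(same < 0, 0.0, grad)
    return w + move, prev_grad, step

# Demo: minimize ||w||^2.
w = np.array([3.0, -2.0])
prev = np.zeros_like(w)
step = np.full_like(w, 0.1)
for _ in range(200):
    grad = 2 * w
    w, prev, step = rprop_step(w, grad, prev, step)
```

Because only signs matter, Rprop is insensitive to gradient scaling, which is a big part of why it trains feedforward nets so robustly with default settings.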
quote: Original post by Raeldor
So far, I have seen...
Genetic Algorithms
Backpropagation
What other techniques are there for training NNs? Any keywords I should be googling for?
I assume that you mean feedforward neural networks, specifically? Any of the usual numeric optimization techniques (hill climbing, conjugate gradients, simplex, etc.) are fair game, plus any of the newer blind optimization techniques (notably genetic algorithms and simulated annealing, their relatives, and variations). Also, a number of neural network-specific training algorithms have been devised, like CHIR and NOVEL.
See the comp.ai.neural-nets (Usenet) FAQ for more information.
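To make the "any numeric optimization technique is fair game" point concrete: here is hill climbing used to train a tiny 2-2-1 feedforward net on XOR. This is my own sketch, not from the thread; the network layout, `forward`/`loss` helpers, and perturbation scale are all illustrative choices. The algorithm just perturbs the weight vector at random and keeps a change only if the loss improves, so it needs no gradients at all:

```python
import numpy as np

rng = np.random.default_rng(0)

# XOR training set for a tiny 2-2-1 feedforward net.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], float)
y = np.array([0, 1, 1, 0], float)

def forward(w, X):
    """Unpack a flat 9-element weight vector into a 2-2-1 net and run it."""
    W1 = w[:4].reshape(2, 2); b1 = w[4:6]
    W2 = w[6:8]; b2 = w[8]
    h = np.tanh(X @ W1 + b1)
    return h @ W2 + b2

def loss(w):
    return np.mean((forward(w, X) - y) ** 2)

# Hill climbing: random perturbation, accept only improvements.
w = rng.normal(size=9)
init = loss(w)
best = init
for _ in range(2000):
    cand = w + rng.normal(scale=0.1, size=9)
    c = loss(cand)
    if c < best:
        w, best = cand, c
```

Simulated annealing is the same loop except that worsening moves are sometimes accepted with a temperature-dependent probability, which helps escape local minima; genetic algorithms replace the single candidate with a population.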