Backpropagation question
If I want to construct a neural network using feed-forward and back-propagation, what factors should I use to determine the number of hidden neurons (hidden layer) for a given problem? I know the number of input neurons should equal the number of inputs of your problem, and the number of output neurons should equal the number of outputs of your problem.
I googled for `number neurons hidden layer' and quickly found this. It has a section devoted to your question.
Please, make a little more effort to find stuff out on your own before asking here.
It depends on what you want to do. Basically there is no fixed rule. One hidden layer is often sufficient for a lot of purposes. Don't forget that the more hidden layers you have, the more complicated your net becomes...
The book 'AI for game programmers' tells us this:
"For three-layer networks in which you're not interested in autoassociation, the appropriate number of hidden neurons is approximately equal to the square root of the product of the number of input and output neurons. This is just an approximation, but it's as good a place to start as any"
I can't speak from personal experience since I'm just learning this stuff myself.
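As a rough sketch of the rule of thumb quoted above (the function name is mine, not from the book), the heuristic can be coded like this:

```python
import math

def suggest_hidden_neurons(n_inputs: int, n_outputs: int) -> int:
    """Rule-of-thumb starting point for a three-layer network:
    hidden neurons ~= sqrt(inputs * outputs).
    Rounded to the nearest whole neuron, with a floor of 1."""
    return max(1, round(math.sqrt(n_inputs * n_outputs)))

# e.g. 10 inputs and 4 outputs: sqrt(40) ~= 6.32, so start with 6
print(suggest_hidden_neurons(10, 4))
```

Remember this only gives a starting point; you would still tune the count experimentally against your validation error.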
Another way of dealing with this problem is so-called cascade correlation, which builds the network structure during the learning process; personally I have no experience with it. See also the articles Optimal Brain Damage and Efficient BackProp by Prof. Yann LeCun, and there's more interesting stuff on his website http://yann.lecun.com/exdb/publis/index.html
This topic is closed to new replies.