
Backpropagation

Started by nICO on September 24, 2004 06:36 AM
5 comments, last by fup 20 years, 2 months ago
I'm trying to write a simple neural network with the backpropagation algorithm. After reading a thousand papers :) I think I understand it quite well, but I'm missing something in the backpropagation step. I can calculate all the deltas and update the weights, but what about the neuron bias? Should it always stay fixed, or should I update it as if it were a weight? Thank you
nICO
You can think of the bias as an additional input into each neuron, with a constant value of 1 (or -1, it makes no difference). Each neuron will therefore have an additional weight associated with the bias input. It is this weight that the backprop algorithm should update.
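A minimal sketch of that idea in Python (not from the original post; the sigmoid activation, the learning rate, and the Neuron class name are my own assumptions), showing the bias handled as one extra weight whose input is always 1.0:

```python
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

class Neuron:
    def __init__(self, n_inputs):
        # one weight per input, plus one extra weight for the constant bias input
        self.weights = [random.uniform(-1, 1) for _ in range(n_inputs + 1)]

    def forward(self, inputs):
        # append the constant bias input of 1.0 before summing
        self.last_inputs = list(inputs) + [1.0]
        self.output = sigmoid(sum(w * x for w, x in zip(self.weights, self.last_inputs)))
        return self.output

    def backprop(self, error, learning_rate=0.5):
        # delta = error * derivative of the sigmoid at the neuron's output
        delta = error * self.output * (1.0 - self.output)
        # the bias weight is updated exactly like every other weight,
        # because its "input" is simply the constant 1.0
        self.weights = [w + learning_rate * delta * x
                        for w, x in zip(self.weights, self.last_inputs)]
        return delta
```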
Ok, thank you! It works!!!

Btw, can someone give me a link to some explanation of unsupervised networks?

nICO
If you look on AI Junkie, Mat uses a GA to self-train the network - is that useful for you?
Yes, thank you, that's useful.
I also found useful information on the sites listed in the links section of that site. (I'm currently reading the NEAT documentation... it seems really interesting.)

bye
nICO
Quote: Original post by fup
You can think of the bias as an additional input into each neuron, with a constant value of 1 (or -1, it makes no difference). Each neuron will therefore have an additional weight associated with the bias input. It is this weight that the backprop algorithm should update.



Isn't each link weight feeding into the node's summation supposed to be corrected by backprop?
Quote: Isn't each link weight feeding into the node's summation supposed to be corrected by backprop?

yes
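For completeness, a tiny usage sketch building on the hypothetical Neuron class above (the AND dataset, epoch count, and error definition are assumptions), to show that every weight feeding the node's summation, the bias weight included, gets corrected on each backprop step:

```python
# Train the single neuron above to approximate the AND function; every weight
# feeding its summation, including the bias weight, is adjusted each pass.
data = [([0.0, 0.0], 0.0), ([0.0, 1.0], 0.0), ([1.0, 0.0], 0.0), ([1.0, 1.0], 1.0)]
neuron = Neuron(n_inputs=2)
for epoch in range(5000):
    for inputs, target in data:
        output = neuron.forward(inputs)
        neuron.backprop(error=target - output)
print([round(neuron.forward(x), 2) for x, _ in data])  # approaches [0, 0, 0, 1]
```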

