
Simple backprop question

Started by January 22, 2004 06:21 AM
1 comment, last by matias suarez 21 years, 1 month ago
Assuming that I have a multilayer perceptron, backprop, and the "standard" sigmoid function: is it a "requirement" that the network be fully connected (each neuron in a layer connected to all the neurons in the next layer)? Intuitively it seems that a partially connected network won't generalize as well as a fully connected one, but neither my math background nor my AI knowledge is enough to tell whether it will "break" the logic of the backprop algorithm. Thanks! -Mat
It won't break, and I've even read that the resulting NN behaves in a more human-like way (our brain, as far as it is comparable to a NN, isn't massively connected either). Whether that influences gameplay, I don't know.
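
To make the "it won't break" point concrete, here is a minimal C++ sketch (my own illustration, not from any particular library; MaskedLayer and its members are made-up names). The idea: a partially connected layer is just a fully connected one with the missing weights pinned at zero, so standard backprop applies unchanged; you only multiply by a 0/1 connectivity mask in the forward pass and in the weight update.

#include <cmath>
#include <cstddef>
#include <vector>

// Hypothetical sketch: one sigmoid layer with a 0/1 connectivity mask.
// mask[i][j] == 0 means neuron j of the previous layer is NOT connected
// to neuron i of this layer; such weights simply stay at zero.
struct MaskedLayer {
    std::vector<std::vector<double>> w;    // weights, [out][in]
    std::vector<std::vector<double>> mask; // 1 = connected, 0 = absent

    static double sigmoid(double x) { return 1.0 / (1.0 + std::exp(-x)); }

    // Forward pass: the same weighted sum as a fully connected layer,
    // with absent connections contributing nothing.
    std::vector<double> forward(const std::vector<double>& in) const {
        std::vector<double> out(w.size());
        for (std::size_t i = 0; i < w.size(); ++i) {
            double sum = 0.0;
            for (std::size_t j = 0; j < in.size(); ++j)
                sum += w[i][j] * mask[i][j] * in[j];
            out[i] = sigmoid(sum);
        }
        return out;
    }

    // Weight update: delta[i] is the usual backprop error term for
    // output neuron i; masking the gradient keeps pruned weights at zero.
    void update(const std::vector<double>& in,
                const std::vector<double>& delta, double lr) {
        for (std::size_t i = 0; i < w.size(); ++i)
            for (std::size_t j = 0; j < in.size(); ++j)
                w[i][j] -= lr * delta[i] * in[j] * mask[i][j];
    }
};

Everything else in the algorithm (computing the deltas, propagating them back through the weights) is unchanged, which is why partial connectivity doesn't break the derivation.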

Blaat
A fully connected BPN has the advantage that the forward computation of each layer can be implemented as a matrix-vector multiplication (the weight matrix times the previous layer's activation vector), followed by applying the sigmoid function to each element of the resulting vector.
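
For example (a minimal sketch; forwardLayer and the row-major weight layout are my own choices, not from any specific implementation):

#include <cmath>
#include <cstddef>
#include <vector>

// Sketch of the layer forward pass described above: out = sigmoid(W * in),
// with the weight matrix W stored row-major (rows = neurons in this layer,
// cols = neurons in the previous layer).
std::vector<double> forwardLayer(const std::vector<double>& W,
                                 const std::vector<double>& in,
                                 std::size_t rows) {
    const std::size_t cols = in.size();
    std::vector<double> out(rows);
    for (std::size_t i = 0; i < rows; ++i) {
        double sum = 0.0;
        for (std::size_t j = 0; j < cols; ++j)
            sum += W[i * cols + j] * in[j]; // one row of the matrix-vector product
        out[i] = 1.0 / (1.0 + std::exp(-sum)); // standard sigmoid
    }
    return out;
}

With a partially connected network you lose this dense formulation (or keep it and accept multiplying by zeros), which is the main practical cost of dropping full connectivity.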

