Simple backprop question
Assuming I have a multilayer perceptron, backprop, and the "standard" sigmoid function.
Is it a "requirement" that it is fully connected? (every neuron in one layer connected to all the neurons in the next layer)
Intuitively it seems that a partially connected network won't generalize as well as a fully connected one, but neither my math background nor my AI knowledge is enough to tell whether it would "break" the logic of the backprop algorithm.
Thanks!
-Mat
It won't break, and I have even read that the resulting NN behaves in a more human-like way (our brain, as far as it is comparable to a NN, isn't fully connected either). Whether that influences gameplay, I don't know.
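To see why it doesn't break: a missing connection is equivalent to a weight that is fixed at zero, so backprop works unchanged as long as that weight is never updated. Here's a minimal sketch in numpy (the layer sizes, mask, and toy data are all made up for illustration) that trains a sparsely connected two-layer sigmoid network by masking the weight gradient:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Hypothetical tiny network: 3 inputs -> 4 hidden -> 1 output.
W1 = rng.normal(size=(3, 4))
W2 = rng.normal(size=(4, 1))

# Connectivity mask: a 0 entry means "no connection" between those neurons.
M1 = np.array([[1, 1, 0, 0],
               [0, 1, 1, 0],
               [0, 0, 1, 1]], dtype=float)
W1 *= M1  # enforce the sparsity from the start

# Toy data: classify whether the inputs sum to a positive number.
X = rng.normal(size=(8, 3))
y = (X.sum(axis=1, keepdims=True) > 0).astype(float)

loss_before = np.mean((sigmoid(sigmoid(X @ W1) @ W2) - y) ** 2)

lr = 0.5
for _ in range(2000):
    h = sigmoid(X @ W1)                      # forward pass
    out = sigmoid(h @ W2)
    d_out = (out - y) * out * (1 - out)      # sigmoid + squared-error delta
    d_h = (d_out @ W2.T) * h * (1 - h)       # backprop through hidden layer
    W2 -= lr * (h.T @ d_out)
    # Mask the gradient so missing connections stay missing.
    W1 -= lr * (X.T @ d_h) * M1

loss_after = np.mean((sigmoid(sigmoid(X @ W1) @ W2) - y) ** 2)
```

The only change from ordinary backprop is the `* M1` on the first layer's update; the masked weights start at zero and stay there, and the rest of the network trains normally.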
Blaat