
MLP + overfitting

Started by September 26, 2007 02:45 AM
13 comments, last by Idov 17 years, 2 months ago
Hi! I'm using MATLAB to simulate an MLP, so I'm using a validation set. But when I draw the error graph, I see that the error on the validation set goes down and then flattens. I DO want it to go up after a while... but it doesn't. Does anybody know why this can happen? Do you have a link or code example? Thanks!
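For reference, here's a minimal sketch of this kind of setup, assuming the Neural Network Toolbox (feedforwardnet/train); the toy data, hidden-layer size, and epoch count are only illustrative:

[code]
% Minimal sketch: MLP on noisy 1-D data with a random train/validation
% split; tr records the per-epoch training and validation error.
x = linspace(-1, 1, 60);                        % toy inputs
t = sin(3*pi*x) + 0.3*randn(size(x));           % noisy targets

net = feedforwardnet(20);                       % hidden-layer size is illustrative
net.divideFcn              = 'dividerand';      % random train/validation split
net.divideParam.trainRatio = 0.7;
net.divideParam.valRatio   = 0.3;
net.divideParam.testRatio  = 0.0;
net.trainParam.epochs      = 500;
net.trainParam.max_fail    = 500;               % keep training past the point where
                                                % validation error starts to rise

[net, tr] = train(net, x, t);

semilogy(tr.epoch, tr.perf, 'b-', tr.epoch, tr.vperf, 'r-');
legend('training error', 'validation error');
xlabel('epoch'); ylabel('mean squared error');
[/code]

Note that if the toolbox's default validation stop is left in place (max_fail = 6), training halts shortly after the validation error first rises, which can hide the upturn you're looking for.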
That can happen because the model doesn't have enough parameters to allow overfitting. You'll probably see the effect you expect if you add more neurons to the hidden layer(s).
I tried that.
It does nothing but make the validation error higher. :(
Just a question here - why do you want it to exhibit overtraining? Isn't overtraining a bad thing?
It is, but I need it for an exercise I'm doing.
Plus, it helps me know when I should stop training the network :)
Have you tried a lower learning rate? You may be pushing the training to the nearest local minimum. You might see what you're looking for if you decrease the learning rate and run for a higher number of iterations. Also keep the number of hidden nodes on the high side and you should see a badly overtrained network.
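Something along these lines, reusing the toy x and t from the first sketch (the training function and parameter values are only illustrative, not a recommendation):

[code]
% Sketch of the suggestion above: plain gradient descent with a small step
% size, many iterations, and an oversized hidden layer (values illustrative).
net = feedforwardnet(60, 'traingd');    % large hidden layer, gradient descent
net.trainParam.lr       = 0.01;         % low learning rate
net.trainParam.epochs   = 20000;        % many iterations
net.trainParam.max_fail = 20000;        % don't stop when validation error rises
[net, tr] = train(net, x, t);
[/code]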

Seems weird to offer suggestions on how to overtrain. 8^O
Use a smaller training set? Or select a training set that leaves out some very important examples...
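A quick way to try that, reusing the toy x and t from the first sketch (the subset size is arbitrary):

[code]
% Sketch: deliberately train on only a handful of samples so the network
% can memorise them.
idx     = randperm(numel(x));
x_small = x(idx(1:10));
t_small = t(idx(1:10));

net = feedforwardnet(20);
[net, tr] = train(net, x_small, t_small);
[/code]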


Ok, thanks! It goes up a little bit.

But I have another question...
I'm trying to reduce the variance error by adding more "experts", but as I add more and more of them the variance doesn't go down, and even worse - the validation set error GOES UP! (It doesn't go down and then up; it's more like a high, flat line.)

It's very strange, because I'm pretty sure I calculated the variance correctly.

Can you tell me why?
When you say "adding more experts" do you mean that you are developing an ensemble model? This would involve training multiple, independent models (neural nets in your case) and combining the results. Is this correct?

-Kirk
[quote]Original post by kirkd
When you say "adding more experts" do you mean that you are developing an ensemble model? This would involve training multiple, independent models (neural nets in your case) and combining the results. Is this correct?

-Kirk[/quote]

yes :)
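
For the record, here's roughly the kind of averaging ensemble being discussed - a minimal sketch assuming each "expert" is an MLP trained independently on a bootstrap resample, with the experts' outputs averaged; the expert count, network size, and held-out data are all illustrative (it reuses the toy x and t from the first sketch):

[code]
% Sketch of an averaging ensemble of "experts": each expert is an MLP
% trained independently on a bootstrap resample of the training data,
% and the ensemble prediction is the mean of the experts' outputs.
x_val = linspace(-1, 1, 50);                      % held-out data
t_val = sin(3*pi*x_val) + 0.3*randn(size(x_val));

n_experts = 10;                                   % illustrative
preds     = zeros(n_experts, numel(x_val));
val_err   = zeros(1, n_experts);

for k = 1:n_experts
    idx    = randi(numel(x), 1, numel(x));        % bootstrap resample
    expert = feedforwardnet(20);
    expert.divideFcn = '';                        % train each expert on its whole sample
    expert.trainParam.showWindow = false;         % no training GUI
    expert = train(expert, x(idx), t(idx));

    preds(k, :) = expert(x_val);                  % this expert's predictions
    ensemble    = mean(preds(1:k, :), 1);         % average of the first k experts
    val_err(k)  = mean((t_val - ensemble).^2);    % held-out MSE vs. ensemble size
end

plot(1:n_experts, val_err, 'o-');
xlabel('number of experts'); ylabel('validation MSE');
[/code]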

[Edited by - Idov on September 27, 2007 2:33:16 PM]

This topic is closed to new replies.
