
neural networks and genetic algorithms

Started by May 11, 2009 05:24 AM
21 comments, last by BanEvadingGordon 15 years, 6 months ago
Quote:
For the various parameters, I used something like:
0.8 chance that two selected individuals would be crossed over (if not, then they both went into the next generation unchanged)
0.05 chance of each weight getting mutated (which might even be a little high)
10.0 maximum mutation amount (so a weight could be mutated by a random value from -10 to 10)


Your rule-of-thumb numbers: I have now used something similar. I realise that my GA's apparent convergence perhaps happened during the period when I was in fact only mutating a small number of weights, rather than mutating all of them by a small amount.

Perhaps the change of the GA to a small number of mutations plus elitist selection made me think that was what was working, when it was really just filling the population with copies of the elite and so producing a better average score.

Since then I continued with elitist selection and mutating all weights by a small amount, and each time I did so I saw a slow collapse of fitness.

But by returning, as suggested by this forum, to rare but larger mutations of weights and much less elitist selection, fitness scores seem to be rising again.
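For concreteness, here is a minimal sketch (Python) of what "rare but larger mutations plus much less elitist selection" might look like, using the quoted 0.8 crossover rate; the tournament selection, single-point crossover, and all function names are my own illustrative assumptions, not necessarily what anyone in this thread actually used:

```python
import random

CROSSOVER_RATE = 0.8     # quoted rule of thumb: 0.8 chance a selected pair is crossed over
MUTATION_RATE = 0.05     # quoted rule of thumb: 0.05 chance each weight mutates
MUTATION_RANGE = 10.0    # quoted rule of thumb: mutation drawn uniformly from [-10, 10]

def tournament_select(population, fitnesses, size=2):
    # Pick a few individuals at random and return the fittest of that sample.
    # Small tournaments keep selection pressure mild (much less elitist).
    contenders = random.sample(range(len(population)), size)
    return population[max(contenders, key=lambda i: fitnesses[i])]

def crossover(a, b):
    # Single-point crossover on flat weight lists (one of many possible schemes).
    point = random.randrange(1, len(a))
    return a[:point] + b[point:], b[:point] + a[point:]

def mutate(weights):
    # Rare but potentially large mutations: each weight has a small chance of
    # being perturbed by a uniform value in [-MUTATION_RANGE, MUTATION_RANGE].
    return [w + random.uniform(-MUTATION_RANGE, MUTATION_RANGE)
            if random.random() < MUTATION_RATE else w
            for w in weights]

def next_generation(population, fitnesses):
    # Keep only the single best individual verbatim (minimal elitism),
    # then fill the rest of the generation from tournament-selected parents.
    best = max(range(len(population)), key=lambda i: fitnesses[i])
    next_gen = [population[best]]
    while len(next_gen) < len(population):
        a = tournament_select(population, fitnesses)
        b = tournament_select(population, fitnesses)
        if random.random() < CROSSOVER_RATE:
            a, b = crossover(a, b)   # otherwise both pass through unchanged
        next_gen.extend([mutate(a), mutate(b)])
    return next_gen[:len(population)]
```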


---
To all:
I now wonder if there is a rule of thumb about population size - should the number of weights you mutate be related to the population size?

At the moment I mutate, on average, 25 weights out of many thousands. Does this imply that the population should be at least 25?
Make your mutation rate probabilistic - with 25 mutations per 1000 weights, that gives each weight a 2.5% chance of mutation, which seems reasonable. Then each individual has a chance of mutation and you don't tie it to population size, which I think introduces too much interdependency between parameters.

I would also suggest that you need a range of mutation levels, from less disruptive to more disruptive, which are also probabilistically chosen. Sometimes a large mutation is necessary to jump out of a local optimum, while sometimes a small mutation is all that is needed to locally optimize a result. You could control this easily enough by having the magnitude of the weight modification drawn from a Gaussian distribution: big changes are unlikely but not impossible, while smaller changes are common. If you're always making large mutations, you're doing a good job of exploring the response surface but not such a good job of exploiting the good areas of the fitness landscape.
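A sketch of that idea (Python again; the 0.025 per-weight rate just mirrors the "25 per 1000" figure above, and the sigma value is an arbitrary illustration, not a recommendation):

```python
import random

def mutate_gaussian(weights, per_weight_rate=0.025, sigma=1.0):
    # Each weight mutates independently with a small probability; the size of
    # the change is drawn from a zero-mean Gaussian, so small local nudges are
    # common while the occasional large jump can still escape a local optimum.
    return [w + random.gauss(0.0, sigma)
            if random.random() < per_weight_rate else w
            for w in weights]
```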

Many of you no doubt knew this already, but I think I also had a problem of too small a population: 20 agents, with each neural agent getting about 20-30 mutations per generation.

With the number of mutations significantly greater than the number of agents in the population, I think the whole thing just degraded through 'incest'.



Now I have far fewer mutations, plus crossover, and fitness seems to improve reliably. Cheers.

This topic is closed to new replies.
