
A trainable AI-based mechanical arm.

Started by August 28, 2005 05:21 AM
12 comments, last by shadowisadog 19 years, 3 months ago
About the local min/max issues, one way to deal with them is to borrow ideas from simulated annealing (google it and you'll find a much better explanation than I can give).

The idea of a genetic algorithm is very similar to survival of the fittest: portions of the most fit members of the population have their "genes" reproduced the most in later generations.
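As a rough illustration (my own sketch, not anything from the original arm project), one generation of such an algorithm might look like the Python below. The DNA is assumed to be a list of floats, and fitness() is a stand-in for whatever score the arm simulation would actually produce; parents are picked with probability proportional to their (non-negative) fitness.

import random

def fitness(dna):
    # Stand-in score; in the arm project this would come from the
    # simulation (e.g. distance travelled). Here: prefer genes near 1.0.
    return 1.0 / (1.0 + sum((g - 1.0) ** 2 for g in dna))

def next_generation(population, mutation_rate=0.05):
    # Score everyone, then breed children from parents chosen in
    # proportion to fitness, so fitter DNA reproduces more often.
    scores = [fitness(dna) for dna in population]
    children = []
    while len(children) < len(population):
        mom, dad = random.choices(population, weights=scores, k=2)
        cut = random.randrange(1, len(mom))              # one-point crossover
        child = mom[:cut] + dad[cut:]
        child = [g + random.gauss(0.0, 0.1) if random.random() < mutation_rate
                 else g for g in child]                  # occasional mutation
        children.append(child)
    return children

population = [[random.uniform(-2.0, 2.0) for _ in range(6)] for _ in range(20)]
for _ in range(100):
    population = next_generation(population)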

Simulated annealing is a bit different, and I think it's best viewed with an analogy. Consider your solution space as a surface, where the height of the surface represents the fitness of that particular solution (lower being more fit). Now think of the current solution you are considering as a rolling ball. You initially just chuck the ball at the solution space and let it roll around for a while. As time advances, it loses speed and makes smaller moves. The ball will have a hard time getting stuck in local minima in the early generations, because it still has enough energy to roll out. In general, the ball will roll towards the closest local minimum.
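To make that analogy concrete, here is a tiny simulated-annealing sketch in Python (my own illustration, not from this thread). surface() is a made-up one-dimensional landscape with several dips, and the temperature plays the role of the ball's remaining energy: big, occasionally uphill moves early on, small downhill-only moves later.

import math
import random

def surface(x):
    # Toy landscape with several dips; lower is more fit.
    return x * x + 10.0 * math.sin(x)

def anneal(steps=10000, temp=10.0, cooling=0.999):
    x = random.uniform(-10.0, 10.0)              # chuck the ball somewhere
    best = x
    for _ in range(steps):
        candidate = x + random.gauss(0.0, temp)  # move size shrinks as it cools
        delta = surface(candidate) - surface(x)
        # Always roll downhill; sometimes roll uphill while still "hot".
        if delta < 0.0 or random.random() < math.exp(-delta / temp):
            x = candidate
        if surface(x) < surface(best):
            best = x
        temp *= cooling                          # the ball loses energy over time
    return best

print(anneal())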

Like I said, you can find better descriptions of simulated annealing on the web, but the basic idea you can borrow for genetic algorithms is to have a large mutation rate in the beginning and decrease it as time goes on. The high early rate helps jolt your algorithm out of early local optima, and because the rate decreases over time, once you eventually reach the global optimum you hopefully won't be mutated away from it.
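The schedule itself can be as simple as exponential decay. A rough sketch (the constants here are made up and would need tuning for the arm):

import random

def mutation_rate(generation, start=0.5, floor=0.01, decay=0.995):
    # High early on (big jolts out of local optima), decaying toward a
    # small floor so late generations stay near the best solution found.
    return max(floor, start * decay ** generation)

def mutate(dna, generation):
    rate = mutation_rate(generation)
    return [g + random.gauss(0.0, 0.1) if random.random() < rate else g
            for g in dna]

# The per-gene mutation chance drops from 50% toward the 1% floor:
for gen in (0, 100, 500, 1000):
    print(gen, round(mutation_rate(gen), 3))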
Cool, but shouldn't it just pick the best winners from the population? I.e. the last and second-last saved DNA? That way, instead of possibly deviating from the optimal solution, it would advance forward?
Quote: Original post by shadowisadog
Cool, but shouldn't it just pick the best winners from the population? I.e. the last and second-last saved DNA? That way, instead of possibly deviating from the optimal solution, it would advance forward?


Well, surely you should move *towards* the optimal solution, but simply taking the previously best DNA could easily get you stuck at a local optimum. The idea is that some locally sub-optimal DNA could, down the line, lead to a solution with a better local optimum.

The standard way of dealing with this is to have a population of DNA where, instead of simply taking the best DNA to reproduce, you let every DNA reproduce, with the amount each one reproduces proportional to its fitness. Of course, there are many ways of implementing this concept, but that's the gist of it.
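One common way to implement that proportional reproduction is roulette-wheel selection. A minimal sketch, assuming fitness scores are non-negative (the fitness used here is just a stand-in):

import random

def roulette_pick(population, scores):
    # Pick one DNA with probability proportional to its fitness: spin a
    # wheel whose slice sizes are the scores and see where it lands.
    spin = random.uniform(0.0, sum(scores))
    running = 0.0
    for dna, score in zip(population, scores):
        running += score
        if running >= spin:
            return dna
    return population[-1]    # guard against floating-point round-off

# Usage: pick two parents per child.
population = [[random.random() for _ in range(4)] for _ in range(10)]
scores = [sum(dna) for dna in population]          # stand-in fitness
mom, dad = roulette_pick(population, scores), roulette_pick(population, scores)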

A second way of dealing with it is to randomly mutate your DNA to check that you have actually found the global optimum. Consider this: if you are at the global optimum and mutate away, then at some point you will make it back to that optimum. On the other hand, if you are at a local optimum and mutate away, there is a chance that you will move towards a better solution. The important thing here is that you are still moving towards optima (by following gradients); it's just that the path you take to get there is a bit randomized.
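In code, that "mutate away, then follow the gradient back up" idea might look like the sketch below (again just an illustration; the multimodal fitness() is a stand-in for the arm's real score):

import math
import random

def fitness(dna):
    # Stand-in score with many local optima; the global optimum is at
    # every gene equal to 0. In the arm project this would be the
    # simulated distance instead.
    return sum(math.cos(3.0 * g) - 0.1 * g * g for g in dna)

def hill_climb(dna, steps=300):
    # Greedy local search: keep a small random tweak only if it helps,
    # i.e. follow the local gradient up to the nearest optimum.
    for _ in range(steps):
        tweak = [g + random.gauss(0.0, 0.05) for g in dna]
        if fitness(tweak) > fitness(dna):
            dna = tweak
    return dna

def kick_and_reclimb(best, kick=1.0):
    # Mutate well away from the current best, then climb again. If the
    # new optimum beats the old one, the old one was only local.
    kicked = [g + random.gauss(0.0, kick) for g in best]
    candidate = hill_climb(kicked)
    return candidate if fitness(candidate) > fitness(best) else best

best = hill_climb([random.uniform(-4.0, 4.0) for _ in range(5)])
for _ in range(30):
    best = kick_and_reclimb(best)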
In 21732 generations I achieved a distance of 709 meters with 23 stored DNAs :P.

