Genetic Algorithm with a Large # of Genes
I'm working on a genetic algorithm for a spiking neural net project, and will probably end up with 10,000+ genes. The fitness test will take a few minutes per agent. Is this method of optimization worth the time per agent given the # of genes? I'll probably need a very large population to search the solution space. Any thoughts?
The issue with the fitness test is that it requires running the program for a while for the transients to settle out.
My understanding is that genetic algorithms are pretty much never worth it, with the one possible exception being the subcategory of genetic programming.
Basically, instead of having a huge population, the better approach is to have a population of 1 with a history buffer of size 1. Evaluate the current individual's fitness and compare it to the best score in the history buffer. If it's better, copy the current individual to the history buffer; if not, copy the history buffer to the current individual. Mutate the current individual. Repeat.
If you find that after 'too many tries' no improvements are made, increase the amount of mutation that occurs at each step. If an even larger 'too many tries' threshold is reached, store the current best to a file (or other long-term storage) and start over with a new random individual.
Modifications include trying X times before restoring from the history buffer (so successive mutations can add up before being judged inferior), or storing a history buffer of size N and eliminating the worst individual from the buffer each time a new individual must be saved. This amounts to a genetic algorithm where each individual exists in its own environment and cannot interact with the other individuals.
For something less random, you might look at Simulated Annealing. This type of problem is solved by optimization algorithms.
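The loop described above can be sketched roughly like this. Everything concrete here is a placeholder assumption, not part of the original post: the fitness function, the genome representation (a list of reals), the Gaussian mutation scheme, and the two 'too many tries' thresholds would all be replaced by your own.

```python
import random

GENOME_LEN = 100      # placeholder genome size
STALL_LIMIT = 200     # 'too many tries' before boosting mutation
RESTART_LIMIT = 2000  # even larger limit before a random restart

def fitness(genome):
    # Placeholder fitness: maximize the sum of the genes.
    return sum(genome)

def mutate(genome, rate):
    # Placeholder mutation: perturb every gene with Gaussian noise.
    return [g + random.gauss(0, rate) for g in genome]

def search(iterations=10000):
    # 'best' is the history buffer of size 1.
    best = [random.uniform(-1, 1) for _ in range(GENOME_LEN)]
    best_score = fitness(best)
    rate = 0.1
    stall = 0
    for _ in range(iterations):
        # "Copy the history buffer to the current individual, then mutate":
        current = mutate(best, rate)
        score = fitness(current)
        if score > best_score:
            # Improvement: copy the current individual to the history buffer.
            best, best_score = current, score
            stall, rate = 0, 0.1
        else:
            stall += 1
            if stall == STALL_LIMIT:
                rate *= 2.0   # no progress for a while: mutate harder
            if stall >= RESTART_LIMIT:
                # Archive 'best' to long-term storage here, then
                # start over with a new random individual.
                best = [random.uniform(-1, 1) for _ in range(GENOME_LEN)]
                best_score = fitness(best)
                stall, rate = 0, 0.1
    return best, best_score
```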
"Walk not the trodden path, for it has borne its burden." -John, Flying Monk
Quote: Original post by Extrarius
My understanding is that genetic algorithms are pretty much never worth it, with the one possible exception being the subcategory of genetic programming.
Basically, instead of having a huge population, the better approach is to have a population of 1 with a history buffer of size 1. Evaluate the current individual's fitness and compare it to the best score in the history buffer. If it's better, copy the current individual to the history buffer; if not, copy the history buffer to the current individual. Mutate the current individual. Repeat.
If you find that after 'too many tries' no improvements are made, increase the amount of mutation that occurs at each step. If an even larger 'too many tries' threshold is reached, store the current best to a file (or other long-term storage) and start over with a new random individual.
Modifications include trying X times before restoring from the history buffer (so successive mutations can add up before being judged inferior), or storing a history buffer of size N and eliminating the worst individual from the buffer each time a new individual must be saved. This amounts to a genetic algorithm where each individual exists in its own environment and cannot interact with the other individuals.
For something less random, you might look at Simulated Annealing. This type of problem is solved by optimization algorithms.
I see you're an Evolution Strategies person. Not to mention it's the (1+1)-ES that was considered the original, with a variation of the 1/5 rule.
Now to the original poster, what are you encoding into your genes? Are you using a binary encoding? Or are all the values real values? Are you converting real values to binary values? If so, at what precision, or rather why not just use real values in the first place? Also, generally, an elitist GA doesn't need more than a population size of 100 to perform well. Usually somewhere around 30 works fairly well.
Quote: Original post by WeirdoFu
Quote: Original post by Extrarius
My understanding is that genetic algorithms are pretty much never worth it, with the one possible exception being the subcategory of genetic programming.
...
I see you're an Evolution Strategies person. Not to mention it's the (1+1)-ES that was considered the original, with a variation of the 1/5 rule.
Now to the original poster, what are you encoding into your genes? Are you using a binary encoding? Or are all the values real values? Are you converting real values to binary values? If so, at what precision, or rather why not just use real values in the first place? Also, generally, an elitist GA doesn't need more than a population size of 100 to perform well. Usually somewhere around 30 works fairly well.
I'm encoding short int values into the genes. The values describe the connectivity of the network and the proportion of excitatory to inhibitory neurons. I'm considering encoding the synapse gains and the function of each neuron as well, but that would increase the solution space to astronomical levels, and the gains are changing dynamically anyway.
I've already planned on using elitism in the algorithm...I figured on a little elitism, a little mutation, a little crossover, and some roulette thrown in for kicks.
As to the hill-climbing idea, that should be covered by the mutation that I'm already planning on. I've used simulated annealing and momentum-type operations on a normal backprop neural net... I suppose that they would have a parallel on the hill-climber.
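For reference, the roulette selection and single-point crossover mentioned above might look something like the sketch below. The population, fitness values, and gene-list representation are hypothetical stand-ins, not taken from the poster's project.

```python
import random

def roulette_select(population, fitnesses, rng=random):
    """Pick one individual with probability proportional to its fitness.

    Assumes all fitness values are non-negative.
    """
    total = sum(fitnesses)
    pick = rng.uniform(0, total)   # spin the wheel
    running = 0.0
    for individual, fit in zip(population, fitnesses):
        running += fit
        if running >= pick:
            return individual
    return population[-1]          # guard against floating-point round-off

def one_point_crossover(a, b, rng=random):
    """Single-point crossover over two equal-length gene lists."""
    point = rng.randint(1, len(a) - 1)  # cut somewhere strictly inside
    return a[:point] + b[point:], b[:point] + a[point:]
```

A typical elitist generation would copy the best individual over unchanged, then fill the rest of the new population with mutated offspring of roulette-selected, crossed-over parents.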
Quote: Original post by Extrarius
My understanding is that genetic algorithms are pretty much never worth it, with the one possible exception being the subcategory of genetic programming.
...
Modifications include trying X times before restoring from the history buffer (so successive mutations can add up before being judged inferior), or storing a history buffer of size N and eliminating the worst individual from the buffer each time a new individual must be saved. This amounts to a genetic algorithm where each individual exists in its own environment and cannot interact with the other individuals.
... which is bound to give you poor results since one of the major benefits of genetic algorithms comes from the interaction across the population.
In situations where there are many orthogonal factors making up the fitness level, I'd be very surprised if simulated annealing or a mutation-only approach worked as well as a genetic algorithm.
Quote: Original post by Kylotan
In situations where there are many orthogonal factors making up the fitness level, I'd be very surprised if simulated annealing or a mutation-only approach worked as well as a genetic algorithm.
You have it backwards. Consider the ultimate orthogonal problem: a binary encoding where every bit is independent from the others. A trivial hill-climber would find the optimal solution in N tries of N bits; a genetic algorithm on the other hand would waste millions of cycles on maintaining large populations of needless junk and would still not be guaranteed to find the solution (due to the crossover and mutation factors stomping on each other) unless you employed elitism (which is a type of hill-climbing, btw).
Genetic algorithms are cool in name and they are useful as a _generic_ framework where you want to (nearly) solve arbitrary problems. Other than that they're just plain feeble, and for any specific problem other approaches such as hill-climbing (in which I include e.g. simulated annealing, tabu search, etc.) can find better solutions faster.
Christer Ericson
http://realtimecollisiondetection.net/blog/
Quote: Original post by Christer Ericson
You have it backwards. Consider the ultimate orthogonal problem: a binary encoding where every bit is independent from the others. A trivial hill-climber would find the optimal solution in N tries of N bits; a genetic algorithm on the other hand would waste millions of cycles on maintaining large populations of needless junk and would still not be guaranteed to find the solution (due to the crossover and mutation factors stomping on each other) unless you employed elitism (which is a type of hill-climbing, btw).
Most people aren't trying to optimise a binary value. Of course you're not going to waste your time on such a method if there are only 2 discrete outputs! You wouldn't use a binary search on a 2 element array either. That doesn't mean that a binary search isn't theoretically better than a linear one on sorted data. When each factor in the problem is more complicated than your simple example, it's quite a different story.
And obviously it works better on problems that are largely orthogonal rather than entirely orthogonal, otherwise you may as well just search through the permutations. GAs work well in that middle ground between smoothly continuous surfaces and sets of unrelated factors. They are reasonably robust in the face of discontinuities and local maxima.
And your analogy is poor in that "maintaining large populations of needless junk" is no different than iterating endlessly through values to find a maximum. If your surface is discrete or full of local maxima then plain gradient ascent is not going to be very optimal. I'm sure there is a better solution to any given problem, given an in-depth understanding of your search space, but GAs are still very good for many situations.
Quote: Original post by Kylotan
Most people aren't trying to optimise a binary value.
How does this relate to what I said? My example was equivalent to finding the optimum for the function "f(x) = number of bits set in x". This is a prime example of, in your own words, a situation "where there are many orthogonal factors making up the fitness level". You said you'd be surprised if simulated annealing or a mutation-only approach worked as well as a genetic algorithm for such a problem, and I pointed out that such problems are what hill-climbing approaches excel at. It should be obvious why that is the case!
Perhaps you meant to say correlated factors rather than orthogonal (which translates to uncorrelated) factors?
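The bit-counting example can be made concrete with a quick sketch (my own illustration, not code from any poster): a deterministic single-bit hill-climber solves "f(x) = number of bits set in x" for an N-bit string in at most N evaluations beyond the initial one, because each bit can be tested independently.

```python
import random

def one_max(bits):
    # "f(x) = number of bits set in x"
    return sum(bits)

def bitwise_hill_climb(n, rng=random):
    bits = [rng.randint(0, 1) for _ in range(n)]
    score = one_max(bits)
    evaluations = 1
    for i in range(n):        # one pass: test each bit exactly once
        bits[i] ^= 1          # flip bit i
        new_score = one_max(bits)
        evaluations += 1
        if new_score > score:
            score = new_score # keep the improvement
        else:
            bits[i] ^= 1      # revert the flip
    return bits, score, evaluations

bits, score, evals = bitwise_hill_climb(64)
# score == 64 (all bits set) after exactly 65 fitness evaluations
```

Because the bits don't interact, no flip can ever undo the benefit of an earlier flip, which is exactly why a hill-climber is optimal here and why the argument doesn't transfer to problems with correlated genes.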
Christer Ericson
http://realtimecollisiondetection.net/blog/
Quote: Original post by Kylotan
[...] which is bound to give you poor results since one of the major benefits of genetic algorithms comes from the interaction across the population. [...]
For many problems, crossover and other interaction increases the time it takes to find a solution.
Anyways, I'm not well-educated about this stuff, I was just putting together bits and pieces I remembered hearing/reading/etc =-)
"Walk not the trodden path, for it has borne its burden." -John, Flying Monk
Quote: Original post by Extrarius
For many problems, crossover and other interaction increases the time it takes to find a solution.
I personally have never found that to be the case. The convergence rate of mutation search (a finite-horizon random walk) is significantly slower than that of the canonical GA (selection + crossover + mutation).
This topic is closed to new replies.