Quote: Original post by fup
"Your GA can't evolve complexity or simplicity to solve the problem at hand."
They can actually. Here's an example:
http://www.cs.utexas.edu/users/kstanley/
I stand corrected! Thanks for the link, that's pretty neat.
Quote: Original post by fup
GPs encode solution candidates as trees of operators consisting of function nodes and terminal nodes.
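For anyone who hasn't seen that representation before, here's a minimal sketch of such a tree in C++. The node set and names are illustrative only, not from any particular library; the key point is that a node is either a terminal (constant or variable) or a function applied to child subtrees, and running the program is a recursive evaluation.

```cpp
#include <cmath>
#include <iostream>
#include <memory>
#include <vector>

// A GP individual is a tree: terminals (constants/variables) at the leaves,
// functions at the internal nodes.
struct Node {
    enum Kind { Const, Var, Add, Mul, Sqrt } kind = Const;
    double value = 0.0;   // used when kind == Const
    int varIndex = 0;     // used when kind == Var
    std::vector<std::unique_ptr<Node>> children;

    double eval(const std::vector<double>& vars) const {
        switch (kind) {
            case Const: return value;
            case Var:   return vars[varIndex];
            case Add:   return children[0]->eval(vars) + children[1]->eval(vars);
            case Mul:   return children[0]->eval(vars) * children[1]->eval(vars);
            case Sqrt:  return std::sqrt(std::fabs(children[0]->eval(vars)));  // "protected" sqrt
        }
        return 0.0;
    }
};

using NodePtr = std::unique_ptr<Node>;

NodePtr makeConst(double v) {
    auto n = std::make_unique<Node>();
    n->kind = Node::Const; n->value = v;
    return n;
}
NodePtr makeVar(int i) {
    auto n = std::make_unique<Node>();
    n->kind = Node::Var; n->varIndex = i;
    return n;
}
NodePtr makeOp(Node::Kind k, NodePtr a, NodePtr b = nullptr) {
    auto n = std::make_unique<Node>();
    n->kind = k;
    n->children.push_back(std::move(a));
    if (b) n->children.push_back(std::move(b));
    return n;
}

int main() {
    // Hand-built example tree: sqrt(x*x + y*y), evaluated at (3, 4).
    auto tree = makeOp(Node::Sqrt,
        makeOp(Node::Add,
            makeOp(Node::Mul, makeVar(0), makeVar(0)),
            makeOp(Node::Mul, makeVar(1), makeVar(1))));
    std::cout << tree->eval({3.0, 4.0}) << "\n";  // prints 5
}
```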
Quote: Original post by Yuri Burger
Is anybody willing to help with a free GA library?
There is a C++ GPL library of genetic algorithms: http://sourceforge.net/projects/cpplibga/
Help with the library's development is needed...
Quote: Original post by Nathaniel Hammen
I have programmed my own Genetic Program. To test it, I had it solve the distance between two points. On my first trial it got it by the tenth generation. On my second trial, I let it run for 224 generations before I decided to quit. It was still off by more than 1,000. It also got slower and slower as the program trees evolved more and more complexity to try to solve such a simple problem.
Quote: Original post by Nathaniel Hammen
On my third trial it got it on the 31st generation. I looked at the code that the third trial generated and it was essentially If(false && a whole lot of shit, more shit, more if statements that eventually led to the algorithm). So these if statements were sort of like dominant and recessive genes. I guess that's a good thing, because some negative genes can carry over until they're needed, but it makes it so that there is a lot of code just to get sqrt((L-O) dot (L-O)).
Quote: Original post by Nathaniel Hammen
I'm also concerned about the results I got on the second trial. It just couldn't come up with the answer. Is there any way I can prevent this, or do I need to check the program every once in a while, just in case? I was thinking that maybe I should introduce a brand new member into the population every once in a while, but this would probably make my programs worse.
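For reference, the fitness measure for a test problem like this usually just accumulates the error against the known answer over a batch of sample cases. A minimal sketch, where Program and evaluate() are placeholders for whatever tree representation and interpreter are already in place:

```cpp
#include <cmath>
#include <cstdlib>

// Fitness for the "distance between two points" test problem: the sum of
// absolute errors against the true answer over a batch of random cases.
// Lower is better; zero means the program is exact on these cases.
// Program and evaluate are placeholders for your own representation.
// In practice you would reuse the same test cases for every individual
// in a generation so the fitness values are comparable.
template <typename Program, typename Eval>
double distanceFitness(const Program& prog, Eval evaluate, int cases = 50) {
    double totalError = 0.0;
    for (int i = 0; i < cases; ++i) {
        double lx = rand() % 100, ly = rand() % 100;
        double ox = rand() % 100, oy = rand() % 100;
        double target = std::sqrt((lx - ox) * (lx - ox) + (ly - oy) * (ly - oy));
        totalError += std::fabs(evaluate(prog, lx, ly, ox, oy) - target);
    }
    return totalError;
}
```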
Quote: Original post by Woodsman
If you can, enforce a maximum depth on your trees, whether they grow through crossover or otherwise. When a child comes out too large, copy a random parent instead.
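A minimal sketch of that guard, assuming you already have crossover() and copyTree() routines of your own (those names and the Node type here are placeholders):

```cpp
#include <algorithm>
#include <cstdlib>
#include <memory>
#include <vector>

// Placeholder node type; swap in your own tree representation.
struct Node {
    std::vector<std::unique_ptr<Node>> children;
};

// Depth of a subtree: a leaf counts as depth 1.
int depth(const Node& n) {
    int d = 0;
    for (const auto& c : n.children)
        d = std::max(d, depth(*c));
    return d + 1;
}

// Placeholders for your own crossover and tree-copy routines.
std::unique_ptr<Node> crossover(const Node& a, const Node& b);
std::unique_ptr<Node> copyTree(const Node& n);

// Crossover with a depth cap: if the child grew too deep, throw it away
// and keep an unchanged copy of a random parent instead.
std::unique_ptr<Node> boundedCrossover(const Node& a, const Node& b, int maxDepth) {
    auto child = crossover(a, b);
    if (depth(*child) > maxDepth)
        child = copyTree(rand() % 2 == 0 ? a : b);
    return child;
}
```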
Quote: Original post by Woodsman
Look up 'editing'. Basically, you define patterns that are matched against the evolved trees and used to rewrite them. For example, the pattern
'(anything) | True' can be replaced with True.
Or 'iflessthanzero (some constant) x y' can be replaced with x or y, depending on the sign of the constant.
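A hypothetical version of such an editing pass in C++; the Node layout and the Or / IfLessThanZero node kinds are assumed here purely for illustration:

```cpp
#include <cstddef>
#include <memory>
#include <vector>

// Placeholder tree node; adjust to match your own representation.
struct Node {
    enum Kind { Const, Or, IfLessThanZero /* ... */ } kind = Const;
    double value = 0.0;                          // payload for Const nodes
    std::vector<std::unique_ptr<Node>> children;
};

// One editing pass: rewrite subtrees that match known patterns into
// simpler equivalents. Returns true if anything changed, so the caller
// can repeat the pass until the tree stops shrinking.
bool editPass(std::unique_ptr<Node>& n) {
    bool changed = false;
    for (auto& child : n->children)
        changed |= editPass(child);

    // Pattern: '(anything) | True'  ->  'True'
    // (a nonzero Const is treated as boolean true in this sketch)
    if (n->kind == Node::Or) {
        for (auto& child : n->children) {
            if (child->kind == Node::Const && child->value != 0.0) {
                n = std::move(child);            // the whole Or collapses to True
                return true;
            }
        }
    }
    // Pattern: 'iflessthanzero(constant, x, y)' -> x or y, decided right now
    if (n->kind == Node::IfLessThanZero && n->children[0]->kind == Node::Const) {
        std::size_t keep = (n->children[0]->value < 0.0) ? 1 : 2;
        n = std::move(n->children[keep]);        // splice the chosen branch in
        return true;
    }
    return changed;
}
```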
Quote: Original post by Woodsman
What I've generally seen is multiple runs, rather than larger populations or more generations. There is no guarantee that any particular run will find a solution; increasing the number of runs increases the chance that one does.
You may want to save, say, the ten best programs generated from one complete run and introduce them into the next run, either in the initial generation or later on. Perhaps even take the best ten from several runs and then run them all together.
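A rough sketch of that multi-run loop, with randomProgram() and runOneFullRun() as placeholders for whatever GP machinery is already in place:

```cpp
#include <vector>

// Placeholders for your own GP machinery.
struct Program { double fitness = 1e9; };        // lower fitness is better
Program randomProgram();                          // fresh random individual
Program runOneFullRun(std::vector<Program> initialPopulation, int populationSize);

// Several independent runs, carrying the champions of earlier runs into
// the initial population of later ones.
Program multiRun(int runs, int populationSize, int carryOver = 10) {
    std::vector<Program> seeds;                   // best programs so far
    Program bestOverall;
    for (int r = 0; r < runs; ++r) {
        std::vector<Program> initial = seeds;     // seed with earlier champions
        while ((int)initial.size() < populationSize)
            initial.push_back(randomProgram());   // fill the rest randomly

        Program best = runOneFullRun(initial, populationSize);
        if (best.fitness < bestOverall.fitness)
            bestOverall = best;

        seeds.push_back(best);                    // keep this run's champion
        if ((int)seeds.size() > carryOver)        // cap how many we carry over
            seeds.erase(seeds.begin());
    }
    return bestOverall;
}
```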
Quote: Original post by kwatz
You may want to remove the control-statement-related alleles (the if statement, true/false, etc.), as you're just looking for a plain mathematical formula, not a computer program per se.
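For a pure symbolic-regression problem like the distance formula, that amounts to restricting the primitive set to arithmetic only; an illustrative set (not a recommendation from the poster) might be:

```cpp
#include <cstdlib>

// Illustrative primitive set for the distance problem: arithmetic only,
// so every evolved tree is a formula rather than a program with branches.
// Div is conventionally "protected" in GP: it returns 1.0 when the
// denominator is zero, so no tree can crash the evaluator.
enum class Function { Add, Sub, Mul, Div, Sqrt };          // internal nodes
enum class Terminal { Lx, Ly, Ox, Oy, RandomConstant };    // leaves

Function randomFunction() { return static_cast<Function>(rand() % 5); }
Terminal randomTerminal() { return static_cast<Terminal>(rand() % 5); }
```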
Quote: Original post by kwatz
There are other elements that may be problems, too. There are a number of parameters to tune before you should expect your algorithm to work well. Depending on the algorithm and representation you use, these can include mutation rate, crossover rate, degree of mutation, population size, fitness function, selection method, etc. Depending on the behavior of your population, you can tweak these and related values back and forth.
Could you give some specifics about your algorithm? For example, what crossover method you're using (if any), selection method, population size, and so on.
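Those knobs usually end up collected in one place; something like the struct below, where the numbers are just common textbook starting points rather than anyone's tuned recommendation:

```cpp
// Typical GP run parameters gathered in one struct. The defaults are
// common starting points from the literature, not tuned values.
struct GPParams {
    int    populationSize = 500;
    int    maxGenerations = 50;
    double crossoverRate  = 0.9;    // fraction of children made by crossover
    double mutationRate   = 0.05;   // per-individual chance of mutation
    int    tournamentSize = 5;      // selection pressure (tournament selection)
    int    maxTreeDepth   = 10;     // hard cap to fight bloat
    int    eliteCount     = 2;      // best programs copied unchanged each generation
};
```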
Quote: Original post by Anonymous Poster
There is lots of literature on this. GP is not a simple process (although that doesn't stop lots of people from just diving in, naively assuming it'll work the first time).
Quote: Original post by Anonymous Poster
The best advice is simply to use Google and read the papers and journals from the four or five main conferences devoted to, e.g., "machine learning". There are hundreds of academic papers; many cover just individual techniques, so they're kind of like a shopping list of "if you want to do X, read this paper".
You should especially look at:
- ADF == automatically defined functions.
- "bushiness" vs "depth"
There are many papers on each, and most contain algorithms you can copy and use in your own work.
Quote: Original post by Anonymous Poster
You're probably best off just buying a recent book on GP, though, since a good one ought to cover a variety of techniques and algorithms...
Quote: Original post by Anonymous Poster
Be skeptical of the genetic algorithm. There is no guarantee that after n generations it will perform better. Genetic algorithms are a kind of educated random step. If you want to look at an algorithm that comes closer to what I think you were talking about, check out this paper (http://www.debonet.com/Research/Publications/1996/DeBonet-NIPS96-MIMIC.pdf). Basically, this algorithm keeps track of the history of generations to better forecast the best next generation.