neural networks and genetic algorithms
Selecting which genes to transfer is generally a random business. However, if you have multiple fitness functions, you may be able to narrow down which genes could stand to be swapped and which are just fine.
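A minimal sketch of that idea in Python (the gene_to_objective mapping and the fitness functions are hypothetical placeholders, not anything standard): if you can guess which objective each gene mostly affects, take each gene from whichever parent scores better on that objective.

    def multi_objective_crossover(parent_a, parent_b, fitness_fns, gene_to_objective):
        # Score both parents against every fitness function once.
        scores_a = [fit(parent_a) for fit in fitness_fns]
        scores_b = [fit(parent_b) for fit in fitness_fns]
        child = []
        for i, (gene_a, gene_b) in enumerate(zip(parent_a, parent_b)):
            obj = gene_to_objective[i]  # which objective gene i is believed to affect
            # Take the gene from the parent that is stronger on that objective.
            child.append(gene_a if scores_a[obj] >= scores_b[obj] else gene_b)
        return child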
Dave Mark - President and Lead Designer of Intrinsic Algorithm LLC
Professional consultant on game AI, mathematical modeling, simulation modeling
Co-founder and 10 year advisor of the GDC AI Summit
Author of the book, Behavioral Mathematics for Game AI
Blogs I write:
IA News - What's happening at IA | IA on AI - AI news and notes | Post-Play'em - Observations on AI of games I play
"Reducing the world to mathematical equations!"
In a genetic algorithm for the task I have in mind, the AI for a racing game, I have tried genetically mutating the parameters of a procedural, state-based approach to work out how much, and by how far, a variety of factors trigger the procedural AI. This works rather well, but the AI was still not quite good enough to achieve human-level performance.
The neural network version worked better, but I used asexual reproduction. I feared that randomly flipping the parenthood of the weights of possibly very different neural configurations might jam the network's function.
I was happy with the AI but had not trained it against competitors, so that is why I'm doing more training again. The neural net is a feed-forward type.
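Roughly, the kind of loop I mean is this (a simplified sketch; the fitness function here stands in for running the racing simulation, and the step size is just illustrative):

    import random

    def mutate(weights, step=0.001):
        # Asexual reproduction: the child is the parent with every weight
        # nudged up or down by a small random amount.
        return [w + random.uniform(-step, step) for w in weights]

    def evolve(initial_weights, fitness, generations=1000):
        # Keep a single parent and replace it whenever a mutant scores better.
        best = initial_weights
        best_fitness = fitness(best)
        for _ in range(generations):
            candidate = mutate(best)
            candidate_fitness = fitness(candidate)
            if candidate_fitness > best_fitness:
                best, best_fitness = candidate, candidate_fitness
        return best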
I think it's the semidecidability that will cause mutations to converge faster, but this wouldn't depend on whether you use asexual or sexual reproduction. The "quantum neuron" itself will fluctuate over all states with a wave function that not only emulates n-ary reproduction, but reduces the necessary generations by something like a factor of lg*(n).
My guess is the problem is his qubits aren't orthonormal. If he modifies this and simplifies the scenario by having all the neurons involved associate with neighbors on a chance-encounter basis, then the network percolation will stay NP-complete (as far as I can tell at a glance) but lend itself to a nondeterministic Turing machine-like solution the vast majority of the time.
Adding a little randomness every now and then can help them escape local minima and be more likely to find the global minimum.
In a GA setup, you could see how those who found a more global minimum may be more likely to reproduce, thus propagating the more global solution through the population.
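For example (a sketch; the 1% kick probability and step sizes are arbitrary guesses), you could mostly make small moves but occasionally allow a much bigger jump:

    import random

    def mutate_with_kick(weights, step=0.001, kick_prob=0.01, kick_size=0.1):
        # Usually perturb gently; once in a while take a large random jump
        # so a lineage stuck in a local minimum has a chance to escape it.
        size = kick_size if random.random() < kick_prob else step
        return [w + random.uniform(-size, size) for w in weights]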
::shrug:: :P
Quote: Original post by AlphaCoder
I think what he is saying is that this is an attempt to model quantum mechanical phenomena via a non-deterministic Turing machine iterating over its own recursively enumerable language.
You enjoy this trolling schtick entirely too much, AlphaCoder... ;-)
Quote: Original post by InnocuousFox
Let me get this straight... you are using random changes in a NN?
I assume that the OP is using GAs* to optimize the weights of his NN to fit some training data set.
* (I say "GAs" even though he appears currently to be using a stochastic hill-climbing algorithm which is not properly a genetic algorithm, since it involves no crossover -- but this is mostly semantics.)
---
Here's what I mean, in case it's not obvious:
A neural network is nothing but a function f which takes an input vector x and a weight vector w and returns an output y; i.e.,
f(x, w) = y.
Letting x_1, ..., x_N and y_1, ..., y_N be the training data, and defining the error function,
e(w) = (f(x_1, w) - y_1)^2 + ... + (f(x_N, w) - y_N)^2
then "training" is really just finding the weights that minimize e. As this is just a minimization problem, and Genetic Algorithms are just a generic optimization algorithm, GAs can be applied to find the optimal weight vector, which is
w_optimal = argmin_w e(w).
That is, the GA is used to compute the 'argmin.'
Sorry if that was a little pedantic, but I feel like genetic algorithms and neural networks need to be demystified: There's no intimate connection between the two, but GAs can be used to solve problems related to neural networks, particularly training.
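To make that concrete, here is a minimal sketch in Python of a GA computing that argmin. The "network" is just a single linear neuron standing in for f, and the population size, truncation selection, and mutation rate are arbitrary choices for illustration, not recommendations:

    import random

    # Toy stand-in for the network f(x, w): a single linear neuron.
    def f(x, w):
        return sum(xi * wi for xi, wi in zip(x, w))

    # e(w): sum of squared errors over the training data.
    def e(w, data):
        return sum((f(x, w) - y) ** 2 for x, y in data)

    # GA used purely as a generic minimizer: returns an approximation
    # of w_optimal = argmin_w e(w).
    def ga_argmin(data, dim, pop_size=50, generations=200):
        pop = [[random.uniform(-1, 1) for _ in range(dim)] for _ in range(pop_size)]
        for _ in range(generations):
            pop.sort(key=lambda w: e(w, data))
            survivors = pop[:pop_size // 2]          # truncation selection
            children = []
            while len(survivors) + len(children) < pop_size:
                a, b = random.sample(survivors, 2)
                child = [random.choice(pair) for pair in zip(a, b)]   # uniform crossover
                child = [wi + random.gauss(0, 0.05) for wi in child]  # Gaussian mutation
                children.append(child)
            pop = survivors + children
        return min(pop, key=lambda w: e(w, data))

You'd call it as ga_argmin(data, dim=2) with data as a list of (x, y) pairs such as [([1.0, 2.0], 3.0), ...].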
Quote: Original post by InnocuousFox
Let me get this straight... you are using random changes in a NN?
Indeed, the weights are randomised to start with, then moved up and down randomly by a small amount after fitness has been evaluated. It does reliably converge.
However, I did a run last night with sexual reproduction, and performance has fallen by about 30%. Perhaps sexual reproduction just doesn't work for this sort of neural network? Either that, or I have applied it naively.
So I will fall back now to asexual reproduction and see what happens.
I have also had the notion of reducing all weights by some small amount, perhaps 0.5%. Die-off seems to be a method animal brains use, and it makes a sort of sense.
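Concretely, the step I have in mind would look something like this (a sketch; the 0.5% decay is just the figure above, and the perturbation size is illustrative):

    import random

    def step_with_die_off(weights, step=0.001, decay=0.005):
        # Perturb each weight slightly, then shrink everything by ~0.5%,
        # so weights that are never pushed back up gradually die off.
        return [(w + random.uniform(-step, step)) * (1.0 - decay) for w in weights]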
Quote: Original post by BanEvadingGordon
I use asexual reproduction where the weights move randomly by a maximum of 0.1%, and convergence occurs. I wonder if I should use sexual reproduction, as I understand that lets the whole algorithm search the solution space cooperatively, so convergence may be quicker.
But if I do so, then the question is: how does one split up the genes? Is it best to consider a gene to be everything required to encode one layer, or maybe one gene per weight? Has anybody here had some success with sexual reproduction and neural networks?
Your use of the word "flipped" suggests to me that you are encoding the solutions as bit strings. Is this correct? If so, I think you might get farther with a real encoding (each weight is a real number, crossover involves entire weights rather than bit subsets of weights, and mutation is simply addition or multiplication of noise with the original values).
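In sketch form (the mutation rate and noise scale are arbitrary):

    import random

    def crossover(parent_a, parent_b):
        # Real encoding: swap whole weights between parents,
        # never bit substrings of a weight's representation.
        return [random.choice(pair) for pair in zip(parent_a, parent_b)]

    def mutate(weights, rate=0.1, sigma=0.01):
        # Mutation is just additive Gaussian noise on a few weights.
        return [w + random.gauss(0, sigma) if random.random() < rate else w
                for w in weights]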
-Will Dwinnell
Data Mining in MATLAB