
Genetic Algorithms and Neural Networks

Started by February 18, 2007 04:59 PM
11 comments, last by Timkin 17 years, 9 months ago
Hey all, I am interested in doing some research on GA/NNs, but the majority of tutorials are either a) not coding related or b) too simple. I have no problem reading through those that are not coding related, but I am hoping that maybe someone has some more 'complex' tutorials, or maybe could recommend a book or two. I don't care what language it uses ... I just want to see the general design structure and implementation of GA/NN technology. I get the basics, but this field seems to ramp up quickly in terms of difficulty, which is why I would like to see example code alongside the math. Anybody know anything that might fit the bill? Thanks! Corey
Quote: I get the basics, but this field seems to ramp up quickly in terms of difficulty, which is why I would like to see example code alongside the math.

It is numerical analysis, so you should expect a lot of advanced math. Effectively using these systems requires a solid math background; there is no way around it.

I don't know of any good complex examples of source code. There are many simple code samples for various ANN types. If you understand the theories, then these very simple ANN implementations are more than sufficient.

Ultimately all you have is an array for input, an array of magically generated ANN data innards that you shouldn't modify, and a bit of simple source code that turns the input into your output.

The simplicity is the power of these things. It only takes a few lines of code and a few hundred ANN-generated numbers to build an effective text recognizer that runs nearly instantly.
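
To make that concrete, here is a minimal sketch of what "an input array, a block of trained numbers, and a few lines of code" can look like: a single fully connected layer with a sigmoid activation. The layer sizes, weight values and function names are made up for illustration; in a real system the weights would come from training rather than being typed in.

```cpp
// Minimal sketch: one fully connected layer with a sigmoid activation.
// The weights and bias here are placeholders; real ones come from training.
#include <cmath>
#include <cstdio>
#include <vector>

double sigmoid(double x) { return 1.0 / (1.0 + std::exp(-x)); }

// outputs[j] = sigmoid( bias[j] + sum_i inputs[i] * weights[j][i] )
std::vector<double> feedForward(const std::vector<double>& inputs,
                                const std::vector<std::vector<double>>& weights,
                                const std::vector<double>& bias)
{
    std::vector<double> outputs(weights.size());
    for (std::size_t j = 0; j < weights.size(); ++j) {
        double sum = bias[j];
        for (std::size_t i = 0; i < inputs.size(); ++i)
            sum += inputs[i] * weights[j][i];
        outputs[j] = sigmoid(sum);
    }
    return outputs;
}

int main()
{
    // Two inputs, one output neuron; the numbers are arbitrary.
    std::vector<double> inputs  = {0.5, -1.0};
    std::vector<std::vector<double>> weights = {{0.8, -0.3}};
    std::vector<double> bias    = {0.1};

    std::vector<double> out = feedForward(inputs, weights, bias);
    std::printf("output = %f\n", out[0]);
    return 0;
}
```

Everything else (training, choosing the topology, preparing the data) is where the real work and the real math live.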

Quote: I have no problem reading through those that are not coding related, but I am hoping that maybe someone has some more 'complex' tutorials, or maybe could recommend a book or two.

Lots of math: Fundamentals of Artificial Neural Networks, M.H. Hassoun. It is a bit of money, but it's a good graduate level text.

Slightly less math: Machine Learning by Tom M. Mitchell.

Neither book presents source code, but if you are able to understand the math, writing the code for it is trivial.

Quote: I just want to see the general design structure and implementation of GA/NN technology. I get the basics, but this field seems to ramp up quickly in terms of difficulty, which is why I would like to see example code alongside the math.

Good simulations: Neural and Adaptive Systems: Fundamentals Through Simulations, by Principe et al.

Here is a big warning: This book feels like an expensive ad for their commercial software. You have been warned.

This book is basically a tiny bit of explanatory math and a whole lot of simulators, which are limited versions of a commercial product. The tools let you build and execute your own systems, stepping through them and monitoring them as they run. The GUI lets you construct very complex networks, and view graphs and details about each network node while setting breakpoints and other nifty things. It is useful if you have trouble visualizing what is going on inside the pieces.
fup's site and book (AI Techniques for Game Programming) have some pretty good information. The book comes with a lot of source code and it's easy to read, but it's not very math heavy. It focuses more on the implementation and use of ANNs.
http://www.ai-junkie.com

NNs are easy to code, but difficult to understand why/how they work. The advanced tutorials probably don't cover coding because they assume that if you're far enough along to want to actually understand why and how everything works, you've probably already written NNs or know how to code well enough to do so from the theory.
In my experience, both GAs and NNs are fairly easy conceptually. It should only take a few tutorials and a little time to understand all the basics. It's almost better to start with these simple examples instead of very complex ones, because both techniques can be applied to a number of problems in a number of different ways. Once you reach beyond the basics, there really isn't a 'standard' way of doing things. They are more just tools that you can tweak, change and play around with until you find what works best for you.

For instance, most books will talk about back propagation as the way of training an ANN. There are plenty of other methods that work in a variety of situations, such as using a GA or simulated annealing. I found that just playing around with these two models was the best way to learn. You will benefit more if you don't look at a book's way to solve the problem, but think about it for yourself and come up with a method you think will work. Chances are it will, to some extent. After that, you can look back and see what the 'real' solution is.
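
To illustrate the "train it with a GA instead of backprop" idea, here is a rough, mutation-only sketch (no crossover): the nine weights of a tiny 2-2-1 network are treated as the genome and evolved to approximate XOR. Population size, mutation strength and generation count are arbitrary choices, not recommendations, and there is no guarantee a particular run finds a perfect solution.

```cpp
// Rough sketch: evolve the weights of a 2-2-1 network to approximate XOR
// using mutation-only evolution (no crossover, greedy per-slot replacement).
#include <algorithm>
#include <cmath>
#include <cstdio>
#include <random>
#include <vector>

using Genome = std::vector<double>;  // 9 values: 2x2 + 2 hidden biases, 2 + 1 output bias

double sigmoid(double x) { return 1.0 / (1.0 + std::exp(-x)); }

// Forward pass of a fixed 2-2-1 network whose weights come from the genome.
double evaluate(const Genome& w, double x0, double x1)
{
    double h0 = sigmoid(w[0] * x0 + w[1] * x1 + w[2]);
    double h1 = sigmoid(w[3] * x0 + w[4] * x1 + w[5]);
    return sigmoid(w[6] * h0 + w[7] * h1 + w[8]);
}

// Fitness: invert the summed squared error on the XOR table so higher is better.
double fitness(const Genome& w)
{
    static const double data[4][3] = {{0,0,0}, {0,1,1}, {1,0,1}, {1,1,0}};
    double error = 0.0;
    for (const auto& row : data) {
        double diff = evaluate(w, row[0], row[1]) - row[2];
        error += diff * diff;
    }
    return 1.0 / (1.0 + error);
}

int main()
{
    std::mt19937 rng(42);
    std::uniform_real_distribution<double> init(-2.0, 2.0);
    std::normal_distribution<double> jitter(0.0, 0.3);

    std::vector<Genome> pop(50, Genome(9));
    std::uniform_int_distribution<std::size_t> pick(0, pop.size() - 1);
    for (auto& g : pop)
        for (auto& w : g) w = init(rng);

    for (int gen = 0; gen < 1000; ++gen) {
        for (std::size_t i = 0; i < pop.size(); ++i) {
            // Mutate a copy of a random parent; it replaces slot i only if fitter.
            Genome child = pop[pick(rng)];
            for (auto& w : child) w += jitter(rng);
            if (fitness(child) > fitness(pop[i])) pop[i] = child;
        }
    }

    const Genome& best = *std::max_element(pop.begin(), pop.end(),
        [](const Genome& a, const Genome& b) { return fitness(a) < fitness(b); });

    for (double x0 : {0.0, 1.0})
        for (double x1 : {0.0, 1.0})
            std::printf("%g XOR %g -> %.3f\n", x0, x1, evaluate(best, x0, x1));
    return 0;
}
```

The point isn't that this is a good way to train a network (backprop is far more efficient for this problem); it's that once you see the weights as just a vector of numbers and the error as just a fitness score, any optimizer you like can do the training.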

That being said, for ANNs I would recommend Neural Smithing. It's basically an ANN book that does exactly what I stated above. It gives a brief description of the multilayer perceptron and shows how to train it using backprop, which is what most books/tutorials on the subject cover. However, this book then goes in a different direction. From that point, it starts to talk about all kinds of 'mods' you can do to your network to help it learn better and train faster. It introduces topics such as momentum, pruning, weight decay, etc. It's kind of like an ANN cookbook.
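
As a taste of what those 'mods' look like, here is a toy sketch of momentum plus weight decay bolted onto plain gradient descent. A one-dimensional quadratic stands in for a real network's error surface, and the hyperparameter values are illustrative only, not taken from the book.

```cpp
// Toy sketch: plain gradient descent plus two common "mods":
//   momentum    - reuse a fraction of the previous step to smooth/accelerate
//   weight decay - pull the weight gently toward zero each update
// The objective E = (w - 3)^2 stands in for a real network's error surface.
#include <cstdio>

int main()
{
    const double learningRate = 0.1;    // eta
    const double momentum     = 0.9;    // alpha: fraction of the previous step kept
    const double weightDecay  = 0.01;   // lambda: strength of the pull toward zero

    double w = 0.0;         // the single "weight" being trained
    double velocity = 0.0;  // remembered previous step, needed for momentum

    for (int epoch = 0; epoch < 200; ++epoch) {
        double gradient = 2.0 * (w - 3.0);  // dE/dw for E = (w - 3)^2
        velocity = momentum * velocity
                 - learningRate * (gradient + weightDecay * w);
        w += velocity;
    }
    // The decay term shifts the solution slightly below 3 (to 6/2.01, about 2.985).
    std::printf("final w = %f\n", w);
    return 0;
}
```

In a real network the same two lines are applied per weight, with one velocity value stored alongside each weight.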

I took a grad-level Machine Learning class which went in depth into ANNs. By far the most beneficial lecture was the one where the professor covered the 'tips & tricks' from this book: just 10 or so little modifications that help ANNs perform better. My notes from that day were so useful that I have since copied and laminated them. It's really interesting how simple changes can have an impact on how the network works. For instance, rather than using the sigmoid activation function that is the 'standard', you can (usually) achieve faster learning by using the tanh function and properly scaling the results. The table of contents for the book is viewable on Amazon if you are interested.
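
Here is a small sketch of that tanh-for-sigmoid swap. The only fiddly part is the rescaling, since sigmoid outputs live in (0, 1) while tanh outputs live in (-1, 1); the helper names below are made up for illustration.

```cpp
// Sketch: sigmoid vs. tanh activations, and the rescaling needed to swap them.
#include <cmath>
#include <cstdio>

double sigmoid(double x) { return 1.0 / (1.0 + std::exp(-x)); }

// Map a 0..1 target into tanh's -1..1 output range.
double toTanhRange(double target01) { return 2.0 * target01 - 1.0; }

// Map a tanh output back into 0..1 if the rest of the code expects that.
double toUnitRange(double tanhOut)  { return 0.5 * (tanhOut + 1.0); }

int main()
{
    for (double x : {-2.0, 0.0, 2.0})
        std::printf("x=%+.1f  sigmoid=%.3f  tanh=%+.3f  tanh rescaled to 0..1=%.3f\n",
                    x, sigmoid(x), std::tanh(x), toUnitRange(std::tanh(x)));
    std::printf("a 0..1 target of 0.9 becomes %.1f in tanh's range\n", toTanhRange(0.9));
    return 0;
}
```

The two functions are close relatives (tanh(x) = 2*sigmoid(2x) - 1), so the swap is really just a change of output range plus the steeper slope around zero.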

One thing that you might find confusing about documents on neural networks, and (to a slightly lesser extent) genetic algorithms, is that both fall under the much larger category of Machine Learning. This is an area which involves a lot of background knowledge and domain-specific terminology. If you try to approach NNs without understanding this stuff, you won't have a lot of the context in which to understand the information covered. So spend some time getting to know machine learning; once you have that background (and it isn't all that much stuff to learn) understanding neural networks takes fifteen minutes, tops.

The solution, BTW, is NOT some "Neural Networks For Dummies" which strips out all the ML stuff. That would be a stupid idea because it would presuppose that NNs are actually what you want (which they usually aren't) and that the generalized machine learning stuff is not important to your application (which it always is).
I'm currently taking a class in Machine Learning. Here's the Amazon.com clicky for the textbook. It's pretty heavy on the mathematical details, but does a good job of covering the popular Machine Learning algorithms.
Thanks for all the info guys! This should be more than enough to satiate the brain for the time being!
I've heard that AI Techniques for Game Programming does an excellent job of covering GAs and NNs, but I haven't read it myself so I can't tell you for sure.
The problem you are facing in trying to find a book that covers the math and the code is that the code conveys no information that the math doesn't already provide. Contrary to what programmers think (that they'll learn better if they see it in some code), code only shows you how one person interpreted the problem and implemented it. That doesn't teach you about ANNs or GAs; it teaches you coding skills (the ability to analyse a problem and cast it into a computational scheme), which you really should have before trying to tackle the implementation of AI.

Learning AI is NOT about learning how to code AI.

Quote: Original post by Timkin
Contrary to what programmers think (that they'll learn better if they see it in some code), code only shows you how one person interpreted the problem and implemented it. That doesn't teach you about ANNs or GAs; it teaches you coding skills (the ability to analyse a problem and cast it into a computational scheme), which you really should have before trying to tackle the implementation of AI.

Learning AI is NOT about learning how to code AI.

I disagree with that statement. Code provides an interactive example of a theory in action, which is vital to understanding something. By writing their code, using their example and then debugging their work, you can most definitely learn and understand their implementation. From there, it is typically easy to understand other people's implementations and to come up with your own. To say you can't learn from code is like saying you can't learn by example.


