
function approximation methods

Started by October 29, 2006 10:53 AM
11 comments, last by Rockoon1 18 years ago
Quote: Original post by Kylotan
But let's not forget that many do not need a fine dining table, only a flat surface they can place a plate upon. :)


If all you want is an approximation to a solution (perhaps so you can get a feel for the rough properties of the solution), then there are simpler methods than ANNs to get them. If you are looking for a solution to a function approximation problem, then an ANN is almost never the best solution. It's just one that works approximately in many situations. In that sense, yes, it's like tossing a piece of chipboard over two A-frame sawhorses to make a multi-purpose flat surface. It might be approximately flat, but you probably wouldn't want to serve a dinner party on it or use it for a game of pool.

Quote: Neural nets and genetic algorithms are certainly overhyped, but there is definitely something to be said for simple, well-known approaches over more accurate solutions that are more complex to understand and implement.


This is one of my personal irks... that most people assume that alternative methods are more complex to understand and implement. This is certainly not the case. Spline methods, for example, are far easier to comprehend and just as easy to implement as an ANN. What scares people off is that many (if not most) of these other methods are not written down in 'lay person books' and they have to hunt around for them in more technical resources. Most people therefore assume that they're hard.

Of course, most of this can be traced back to society's general aversion to mathematics... but that's a story for another day! ;)

Cheers,

Timkin
Quote: Original post by Timkin
If all you want is an approximation to a solution (perhaps so you can get a feel for the rough properties of the solution), then there are simpler methods than ANNs to get them.


What is simpler than a sum of weights multiplied by inputs? The hardest bit is understanding the calculus behind the training and lots of people even skip that by using a genetic algorithm instead!
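That "sum of weights multiplied by inputs" really is the whole forward pass of a single artificial neuron. As a minimal illustrative sketch (the function name, example weights, and choice of sigmoid activation are my own, not from the thread):

```python
import math

def neuron(inputs, weights, bias):
    """Single neuron: weighted sum of inputs plus a bias,
    squashed through a sigmoid activation."""
    total = bias + sum(w * x for w, x in zip(weights, inputs))
    return 1.0 / (1.0 + math.exp(-total))

# Example: two inputs with arbitrary weights
out = neuron([0.5, -1.0], [0.8, 0.3], bias=0.1)
```

The calculus the poster mentions only enters when you train the weights (e.g. backpropagation); evaluating the network is just this arithmetic repeated per neuron.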

Quote: This is one of my personal irks... that most people assume that alternative methods are more complex to understand and implement. This is certainly not the case. Spline methods, for example, are far easier to comprehend and just as easy to implement as an ANN.


I would expect that they're only easy to comprehend and implement if you have a good grasp of the underlying mathematics, which most people don't have and never will have. I can't see a single site on Google that shows how to use splines for function approximation in any sort of form accessible to non-mathematicians.

A lot of the suggestions you post here are often relatively advanced, which of course is not surprising given your level of education. But I don't think you can expect everybody to understand them all, or do the same amount of background reading that you have in order to do so.
Quote: Original post by Kylotan
A lot of the suggestions you post here are often relatively advanced, which of course is not surprising given your level of education. But I don't think you can expect everybody to understand them all, or do the same amount of background reading that you have in order to do so.


It's only 'advanced' because he took it to the logical conclusion.

The idea is quite simple, and it's hard to imagine a simpler method: regularly sample the problem space, store the samples in a table, and then interpolate between them for the desired approximation.
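The sample-and-interpolate idea above can be sketched in a few lines. This is my own minimal version (function names and the sin example are illustrative assumptions), using the linear interpolation variant mentioned just below:

```python
import math

def build_table(f, lo, hi, n):
    """Regularly sample f over [lo, hi] into a table of n+1 values."""
    step = (hi - lo) / n
    return [f(lo + i * step) for i in range(n + 1)]

def lerp_lookup(table, lo, hi, x):
    """Approximate f(x) by linearly interpolating between the two
    nearest table entries."""
    t = (x - lo) / (hi - lo) * (len(table) - 1)
    i = min(int(t), len(table) - 2)  # clamp so i+1 stays in range
    frac = t - i
    return table[i] * (1.0 - frac) + table[i + 1] * frac

# Example: approximate sin over [0, pi] with just 16 intervals
table = build_table(math.sin, 0.0, math.pi, 16)
approx = lerp_lookup(table, 0.0, math.pi, 1.0)  # compare to math.sin(1.0)
```

Even this coarse table lands within about a thousandth of the true value of sin(1.0), which illustrates the poster's point: no training phase, no machine learning, just sampling and lookup.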

If he had suggested linear interpolation instead of splines, someone different probably would have jumped on it for being an overly simplistic (and inferior) solution. Don't blame him for the catch-22.

(Personally I would be happy with a simpler cubic interpolation rather than splines, as there are only small gains in going beyond cubic)
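One common cubic scheme that fits this table-lookup approach is Catmull-Rom interpolation, which blends between two samples using their two neighbours to estimate slopes. A minimal sketch (my own formulation of the standard Catmull-Rom basis, not code from the thread):

```python
def catmull_rom(p0, p1, p2, p3, t):
    """Cubic (Catmull-Rom) interpolation between samples p1 and p2,
    with p0 and p3 supplying slope estimates. t runs from 0 to 1."""
    return (
        p1
        + 0.5 * t * (p2 - p0)
        + 0.5 * t * t * (2 * p0 - 5 * p1 + 4 * p2 - p3)
        + 0.5 * t * t * t * (3 * (p1 - p2) + p3 - p0)
    )
```

It passes exactly through p1 at t=0 and p2 at t=1, so swapping it in for linear interpolation in a table lookup only changes the blend step, at the cost of reading four table entries instead of two.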

This sort of approximation method is well suited to most problem domains, doesn't require any machine learning algorithms, and is trivially implemented. In short, it's usually the best solution for function estimation.

Leave the machine learning algorithms for cases where you cannot trivially sample the problem space.

