
What is your take on this?

Started by April 06, 2006 10:44 AM
22 comments, last by uncutno 18 years, 7 months ago
Quote: Original post by mnansgar
yet humans have only been able to survive thus far with mathematics by approximating nonlinear systems with linear models.


I disagree with this statement emphatically. I for one work in a field where I am constantly analysing complex, adaptive (nonlinear) systems and designing them, most particularly for the control of other complex adaptive systems. I certainly don't rely on linearisation when creating such systems. Indeed, in my current work I don't even rely on system identification techniques (modelling the system) when designing controllers, so there is nothing to linearise!

My personal belief is that we will see paradigm shifts in the way we describe and build complex adaptive systems. I have lots of reasons to believe this, but most of them are beyond the breadth of this discussion (and are rather mathematical). Partly though, it's because I work in this area and I'd like to think that one day all of the advances that I am seeing will actually amount to something beautiful and simple, just as many other areas of advanced mathematics do! 8)

I guess that coming from a mathematics background, I'm a bit biased though! ;)

[Edited by - Timkin on April 9, 2006 9:57:45 PM]
Quote: Original post by Sagar_Indurkhya
By Dr. Rodney Brooks, Dept Head of the MIT Computer Science and Artificial Intelligence Lab

Quote:
Newer Math?
A new high-school mathematics might someday model complex adaptive systems.

By Rodney Brooks

While prognostications about "the end of science" might be premature, I think most of us expect that high-school mathematics, and even undergraduate math, will remain pretty much the same for all time. It seems math is just basic stuff that's true; there won't be anything new discovered that's simple enough to teach to us mortals.

But just maybe, this conventional wisdom is wrong. Perhaps sometime soon, a new mathematics will be developed that is so revolutionary and elegantly simple that it will appear in high-school curricula. Let's hope so, because the future of technology -- and of understanding how the brain works -- demands it.

My guess is that this new mathematics will be about the organization of systems. To be sure, over the last 50 years we've seen lots of attempts at "systems science" and "mathematics of systems." They all turned out to be rather more descriptive than predictive. I'm talking about a useful mathematics of systems.

Currently, many different forms of mathematics are used to model and understand complicated systems. Algebras can tell you how many solutions there might be to an equation. The algebra of group theory is crucial in understanding the complex crystal structures of matter. The calculus of derivatives and integrals lets you understand the relationships between continuous quantities and their rates of change. Such a calculus is essential to predicting, for example, how long a tank of water would take to drain when the rate of flow fluctuates with the amount of water still in the tank.

The list goes on: Boolean algebra is the core tool for analyzing digital circuits; statistics provides insight into the overall behavior of large groups that have local unpredictability; geometry helps explain abstract problems that can be mapped into spatial terms; lambda calculus and pi-calculus enable an understanding of formal computational systems.

Still, all these tools have provided only limited help when it comes to understanding complex biological systems such as the brain or even a single living cell. They are also inadequate to explaining how networks of hundreds of millions of computers work, or how and when artificial evolutionary techniques -- applied to fields like software development -- will succeed.

These are just a few examples of what are sometimes referred to as complex adaptive systems. They have many interacting parts that change in response to local inputs and as a result change the global behavior of the complete system. The relatively smooth operation of biological systems -- and even our human-constructed Internet -- is in some ways mysterious. Individual parts clearly do not have an understanding of how other individual parts are going to change their behavior. Nevertheless, the ensemble ends up working.

We need a new mathematics to help us explain and predict the behavior of these sorts of systems. In my own field, we want to understand the brain so we can build more intelligent robots. We have primitive models of what individual neurons do, but we get stuck using the tools of information theory in trying to understand the "information content" that is passed between neurons in the timing of voltage spikes. We try to impose a computer metaphor on a system that was not intelligently designed in that way but evolved from simpler systems.

My guess is that a new mathematics for complex adaptive systems will emerge, one that is perhaps no more difficult to understand than topology or group theory or differential calculus and that will let us answer essential questions about living cells, brains, and computer networks.

We haven't had any new household names in mathematics for a while, but whoever figures out the structure of this new mathematics will become an intellectual darling -- and may actually succeed in designing a computer that comes close to mimicking the brain.

Rodney Brooks directs MIT's Computer Science and Artificial Intelligence Laboratory.


I've been thinking about this for some time. I suppose one would start with a simple graph and derive a calculus to model the evolution of the graph? I'm still tinkering with the idea, and I'll write up what I figure out tonight.
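
For what it's worth, here is roughly where I'd start tinkering: a toy graph whose edges rewire according to a purely local rule, so you can watch a global property evolve step by step. This is just a rough Python sketch (the networkx library is used only for convenience, and the rewiring rule is completely arbitrary):

import random
import networkx as nx

def step(g, p_rewire=0.1):
    """One update: each edge is rewired to a random non-neighbour with probability p_rewire."""
    for u, v in list(g.edges()):
        if random.random() < p_rewire:
            candidates = [w for w in g.nodes() if w != u and not g.has_edge(u, w)]
            if candidates:
                g.remove_edge(u, v)
                g.add_edge(u, random.choice(candidates))

# Start from a regular ring lattice and watch the clustering coefficient decay
# as local rewiring accumulates into a global structural change.
g = nx.watts_strogatz_graph(n=50, k=4, p=0.0)
for t in range(20):
    print(t, round(nx.average_clustering(g), 3))
    step(g)

The interesting (and hard) part would be finding the analogue of a derivative: something that predicts how a global quantity like that clustering number changes from the local rule alone, without having to run the simulation.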

Any thoughts on his proposal?




"will emerge"

If it hasn't already been kicked around by theoretical mathematicians for several decades, it's not likely to emerge (or, more likely, it's simply not a subject that can be reduced to a 'simple' system that can be injected into high-school students). Kids these days barely get the fundamentals in much simpler subjects; is it likely this will ever be taught???

I think the problem with adaptive systems is resources. Having unlimited resources, or always more than enough of them, would give us the possibility of not worrying about being so mathematically precise to avoid wasting processing/time/energy. I don't think nature is so worried about how things work from a mathematical point of view. Things just tend to group in certain ways under different circumstances, and the only thing that matters is whether the "thing" works, not "how", or many times "how efficiently". What does it need? A bigger brain? Have it. 40 tons of weight? Done.
[size="2"]I like the Walrus best.
Quote: Original post by Timkin
Quote: Original post by mnansgar
yet humans have only been able to survive thus far with mathematics by approximating nonlinear systems with linear models.


I disagree with this statement emphatically. I for one work in a field where I am constantly analysing complex, adaptive (nonlinear) systems and designing them, most particularly for the control of other complex adaptive systems. I certainly don't rely on linearisation when creating such systems. Indeed, in my current work I don't even rely on system identification techniques (modelling the system) when designing controllers, so there is nothing to linearise!

My personal belief is that we will see paradigm shifts in the way we describe and build complex adaptive systems. I have lots of reasons to believe this, but most of them are beyond the breadth of this discussion (and are rather mathematical). Partly though, it's because I work in this area and I'd like to think that one day all of the advances that I am seeing will actually amount to something beautiful and simple, just as many other areas of advanced mathematics do! 8)

I guess that coming from a mathematics background, I'm a bit biased though! ;)



I come from an engineering background, but I'm pursuing a doctorate in computational neuroscience. So, I see your point that there exists a subset of complex adaptive systems which CAN be designed (neural networks come to mind), but I still stand by my previous statement since linear systems are/have been so important to engineering.

Practical engineering techniques heavily rely on linearization, especially if you'll agree that numerical analyses are typically just iterated small linearizations (Euler's method being the most primitive example). In designing modern circuits, we still rely on crude linearizations of the transistor I-V curves and other solid-state components. I'm sure that you're familiar with the popular sin(theta) ~= theta (when theta is small) approximation used for simplifying a multitude of formulas. You're much more of an expert in automatic control than I am, but I have the impression that your nonlinear work is more of the exception than the norm given the limited resources often allocated to controllers.
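
To make that concrete, Euler's method really is just a chain of tiny linearisations: each step replaces the nonlinear dynamics with their tangent line over a small dt. A quick Python sketch on the pendulum equation (the step size and parameters are made up for illustration):

import math

g_over_L = 9.81          # g/L for a 1 m pendulum (illustrative numbers)
dt = 0.001
theta, omega = 0.5, 0.0  # initial angle (rad) and angular velocity

# Forward Euler on theta'' = -(g/L)*sin(theta): each step is a local linearisation.
for _ in range(int(2.0 / dt)):                    # simulate 2 seconds
    theta, omega = (theta + omega * dt,
                    omega - g_over_L * math.sin(theta) * dt)

print("Euler, full nonlinearity:", round(theta, 4))

# The small-angle trick sin(theta) ~= theta makes the equation linear outright,
# with the closed-form solution theta0 * cos(sqrt(g/L) * t); fine while theta stays small.
print("small-angle closed form: ", round(0.5 * math.cos(math.sqrt(g_over_L) * 2.0), 4))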

I can also see your rationale for emphatic disagreement -- indeed, there are many classes of nonlinear systems which we can analyze and design. However, I argue that these are typically (1) limited to classes of equations which have been rigorously studied, (2) require iterative methods which often require intuition (e.g. number of layers/hidden nodes), (3) require vast computing resources for accuracy, drastically limiting their usefulness in practical systems, and (4) even so are typically limited to qualitative/numerical analyses. I will leave on a positive note though, saying that as you indicate, their use is definitely increasing nowadays perhaps largely due to the availability of fast computing resources.

I for one would be very interested in hearing your reasons to believe paradigm shifts may occur. Thanks for sharing!
h20, member of WFG 0 A.D.
Hehe...after posting this I realised how long it had become... my apologies...

Quote: Original post by mnansgar
I'm pursuing a doctorate in computational neuroscience.


Off Topic: What's your thesis on? I've worked in CN previously, most particularly on seizure prediction algorithms and image segmentation and registration algorithms. It's a fascinating field. 8)

Back on topic...

Quote: but I still stand by my previous statement since linear systems are/have been so important to engineering.


There's a huge difference between mathematics and engineering, between what we know and what we can make/sell. For example, we teach undergraduate engineering students that the gradient of a scalar field is a vector. It isn't. It's a one-form. It just so happens that in Euclidean space, one-forms and vectors have equivalent properties. Step outside of Euclidean space and this equivalence doesn't hold, meaning analysis based on this assumption would fail. Why do we teach these students a 'lie'? Because in general, they'll never need to know they weren't taught the truth. ;) If they do, then we teach them the truth of the matter (which generally only happens when they learn the error of their ways and become mathematicians! ;) )
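
In coordinates the distinction is easy to state: the differential of a scalar field $f$ is the one-form with components $(df)_i = \partial f / \partial x^i$, while the gradient vector only exists once a metric $g$ is brought in to raise the index:

$$ (\nabla f)^{i} = g^{ij}\,\frac{\partial f}{\partial x^{j}} $$

In Euclidean coordinates $g^{ij} = \delta^{ij}$, so the two sets of components coincide and the 'lie' is harmless; with any other metric they differ, and so does any analysis built on conflating them.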

Quote: Practical engineering techniques heavily rely on linearization


Yes, they do. In part, because we don't bother to teach engineers advanced mathematical analysis and design techniques and because linearisation works on many real world problems... but that's because most of the real world problems we deal with in every day life are quasi-linear. That's more of a statement about the domain over which engineering presides (and can survive while presiding over), rather than our inability to deal with tougher problems.

There are obviously exceptions to this in which the problems we are trying to engineer solutions for are highly nonlinear. In these cases, linearisation is often applied, but only because the engineer involved doesn't know a better technique, doesn't have the time/money to develop a better implementation or doesn't want to implement anything 'new'. I face this attitude regularly when dealing with in-house engineers of our industry partners. The tools are there though and if you look at engineering R&D, you'll certainly see nonlinear analysis and design techniques being implemented, particularly in areas like Control Theory.

Quote: especially if you'll agree that numerical analyses are typically just iterated small linearizations


Most certainly not. If you restrict your view to only finite difference analyses of differential models/systems, then perhaps so... but you're ignoring a wealth of techniques that don't rely on any linearisation of a system model. For example: phase space analysis, spectral analysis, statistical analysis and functional analysis, to name but a few.
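
Just to illustrate one of those: you can learn a lot about a nonlinear system straight from its trajectory, with no linearised model anywhere in the pipeline. A rough Python/numpy sketch of the spectral-analysis idea, using the logistic map purely as a convenient nonlinear specimen:

import numpy as np

# Iterate the logistic map x_{n+1} = r*x*(1-x) and inspect the power spectrum
# of the orbit. No linearisation of the map is involved at any point.
r, x = 3.9, 0.2               # r = 3.9 sits in the chaotic regime
orbit = []
for _ in range(4096):
    x = r * x * (1.0 - x)
    orbit.append(x)

signal = np.array(orbit) - np.mean(orbit)
power = np.abs(np.fft.rfft(signal)) ** 2

# A broadband spectrum with no dominant lines is a classic signature of chaos;
# rerun with r = 3.5 and a few sharp peaks from the period-4 cycle appear instead.
print("strongest frequency bins:", np.argsort(power)[-5:][::-1])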

Quote: but I have the impression that your nonlinear work is more of the exception than the norm given the limited resources often allocated to controllers.

Certainly industry still relies on simple solutions (because they're easy to understand and sell as ideas to management), but nonlinear methods have existed since the earliest days of control. Today, there are certainly many more linear devices (such as PIDs) than nonlinear ones, but that's more to do with the inertia involved in shifting industry than any lack of knowledge regarding nonlinear methods.

Quote: I argue that these are typically (1) limited to classes of equations which have been rigorously studied


So because it has been rigorously studied, that makes it exempt from the nonlinear-versus-linearisation comparison? I think perhaps you should have argued along the lines of the "size of the class of problems that have been rigorously studied". Indeed, anything beyond second order is normally the realm of mathematicians (engineers often deal with so-called 'ideal second-order systems')... and third-order systems become tough to analyse, requiring advanced tools such as Lie algebras and asymptotic methods... but this is only if you're trying to predict state evolution exactly. If you want to analyse the system for its performance (which is quite often all we require of engineered systems) then you don't need to know the exact state; only that it is stable, under what conditions it might traverse to instability, and that it meets a performance criterion.
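
A tiny textbook example of what I mean by settling for stability rather than exact prediction (a toy system, not from my own work): for $\dot{x} = -x^{3}$ the linearisation at the origin is useless, since its eigenvalue is zero and tells you nothing, yet the Lyapunov function $V(x) = x^{2}$ settles the matter in one line:

$$ \dot{V}(x) = 2x\,\dot{x} = -2x^{4} \le 0 $$

with equality only at $x = 0$, so the origin is asymptotically stable... and we never had to solve for $x(t)$ at all.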

Quote: (2) require iterative methods which often require intuition (e.g. number of layers/hidden nodes)

There's nothing wrong with a good iterative learning method, so long as you have the time and the data! ;) Structural learning is certainly possible, but what you find is that human intuition is often quite a good first guess.

Quote: (3) require vast computing resources for accuracy, drastically limiting their usefulness in practical systems


Yes, you have a point here, when you compare the resources required to analyse a nonlinear system compared to a simple set-point linearisation (which can be achieved by something as simple as linear regression of the local data). Certainly, nonlinear analysis techniques are data intensive and resource consuming. However, used appropriately, on many problems the increased performance far outweighs the cost.


Quote: (4) even so are typically limited to qualitative/numerical analyses.


I disagree with that. I can (almost) just as easily fit a second order polynomial model to anything you can linearise. That's neither qualitative nor numerical. If, however, I estimate the parameters of my model online, then it certainly is numerical... but then so is the linearisation.
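
To put a rough number on that, here is a set-point linearisation done by ordinary least squares next to a quadratic fit of the same data; the tanh 'plant' and the operating point are invented purely for the illustration (Python/numpy):

import numpy as np

def plant(u):
    return np.tanh(u)             # stand-in for some smooth nonlinear response

u0 = 0.8                          # operating (set) point
u = np.linspace(u0 - 0.3, u0 + 0.3, 50)
y = plant(u)

linear    = np.polyfit(u, y, 1)   # the classic set-point linearisation
quadratic = np.polyfit(u, y, 2)   # barely any harder to fit or to evaluate

u_test = 1.3                      # step well away from the set point
print("true response:  ", round(float(plant(u_test)), 4))
print("linear model:   ", round(float(np.polyval(linear, u_test)), 4))
print("quadratic model:", round(float(np.polyval(quadratic, u_test)), 4))

Away from the operating point the quadratic degrades far more gracefully, and nothing about obtaining it was any more 'numerical' than the linear fit.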

Quote: I will leave on a positive note though, saying that as you indicate, their use is definitely increasing nowadays perhaps largely due to the availability of fast computing resources.


...and broader acceptance of techniques... which is sort of the point of Rodney's statements... that if we can reduce the science of complex adaptive systems to a description that a secondary school student can understand, then we will see revolutionary change in how we perceive and control our environment around us. Of course, this would also be true if we could teach quantum mechanics to pre-schoolers.

Quote: I for one would be very interested in hearing your reasons to believe paradigm shifts may occur.


Actually this has a lot to do with my pessimism about science and scientific method, particularly with regards to how we approach problems. Complex systems research is a good example. We bring many of our preconceptions about dynamic systems (developed from years of linear analysis techniques ;) ) to the table when we try and analyse these systems. Like much of science, we try and atomise the problem to understand it... and in parallel systems, that simply doesn't work. Yet most of these systems are comprised of simple elements, interacting with simple rules. It's the internal balance and harmony of the interplay of the components that enables these systems to survive and be observed. I firmly believe we will find a way to mathematically describe these systems, because we already have some of the important tools that open our eyes to what is going on within them. I just believe we're trying to describe these systems in the wrong way. We haven't worked out exactly what our tools are telling us and what we're not seeing yet.

My optimism also stems from the simplicity of the substructure of these systems. Simple elements can be described in simple ways. The complexity arises when you try and describe the global properties in terms of local properties. I think we'll find a way of doing that, and that it will, at its heart, be simple and beautiful, just like the systems it describes.

Cheers,

Timkin
While a nice wish list, that's all it is. Sadly, it's not even original, and I'm a bit surprised nobody has brought this up yet.

The Foundation series, written in the 1940s by Isaac Asimov, envisioned the type of system Mr. Brooks is looking for.

From Wiki:
Quote:
The premise of the series is that mathematician Hari Seldon has spent his life developing a branch of mathematics known as psychohistory, a concept devised by Asimov and his editor John W. Campbell. It uses the law of mass action to predict the future on a large scale, such as of planets or empires


It's a magimatical theory that allows our fictitious Mr. Seldon to determine the outcome of events involving many complex interactions many years ahead of time.

That said, it's unlikely any human will develop such a system. As humans, we're limited to human concepts and human thinking. For example, while many believe that 1+1 is a universal truth, there isn't really anything universal about it.

Concepts such as 1, or 2, might be strictly human, much like 'red' or 'hot'.

Will
------------------http://www.nentari.com
Quote: Original post by RPGeezus
Concepts such as 1, or 2, might be strictly human, much like 'red' or 'hot'.


Certainly for humans, the concepts related to the numbers 1 to 5 are built into our brains (not learned). That is, we can inherently recognise the difference between collections containing between 1 and 5 objects, and we can do this without using any of the areas of the brain normally associated with numerical reasoning or number representation.

If I recall correctly, other species have similar inbuilt concepts, although the size of the sets varies. So it might be reasonable to state that while attaching labels to the sizes of sets is a human ability, there is also a non-human ability to recognise quantity, if only to a limited degree.

Cheers,

Timkin
Quote: Original post by Timkin
If I recall correctly, other species have similar inbuilt concepts, although the size of the sets varies. So it might be reasonable to state that while attaching labels to the sizes of sets is a human ability, there is also a non-human ability to recognise quantity, if only to a limited degree.

Cheers,

Timkin


Agreed. I know certain animals are very good at 'more' and 'less', but still incapable of '1', '2', '3'..

Will

------------------http://www.nentari.com
Quote: Original post by RPGeezus
I know certain animals are very good at 'more'...


Human toddlers are good at this one too ;)
You could say brains are like a program: they are given the basic syntax but then have to reprogram themselves. The question is how they reprogram themselves. I would guess that, since most humans are generally similar at birth, there must also be some "pre-programmed" start-up routine which uses the syntax they are given to evaluate inputs and outputs and set the learning process in motion. This start-up program could be inherited from previous generations, which would result in some kind of civil evolution.

TwinX

