
Our brains really DON'T work like computers!

Started by June 29, 2005 10:44 PM
51 comments, last by nas1982 19 years, 4 months ago
I was just about to cite measuring temperature as a counterexample, but Songoku, you may be right. Interesting observation.
He is right about the phenomena that have been observed thus far. But we still haven't reached the bottom yet....

Sure, the brain can be simulated. But how can you say it is not complex? Ever try to write code that figures out what the brain figures out? I'm sure you'll say "Duh, look at the forum name". But think again. AI has so far simulated intelligence only in confined, simplified, trivial models of reality (i.e., games). Try to perform some of the continuous operations the brain performs outside of a game, in reality. It would take a physics modeler just to figure out how to apply forces to limbs so they can walk, and if you haven't done any physics engine coding, you have no idea how monstrous the math and algorithms can get.
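To give a feel for it, here is a rough, hypothetical sketch (the masses, lengths and gains are all made up) of just one hinge joint of a leg being driven by a torque against gravity. A real walking controller would have to do this for every joint, every frame, while keeping the whole body balanced.

```cpp
// Minimal sketch of one "thigh" segment pivoting at the hip, driven by a
// naive muscle torque against gravity. All numbers are assumptions.
#include <cstdio>
#include <cmath>

int main() {
    const double mass    = 8.0;                            // kg, assumed
    const double length  = 0.45;                           // m, assumed
    const double inertia = mass * length * length / 3.0;   // rod about one end
    const double g       = 9.81;
    const double dt      = 1.0 / 120.0;                    // physics timestep

    double angle  = 0.3;                                    // rad from vertical
    double angVel = 0.0;

    for (int step = 0; step < 240; ++step) {
        // Torque the "muscle" applies, plus gravity pulling the segment down.
        double muscleTorque  = -15.0 * angle - 2.0 * angVel;         // naive PD control
        double gravityTorque = -mass * g * (length / 2.0) * std::sin(angle);
        double angAccel = (muscleTorque + gravityTorque) / inertia;

        angVel += angAccel * dt;                             // semi-implicit Euler
        angle  += angVel * dt;
    }
    std::printf("final hip angle: %.4f rad\n", angle);
    return 0;
}
```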

You can eventually understand the brain, but you must learn to look for the perfection in the techniques and optimizations it employs. Treating it as poorly laid out garbage is not going to prepare you for the rigors of simulating this masterpiece.
We can simulate flight pretty well, and as far as I know that's more complex than we are. Anyway, I have thought that myself at times. I don't think it's true, but maybe it is, depending on how we define complexity.

And as for real things being made of discrete units, I can't think of a counterexample.
Continuous or discrete, it's more a problem of perspective. Our brain is capable of directly "processing" analog, continuous information through our senses. HOWEVER, all storage is done in a discrete form to save space. The "sampling rate" of that storage varies directly with how important the thing is. The reconstruction process then interpolates the discrete information back into continuous information.

A computer, on the other hand, deals entirely in discrete information. All information going in is discretized before processing. Storage is then discrete as usual. The reconstruction process is done entirely in discrete form and then output to a discrete medium. Yes, all monitors are discrete as well. It's just that most of the time the information is refreshed fast enough that we can't tell (our senses can't keep up). Case in point: some people still see scan lines on CRT monitors even at refresh rates of up to 120Hz.
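A minimal sketch of that sample-then-interpolate idea, with made-up values: a continuous signal is stored only at discrete sample points, and anything in between is reconstructed by linear interpolation.

```cpp
// Store sin(t) at discrete sample points, then estimate a value in between
// by linear interpolation. Sample rate and query time are arbitrary.
#include <cstdio>
#include <cmath>
#include <vector>

int main() {
    const double sampleRate = 8.0;                 // samples per second, assumed
    std::vector<double> samples;

    // "Discretize": keep the signal only at sample points.
    for (int i = 0; i <= 8; ++i)
        samples.push_back(std::sin(i / sampleRate));

    // "Reconstruct": interpolate between the two neighbouring samples.
    double t   = 0.3125;                           // some time between samples
    double pos = t * sampleRate;
    int    i0  = static_cast<int>(pos);
    double f   = pos - i0;
    double reconstructed = samples[i0] * (1.0 - f) + samples[i0 + 1] * f;

    std::printf("true %.5f  reconstructed %.5f\n", std::sin(t), reconstructed);
    return 0;
}
```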

As for my earlier post about the brain being symbolic in nature, I was referring to the general storage unit in the brain. The fact is, the brain is capable of storing all sorts of information as one storage unit. A color, a number, a letter, an idea, an object can all fall into one storage unit. And then those units become the things we work with and manipulate. So, say someone shows you a room. Instead of remembering exactly what the room looks like photographically, we tend to remember where specific objects were located, their relative size and color, and maybe a rough estimate of the dimensions. So, when you're redecorating, you may not know what the whole room will look like in the end; you start out by moving the pieces around mentally until you get something you like. Fundamentally, then, the unit of storage and manipulation for the brain is very symbolic and piece-wise, not to mention slightly random.
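As a loose illustration (names and numbers are purely hypothetical), that kind of symbolic, piece-wise room memory looks less like a bitmap and more like a handful of labelled objects with rough positions and sizes:

```cpp
// A room remembered as a few rough symbols rather than pixels.
#include <string>
#include <vector>

struct RememberedObject {
    std::string name;        // "sofa", "lamp", ...
    std::string colour;      // a rough impression, not an exact RGB value
    float x, y;              // approximate position in the room (metres)
    float approxSize;        // rough footprint, not precise dimensions
};

struct RememberedRoom {
    float roughWidth, roughDepth;            // "maybe four by five metres"
    std::vector<RememberedObject> objects;   // the pieces we actually recall
};

int main() {
    RememberedRoom room{4.0f, 5.0f, {
        {"sofa", "dark green", 0.5f, 4.0f, 2.0f},
        {"lamp", "brass",      3.5f, 0.5f, 0.3f},
    }};
    // "Redecorating" is just moving the symbols around, not repainting pixels.
    room.objects[0].x = 3.0f;
    return 0;
}
```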

As for the brain being an optimized system, well, I have to disagree. Have you tried to keep a coherent, continuous stream of thought on a single subject, without thinking of something else, for 5-10 minutes? It takes quite some concentration. It's called an attention span, which, by the last research I saw, is only about 6 seconds for the average internet user (the equivalent of a goldfish, as they say). The brain is neither optimized nor efficient, but it works pretty well.
Off topic but slightly related: has anyone else downloaded the Human Genome "chr#.fa" files and tried converting them to optimal binary storage (taking the ASCII A, C, G, T and converting each quartet into a byte)?

Try dumping the beginning of such a conversion out on the screen in ASCII, old-school DOS style. Now do the same thing for a .ZIP, .RAR or any other binary compressed data file.
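For anyone curious, here is a minimal sketch of the packing described above, with a stand-in string instead of the real chr#.fa contents: each base maps to two bits, so four bases fit in one byte.

```cpp
// Pack A/C/G/T into 2 bits each; one quartet of bases becomes one byte.
// Input is a stand-in string; real FASTA headers, N runs and newlines are
// simply skipped here.
#include <cstdio>
#include <string>
#include <vector>

static int baseToBits(char c) {
    switch (c) {
        case 'A': case 'a': return 0;
        case 'C': case 'c': return 1;
        case 'G': case 'g': return 2;
        case 'T': case 't': return 3;
        default:            return -1;   // anything else is ignored
    }
}

int main() {
    std::string bases = "ACGTACGTTTGA";   // stand-in for the .fa contents
    std::vector<unsigned char> packed;

    unsigned char current = 0;
    int count = 0;
    for (char c : bases) {
        int bits = baseToBits(c);
        if (bits < 0) continue;
        current = static_cast<unsigned char>((current << 2) | bits);
        if (++count == 4) {               // one quartet -> one byte
            packed.push_back(current);
            current = 0;
            count = 0;
        }
    }
    for (unsigned char b : packed) std::printf("%02X ", b);
    std::printf("\n");
    return 0;
}
```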
Hi.

Maybe you'll find the following link useful as well.
It is a paper that describes "A Theory of Visual Attention" and a mathematical implementation of TVA.

http://www.psy.ku.dk/cvc/TVA/Theory/index.htm

(best read with a freshly brewed cup of coffee...milk and sugar optional)
That study is stupid.

Of course our brains can anticipate appropriate words if we hear the syllable "can-". Even Google can do it; check out Google Suggest: http://www.google.com/webhp?complete=1&hl=en
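A toy sketch of the same kind of prefix completion, over a tiny made-up vocabulary:

```cpp
// Given a fragment like "can", list the known words it could become.
#include <cstdio>
#include <string>
#include <vector>

int main() {
    std::vector<std::string> vocabulary = {
        "candle", "candy", "canyon", "cannon", "cantaloupe", "dog", "brain"
    };
    std::string prefix = "can";

    for (const std::string& word : vocabulary)
        if (word.compare(0, prefix.size(), prefix) == 0)   // word starts with prefix
            std::printf("%s\n", word.c_str());
    return 0;
}
```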

And all the signals from our senses are transmitted to the brain as electric current, where individual electrons can be counted. Totally discrete.
Quote: Original post by Anonymous Poster
And all the signals from our senses are transmitted to the brain as electric current, where individual electrons can be counted. Totally discrete.


Wrong. Signals from our senses are sent to the brain as cascading shifts in the ionic concentrations across the axonal membrane, with a few purely chemical relays along the way. No single electrons are involved in the movement (only ions), and even then, the only movement they ever make is across the axonal membrane.

Also, it's wrong to say that if you can count the elements of an object, it must be discrete. Any analog object around (microphones, phonographs, oscilloscopes, wristwatches, CRTs) is made of a finite number of particles, and yet it is not discrete: in addition to the discreteness of particles, there is also the continuum of spatial positions.

The most interesting thing is that the human brain is microscopically continuous in its functioning, while the computer is, by design, microscopically discrete.

Before moving on, I want to mention that I am not using a concept of absolute continuity here, but rather of relative continuity. For instance, to the naked eye, a high-resolution digital photograph is continuous, but it is discrete if you look closer. A good definition is therefore that a function is continuous with regard to a means of observation if it does not appear discrete to the observer, as limited by the resolution of their senses.

Therefore, both brain and high-level computer operations can appear continuous to unequipped humans: we lack the capacity to distinguish pixels from a given distance, or to determine that sounds played by a computer are not real, just as much as we lack the ability to distinguish discrete steps in the evolution of someone's personality or behavior.

However, on a microscopic scale, the brain is still continuous, while computers are discrete: they handle a finite number of bits of information, each of which can be either on or off, and they only process them at fixed intervals of time. Brain cells rely very heavily on spatial and temporal continuity of the action potentials: while only a finite number of action potentials are moving around at any given moment, it is impossible to distinguish discrete steps in their spatial positions or their times of arrival.

On a fundamental scale, though, both space and time become discrete: objects only occur at discrete positions in space, and things only happen at discrete positions in time.

What is really important to us then? Obviously, we want computers to appear MORE continuous on a high-level scale. How? Discreteness is all about "holes" in the set of observed properties. For computers, there is no possible value between "0" and "1" for any given bit. However, it is possible, by adding more discrete data, to make the holes smaller. For instance, a fixed-point value of 32 bits only has holes of size 1/2^32 (about 1 in 4 billion) if used to represent numbers between 0 and 1. If the size of a hole becomes smaller than the precision of the observation tool, the function will appear continuous.
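A quick sketch of that hole-size arithmetic: representing the range [0, 1] with N fixed-point bits leaves gaps of 1/2^N between representable values, so every extra bit halves the hole.

```cpp
// Print the gap ("hole") between adjacent representable values for a few
// fixed-point widths covering the range [0, 1].
#include <cstdio>
#include <cmath>

int main() {
    int bitWidths[] = {8, 16, 32};
    for (int bits : bitWidths) {
        double hole = 1.0 / std::pow(2.0, bits);   // 1/2^N
        std::printf("%2d bits: hole size %.12g\n", bits, hole);
    }
    return 0;
}
```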

- Above 60Hz, the human eye cannot distinguish discreteness in movement. Computers can show us moving things!
- Below a hundredth of a millimeter, the human eye cannot distinguish discreteness in texture. With giant screens and high-resolution projectors, computers can really show us movies.
- Below (insert correct figure) Hz and Watt, the human ear cannot distinguish frequencies. Therefore, if the standard error in replaying a sound is higher in frequency and lower in power than these figures, we cannot distinguish a real sound from a computer-played one.

Due to the very low precision of our sensory organs, computers can ALREADY appear continuous to us, even though, with precise enough tools (computers, books, knowledge), we can determine that they are in fact discrete.

When will we be able to simulate human behavior, then? (Note, I do not mean simulating the brain, with artificial neurons and such; I only mean passing the Turing test on a significant scale.) Why is it so difficult to imitate a human? Remember what I said above about making discreteness appear continuous? Everything comes from making holes smaller. While holes in movement, color or sound are easy to measure and diminish, holes in personality, knowledge, reasoning or emotion are very hard to define, let alone isolate and shrink! Making computers act like humans would require us to first understand how humans react (not necessarily what their internal mechanism of reaction is: only external observations), and to find ways to measure, objectively and quantitatively, the difference between a human's reactions and computer-simulated ones. Trying to simulate human behavior without first being able to understand it is like a painter trying to produce a faithful likeness while blindfolded.

And this is where the main difficulty of this approach appears: by becoming able to measure the difference between simulated and natural behavior, we acquire a new observation tool, to which the computer's reactions will appear discrete, just like a modern-day camera can distinguish between real images and images on a monitor (the latter flicker, because the "simulation" is good enough for human eyes only). In the same way, the simulated human behavior would only appear continuous to humans, but we'd still know, when we're told, that everything in there is still discrete.

Of course, there are other approaches: one could create a human simulator iteratively, so that each step of the sequence would be able to tell that the next one is discrete, but only the last step would be known to humans, and to us it would appear continuous. But still, how do we evaluate, even with human means, that the sequence is "getting close"? Or we could dump the idea altogether and go with continuous, analog computers...

</rant>
I don't think it was a rant. I thought it was darn good. However, I am still skeptical that space is continuous. There are a lot of assumptions in this world we take for granted. Anyway, good post.
I believe that the world is predictable. Firstly, there is no such thing as random... Everything happens as a result of millions of little variables interacting with each other to create this result. So if you were somehow able to "save" the universe, or get the base variables from the time the universe was created, chuck them into your computer, and allow it to grow/evolve just as if it were the real world, based on the rules of the universe, then, if you got every little thing correct, it would turn out just as what has actually happened. Everything is just based on numbers, and even the human brain can be accurately represented; we just don't have the power yet to do so. So what I'm saying is that it would be hard to model just ONE brain... our computer brain will always be limited until we are able to set it up inside a virtual world it can explore. Pardon if my statement is a little obscure or foolish, as I haven't attended university, nor am I doing science in high school. But feedback is welcome.
What we do in life... Echoes in eternity

