
End of the world


@Green_Baron At least theoretically anything can be emulated. And anything can be represented in binary.


@nikito : A computer is, from a certain level of abstraction upwards, deterministic as long as it works without flaws; the brain is not. Or, at least, it is considered very unlikely that the brain is deterministic.

And as a world view, determinism such as the one expressed in that statement is a belief, not natural science. Modelling makes processes appear deterministic, but in reality uncertainties apply and set limits to the observability of physical processes. In that sense, deterministic descriptions carry an artefact in them. Don't get me wrong: the models are good enough, and some are near-ideal approximations! I am in no position to criticize anything natural science has produced!

NikiTo said:
Compilers are AGI, dude!!!

Interesting.

I maintain the toolchain used at my company to build the software used in self-driving cars. I can say with a great deal of confidence there is no general AI in either the toolchain (compiler, linker, assembler, etc) or in the self-driving car. Nor in computer vision or speech recognition or any of the other popular applications.

I am intimately familiar with the math behind how a compiler turns your source code into machine code. It's math, pure and simple, not generalized artificial intelligence. It has simple costing tables used to calculate register allocations and instruction choices, and it understands various superscalar pipelining architectures through the use of target-specific code fragments. It's just a bunch of pre-programmed rules written by humans and applied blindly by an algorithm.
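To make the “costing tables” part concrete, here is a deliberately toy sketch of the idea. The instruction names and cycle costs below are invented for illustration and don't come from any real backend; a real compiler does this over whole expression trees and register classes, but the principle is the same: look up costs, sum them, take the minimum.

```python
# Hypothetical cost-table-driven instruction selection for "x * constant".
# Instruction names and cycle costs are made up for illustration.

COST_TABLE = {
    "mul": 3,   # assumed cost of a hardware multiply
    "shl": 1,   # shift left (multiply by a power of two)
    "lea": 1,   # address-calculation trick (e.g. x + x*2 for "* 3")
}

def candidate_sequences(multiplier):
    """Return candidate instruction sequences that compute x * multiplier."""
    seqs = [["mul"]]                                   # always a valid choice
    if multiplier == 3:
        seqs.append(["lea"])                           # single address calculation
    if multiplier > 0 and multiplier & (multiplier - 1) == 0:
        seqs.append(["shl"])                           # power of two: one shift
    return seqs

def select(multiplier):
    """Pick the cheapest candidate according to the cost table."""
    return min(candidate_sequences(multiplier),
               key=lambda seq: sum(COST_TABLE[op] for op in seq))

print(select(8))   # ['shl'] -- cheaper than 'mul' under this table
print(select(3))   # ['lea']
print(select(7))   # ['mul'] -- no cheaper pattern matches
```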

It's possible, though unlikely, that a human could write better assembler code than the compiler emits, but it would probably take a lifetime for that human to do so for a generally useful application. Most programmers couldn't do it: they're cargo-cult programmers who followed an internet tutorial on how to use CSS to change the text on a screen.

Even today's neural networks are not anything like a general AI. They're a way of building a Bayesian decision table using programmed input (Bayesian statistical analysis was developed in the 1700s) and then using Markov chains to produce “artificial” output. This math is centuries old, and it's not general AI but math. Mathematics, as in Pythagoras, not AI as in the Terminator. The only thing new is the speed of the processing and the amount of information storage available at that speed. Modern technology is really, really good at doing simple dumb things really fast, and that makes it sufficiently advanced to be indistinguishable from magic/AI.
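For anyone who hasn't seen it spelled out, here is a toy sketch of the kind of Markov-chain generation being described: count word-to-word transitions in a tiny, made-up corpus, then sample “artificial” output from those counts. Nothing about it requires intelligence, just counting and random selection.

```python
import random
from collections import defaultdict

# First-order Markov chain over words: record which word follows which,
# then walk the chain to produce "artificial" output. The corpus is made up.
corpus = "the cat sat on the mat the dog sat on the rug".split()

transitions = defaultdict(list)
for current_word, next_word in zip(corpus, corpus[1:]):
    transitions[current_word].append(next_word)

random.seed(42)
word = "the"
output = [word]
for _ in range(8):
    successors = transitions.get(word)
    if not successors:                     # reached a word with no observed successor
        break
    word = random.choice(successors)       # duplicates in the list act as frequency weights
    output.append(word)

print(" ".join(output))
```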

Stephen M. Webb
Professional Free Software Developer

@Green_Baron I am not talking about a 16-bit PC or a 32-bit Mac here.

I am talking about the concepts. Theoretically, in a hardware-agnostic and programming-language-agnostic way, anything can be improved and anything can be emulated.

The implementation matters not.

It is a universal truth of data science that ANYTHING can be represented in binary. Binary is the mother of all data.

As soon as we demystify a “thing,” we can emulate it inside a program.
The only problem is that we don't know exactly how the brain works. And our scientists fail to agree on a single universal definition of general intelligence.

The trick many celebrities of science use to fool people is to say: “It is only a matter of numbers. As soon as we have a supercomputer with the same number of neurons as a human brain has, we will have AGI.”

Elephants have roughly three times as many neurons as humans. Crows, with their tiny, pitiful, silly brains, rank near the very top of animal intelligence. Intelligence is a mystery, but not a matter of numbers.

Don't forget that bacteria have been used to compute in parallel. Don't forget about the group intelligence of an ant colony. There are intelligent creatures in nature that lack a brain, without even a single neuron. There are single-celled organisms with eye-like structures inside them. A river, which is not alive, finds the easiest path to break through. Is pure water intelligent? It can find optimal paths.

There is more to intelligence out there than brains and neurons.

AGI is a highly philosophical field.

NikiTo said:
At least theoretically anything can be emulated. And anything can be represented in binary.

So, is it possible to represent the difference between a living person and their dead body moments later? It would be a simple equation. Since all binary representations are members of the set of counting numbers, and the set of counting numbers can be ontologically projected onto the set of integers, it's a simple subtraction, right? That would give you the binary identity of the soul (or its equivalent in your framework). So, in theory, all we need to do to create artificial life is figure out what that number is, and add it to something.

Easy peasy.

Unless, of course, your initial assumption is incorrect and it's not possible to represent everything in binary.

Stephen M. Webb
Professional Free Software Developer

@Bregma I was being highly sarcastic about compilers. It happens that whenever I open my mouth about ASM, everybody hits me in the face with: “I don't know you at all, I know nothing about you, but I know for absolutely certain that a compiler is more clever than you at writing ASM.” It is a little bit insulting…

I agree with you. Except for “sufficiently advanced to be indistinguishable from magic/AI”. I am not impressed by AI demos.

An “AI” “learns” to play the ping-pong game… A person can program an algorithm that plays ping pong in the best possible way using ray tracing and a bit of physics simulation, and it would literally destroy at ping pong one of these “AI”s that people pretend are so awesome.
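A sketch of what such a hand-written player could look like, assuming a made-up field size, paddle position and paddle speed: project the ball's velocity forward, mirror it off the top and bottom walls, and steer the paddle toward the predicted intercept.

```python
# Hand-coded Pong-style paddle: predict where the ball will cross the paddle's
# x position by reflecting its path off the walls, then move toward that point.
# Field dimensions and speeds are invented for illustration.

FIELD_HEIGHT = 1.0
PADDLE_X = 0.95

def predict_intercept_y(ball_x, ball_y, vel_x, vel_y):
    """Advance the ball to PADDLE_X, reflecting off y = 0 and y = FIELD_HEIGHT."""
    if vel_x <= 0:
        return FIELD_HEIGHT / 2            # ball moving away: wait at the center
    t = (PADDLE_X - ball_x) / vel_x        # time until the ball reaches the paddle plane
    y = ball_y + vel_y * t                 # position ignoring the walls
    period = 2 * FIELD_HEIGHT              # fold the position back into the field
    y = y % period
    if y > FIELD_HEIGHT:
        y = period - y
    return y

def paddle_velocity(paddle_y, target_y, max_speed=0.05):
    """Move toward the predicted intercept, capped at the paddle's max speed."""
    delta = target_y - paddle_y
    return max(-max_speed, min(max_speed, delta))

# Example frame: ball at (0.2, 0.3) moving right and up.
target = predict_intercept_y(0.2, 0.3, 0.4, 0.6)
print(round(target, 3), paddle_velocity(0.5, target))   # 0.575 0.05
```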


You can represent zero in binary, but zero does not exist.

Using binary I can write down “That person is dead now and what happens in his brain is only noise.” Actually, your comments and mine are stored in binary right now.

Your argument is invalid. Non-existing and purely theoretical concepts can be represented in binary too.
(Maybe you are thinking of binary as a single bit; no, I would say “represented in boolean” then. I mean any combination of any number of 0s and 1s. It is the basic foundation of data.)
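For example, here is that sentence, like any other data, reduced to a string of 0s and 1s through its UTF-8 bytes:

```python
# Encode a sentence as UTF-8 bytes and print it bit by bit.
sentence = "That person is dead now."
bits = "".join(f"{byte:08b}" for byte in sentence.encode("utf-8"))
print(bits[:32], "...", len(bits), "bits total")
```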

NikiTo said:
Using binary I can write down “That person is dead now and what happens in his brain is only noise.”

You have mistaken the map for the territory.

Stephen M. Webb
Professional Free Software Developer

@nikito : The use of body chemistry by ants has absolutely nothing to do with intelligence; it is not even tool use. It is evolutionary adaptation, with no cognitive process involved, like grass incorporating silicate grains to withstand grazing by horses. Individuals who did it had a better chance to proliferate, and so the trait was promoted over the generations. It is evolution. Likewise, the ad hoc use of, e.g., a wooden pick by a raven to scratch its feathers is not considered tool use or an intelligent application. Only when it goes somewhere, bends the stick, and then returns to poke into the bark for insects can we speak of intelligence.

Modelling is not defining a struct and filling it with data, or programming in general at all; that is playing at the computer, and yes, there you can abstract anything away (“away” is important here) and turn it into 0s and 1s. That works because the computer is a deterministic machine as long as it doesn't give in to entropy, but by no means is that a description of a natural process like, e.g., a rain shower. To describe that you must step away from determinism and introduce probabilities. Quantizing these in a program is always a loss of accuracy and sets limits to the model and its validity. Though it works well enough for daily use.
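As a toy illustration of that difference, with invented numbers: a deterministic model of a rain shower gives one fixed answer, while a probabilistic model draws each hour from a distribution and only matches the deterministic figure on average.

```python
import random

random.seed(1)

HOURS = 10_000
MEAN_MM_PER_HOUR = 2.0     # invented figure

# Deterministic description: every hour delivers exactly the mean.
deterministic_total = MEAN_MM_PER_HOUR * HOURS

# Probabilistic description: each hour's rainfall is a random draw with the same mean.
stochastic_total = sum(random.expovariate(1.0 / MEAN_MM_PER_HOUR) for _ in range(HOURS))

print(deterministic_total)          # exactly 20000.0
print(round(stochastic_total, 1))   # close to 20000, but not exactly
```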

Calin said:

@joej you have to be balanced about it; the approach you're talking about is like taking a chunk of raw data (bits) and trying to figure out, without any clues, whether the code is for 16-bit machines or 32-bit machines, which operating system it targets, and other such details.

What do you mean? The approach of brain modelling to achieve AGI? I propose you confront the scientists who work on this and tell them they have no clue what they are doing or how brains work at all.

But be prepared to make a proposal on how to do better. And I'd like to hear that myself as well.

Notice that we even lack a definition of intelligence, consciousness, reasoning, etc. This should give you a hint of what AGI could eventually mean, or not. You won't get any insights from watching action movies like Terminator, where things appear as just given and so easy.

Calin said:
JoeJ and NikiTo stay rational.

All my points were rational. Feel free to ask if you missed something, but there is no need to give orders.

NikiTo said:
purely theoretical concepts can be represented in binary too

Are you familiar with the set of real numbers? It is not possible to represent the set of real numbers using binary, assuming that by binary you mean the kind of representation a digital computer can deal with. Now, one could argue that at a theoretical level nothing in our perceptive universe needs real numbers, since at some quantum level everything is discrete, but they sure are a useful functional model of the universe, and their manipulation (things like calculus, trigonometry and other transcendental functions, statistical analysis, rocket science, and so forth) has proven to be awfully productive. Besides, I would introduce you to the concept of a circle. I see a bubble in my coffee; it is a circle. You cannot represent that accurately in a computer, you can only approximate it. Just as a dead body is approximately alive, within a certain accuracy.

If you think the floating-point representation used by modern digital computers to approximate real numbers is good enough, you're one of the fools I deal with daily.
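A quick demonstration of the approximation in question, using ordinary double-precision floating point:

```python
import math

# Common decimal fractions have no exact binary floating-point representation,
# and even pi is stored only as the nearest representable double.
print(0.1 + 0.2)             # 0.30000000000000004
print(0.1 + 0.2 == 0.3)      # False
print(math.pi)               # 3.141592653589793 -- a rational approximation of pi
```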

My point is that there is an infinite amount of real stuff in the universe that cannot be represented using binary (or, more accurately, using a digital computer).

Stephen M. Webb
Professional Free Software Developer

