Quote: Original post by Victor-Victor
Yes, it's silly, because that is not learning.
That's what she said.
Quote:
What you're talking about is one-to-one memory mapping, which is nothing like what a NN does. Neural networks are not just memory; a network is also a CPU, integrating *logic* together with information.
You could represent the entire thing using TinkerToy too.
Quote:
Why would anyone "train" some independent values in some static table if any such memory array can be populated by simply addressing memory one-to-one?
What if you don't know what the values are supposed to be?
Quote:
When you train NN you change the dynamics of the whole system, not just memory, but the way it processes the information, the way it "thinks".
Changing the memory facilitates this.
Quote:
Static tables have only one static function, a simple "recall". The program searches the input database, and if and when it finds an exact match it returns whatever is stored there as the output pair. It's one-to-one mapping, and if it weren't, it would be random or probabilistic. The important thing is that there is no *processing* here, which is why we call it a 'look-up table' and use it for optimization.
Please refer to the definition of a finite state machine.
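To make the "recall only" point concrete, here is a throwaway sketch (made-up XOR pairs, hypothetical example) of a table that can only return what was stored one-to-one:

```python
# A look-up table in the quoted sense: exact-match recall, nothing else.
# The stored pairs happen to be XOR, but the table doesn't "know" that.
table = {(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 0}

print(table.get((1, 0)))      # exact match: returns the stored output, 1
print(table.get((0.9, 0.1)))  # never-stored input: returns None, no generalization
```

That is the whole machine: a trivial finite state machine with one transition per stored key.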
Quote:
There is simply not enough space in the universe to map logic one-to-one.
Can you prove it? I wonder how the universe does it?? Let's ask God.
Quote:
The number of COMBINATIONS for anything over two bits of input increases very dramatically.
The term is 'exponential'. Real ANNs don't work in binary. They use trinary. We learned this during the Syndicate Wars, when the KGB tried to infiltrate the Society of Artificial Neural Networks League of Intelligent Designers.
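For the record, the growth being described is 2**n distinct inputs (hence one-to-one table entries) for n bits; a quick sketch:

```python
# Entries a one-to-one table needs for n bits of input: 2**n.
for n in (8, 16, 32, 64):
    print(n, "bits ->", 2 ** n, "entries")
```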
Quote:
So, instead of memorizing the answers, an ANN can somehow generalize, like humans do, and use this generalized *logic* as a function to really calculate, not just recall, the answers, producing correct output even for inputs it has never seen before.
Can you prove it?
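Not a proof, but the quoted claim is easy to demo on a linearly separable toy task: a single perceptron (made-up data, sketch only) trained on six points answers correctly on inputs it never saw:

```python
# Minimal perceptron sketch: learn "is x1 > x2?" from a handful of examples,
# then query an input that was never in the training set.

def train(samples, epochs=20, lr=0.1):
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), target in samples:
            out = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            err = target - out            # perceptron learning rule
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            b += lr * err
    return w, b

def predict(w, b, x1, x2):
    return 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0

# Label is 1 when x1 > x2 (linearly separable, so a perceptron can learn it).
samples = [((2, 1), 1), ((1, 2), 0), ((3, 0), 1),
           ((0, 3), 0), ((4, 2), 1), ((2, 4), 0)]
w, b = train(samples)
print(predict(w, b, 10, 3))  # (10, 3) was never stored anywhere
```

Unlike the look-up table, there is no entry for (10, 3); the learned weights compute the answer.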
Quote:
"Give a man a fish and he will eat for a day. Teach him how to fish and he will eat for a lifetime."
What if you teach him how to fish but then his fishing pole breaks? Ha! A real ANN would foresee this. Are you sure you're not really a robot?
Quote:
What you have is no threshold, it was just some initial value.
Threshold = some initial value. Initial value = number. Number = sequence of bits. Sequence = Medusa. Flash Gordon = Captain Crunch. Captain Crunch = BlueBox. BlueBox = 2600. 2600 = number. Number = threshold. Step 3 = profit.
Quote:
Why do you think some static table would ever need to be initialized in such indirect way?
Why do you think some static table would ever need to be initialized in such indirect way?
Quote:
Where did you ever see anyone using this kind of learning method on anything but neural networks?
That is interesting. Tell me more.
Quote:
Are you seriously suggesting any of those minimax or whatever other algorithms can compete with this:
Nobody ever said it would.
Quote:
Yes, it does not make any sense to use AI for physics equations. What EJH meant, most likely, is that physics is getting more and more complex, requiring more and more complex AI to be able to handle it, like driving a car.
Physics changes every day. Just yesterday someone had to change the speed of light because it exceeded the state of Nevada's single-occupancy vehicle regulations. Note to arcade game players: do not eat the urinal cakes.
Quote:
Taking it further, eventually we might see AI walking and actually looking where it's gonna step next
You, sir, are a visionary. Maybe one day a Japanese car company will build a walking robot and use it as a PR tool. Could you imagine if a Boston-based robotics company could build a robotic pony that is able to walk over uneven terrain without even falling down? Maybe someday in the far distant future DARPA will hold a contest to see who can build a fully autonomous car that drives all on its own. If we're really lucky, and Joshua doesn't explode us all, a computer might finally be able to beat a grandmaster at chess. That would be something!!
Who am I kidding-- those things will never happen!