Quote: Original post by daviangel
I don't even know if a Base 10 Digital Calculator (non-biological/mechanical) exists?
Yes, Dekatron-based machines were popular for a period.
Programming changed? Forever?
[Website] [+++ Divide By Cucumber Error. Please Reinstall Universe And Reboot +++]
Quote: Original post by Programmer One
Quote: Original post by way2lazy2care
I think if anything is going to "change programming forever" it's going to be the shift from low-level programming languages to high-level programming languages.
For the most part, this has already happened...what are you talking about?
I mean, it's going to keep shifting more and more that way. What we see as high level today is going to be the low level of tomorrow.
If you look at some drag-and-drop programming interfaces like Kodu or Kismet in UDK (sorry, I don't know of many more because I haven't been interested in them :-/), you'll get the idea of what I mean: you no longer worry about really any code at all.
While there are tools that have systems like this, I wouldn't really call them mainstream across all programming. I think it's going to start moving that way more and more.
Couldn't memristors be used to create an artificial neural network that would rival the complexity of the human brain? They seem very well suited, since they could calculate floating-point values using variable current in real time instead of doing binary calculations one at a time over many CPU cycles.
A normal binary computer with an ANN coprocessor chip would have all the advantages of traditional computing plus AI capabilities that could easily rival a mouse or cat.
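For concreteness: the analog operation a memristor crossbar would perform (currents summing along a wire) is essentially the weighted sum at the heart of every ANN layer. Here's a minimal digital sketch of that same math; Python, layer sizes and weights are all just illustrative choices, not anything memristor-specific.

from math import tanh

def ann_layer(inputs, weights, biases):
    """One ANN layer: a weighted sum per neuron, followed by a squashing function."""
    outputs = []
    for neuron_weights, bias in zip(weights, biases):
        # In a memristor crossbar this sum would happen in parallel, as currents
        # adding up along a wire; here it is one multiply-add per CPU step.
        activation = sum(w * x for w, x in zip(neuron_weights, inputs))
        outputs.append(tanh(activation + bias))
    return outputs

# Three inputs feeding two neurons; each weight would be one memristor's conductance.
print(ann_layer([0.5, -1.0, 0.25],
                [[0.1, 0.4, -0.2], [0.7, -0.3, 0.5]],
                [0.0, 0.1]))

The appeal of the crossbar idea is that all those multiply-accumulates would happen at once as physics, rather than one at a time over many CPU cycles.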
Quote: Original post by Kaze
Couldn't memristors be used to create an artificial neural network that would rival the complexity of the human brain? They seem very well suited, since they could calculate floating-point values using variable current in real time instead of doing binary calculations one at a time over many CPU cycles.
A normal binary computer with an ANN coprocessor chip would have all the advantages of traditional computing plus AI capabilities that could easily rival a mouse or cat.
As far as I know, a normal human brain can't do that. The normal human brain is actually supposed to be a pretty bad computer, IIRC. It's just a really good rememberer.
Quote: Original post by way2lazy2care
Quote: Original post by Kaze
Couldn't memristors be used to create an artificial neural network that would rival the complexity of the human brain? They seem very well suited, since they could calculate floating-point values using variable current in real time instead of doing binary calculations one at a time over many CPU cycles.
A normal binary computer with an ANN coprocessor chip would have all the advantages of traditional computing plus AI capabilities that could easily rival a mouse or cat.
As far as I know, a normal human brain can't do that. The normal human brain is actually supposed to be a pretty bad computer, IIRC. It's just a really good rememberer.
A brain is a terrible computer for binary-logic tasks like calculating pi to a thousand places, but great for fuzzy-logic tasks like pattern recognition.
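Just to underline how trivial that first task is for a conventional machine, here's a rough sketch that gets a thousand decimal places of pi out of Machin's formula and Python's built-in big integers; the guard-digit count is an arbitrary illustrative choice.

def arctan_recip(x, digits):
    """arctan(1/x), scaled by 10**(digits+10), via the Taylor series."""
    scale = 10 ** (digits + 10)      # ten guard digits absorb rounding error
    term = scale // x
    total, n, sign = term, 1, 1
    while term:
        term //= x * x
        n += 2
        sign = -sign
        total += sign * (term // n)
    return total

def pi_digits(digits=1000):
    """Machin's formula: pi = 16*arctan(1/5) - 4*arctan(1/239)."""
    value = 16 * arctan_recip(5, digits) - 4 * arctan_recip(239, digits)
    return value // 10 ** 10         # drop the guard digits

print(pi_digits())                   # 31415926535897932384626433832795... (decimal point omitted)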
Strangely enough, I just started reading Clarke/McDowell's "The Trigger" and they talk a bit about "solid-state memories".
I like the Walrus best.
Quote: Original post by mikeman
Or those "augmented reality" things, which you supposedly turn toward the Eiffel Tower and you get "layers of information"? As if you can't just google "Eiffel Tower" while you're on-site anyway? It's a bit silly.
Dunno... Can see some useful applications. People said the same thing about the mobile phone when it came out, and look where we are now.
I don't think changing the base would change the way we program. We already mostly program around base 8 and 16 anyway. Binary is just the mechanical part. Maybe they'll design logic circuits making use of base 3 or above for arithmetic, but the improvement will be in efficiency, size, and speed; it's not gonna change programming. Maybe electronics and circuit design.
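To illustrate: the base is only a representation detail at the edges of a program, and the arithmetic in between doesn't care. A rough Python sketch (the base-3 helper is made up here only because there's no built-in formatter for it):

def to_base(n, base):
    """Render a non-negative integer as a digit string in the given base."""
    if n == 0:
        return "0"
    digits = "0123456789abcdefghijklmnopqrstuvwxyz"
    out = []
    while n:
        n, r = divmod(n, base)
        out.append(digits[r])
    return "".join(reversed(out))

value = 2011
print(bin(value), oct(value), hex(value))   # 0b11111011011 0o3733 0x7db
print(to_base(value, 3))                    # 2202111 -- same value, base 3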
Programming electronic brains with fuzzy logic is another matter. But we're already doing that (neural networks). How the neural network circuitry is implemented is irrelevant.
Everything is better with Metal.
Quote: Original post by Kaze
Couldn't memristors be used to create an artificial neural network that would rival the complexity of the human brain? They seem very well suited, since they could calculate floating-point values using variable current in real time instead of doing binary calculations one at a time over many CPU cycles.
A normal binary computer with an ANN coprocessor chip would have all the advantages of traditional computing plus AI capabilities that could easily rival a mouse or cat.
What you're describing is just analog circuits. An op-amp is basically just a voltage multiplier.
Analog has disadvantages when you want to perform precise arithmetic: noise, errors creeping into the signal, etc. You need a way to quantise the signal, which basically falls back to converting it to digital.
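A toy sketch of that quantisation point, with made-up noise figures: every analog stage smears the value a little, and snapping it back onto a fixed set of levels is essentially what an ADC does, which means you're back in digital territory.

import random

def add_noise(value, noise=0.02):
    """Model one analog stage adding a little random error (illustrative magnitude)."""
    return value + random.uniform(-noise, noise)

def quantise(value, levels=16, full_scale=1.0):
    """Snap a value in [0, full_scale] onto one of `levels` evenly spaced steps."""
    step = full_scale / (levels - 1)
    return round(value / step) * step

signal = 0.5
for _ in range(10):
    signal = add_noise(signal)        # error accumulates stage after stage
print("after 10 analog stages:", signal)
print("snapped back to 16 levels:", quantise(signal))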
Everything is better with Metal.
Quote: Original post by Programmer One
Quote: Original post by way2lazy2care
I think if anything is going to "change programming forever" it's going to be the shift from low-level programming languages to high-level programming languages.
For the most part, this has already happened...what are you talking about?
Not really, though. We only think this is the case because we've been told that things like C++, C#, or Java are high-level languages, but it's all relative. On the scale ranging from 'manipulate these bits at the gate level' all the way up to 'hey computer! make me a program that does X, Y, and Z', we are decidedly much nearer to the former. We still describe programs almost exclusively as a sequence of distinct and atomic steps to produce the desired result; we used to do this with switches, then punch cards, then assembly language, and now we do it with the for-each loop.
Functional programming languages, for example, are much higher-level than the C-likes, because the emphasis is on data transformations as compositions of higher-order functions, and there is an established algebra for manipulating and optimizing those compositions. Unfortunately it seems never to have caught on outside a niche group of users, and people seem unwilling or unable to make the mental leap to functional programming for the most part.
Declarative programming is sort of the ultimate thing on the horizon, but that is a very distant horizon for general-purpose tasks so it is limited today to very simple things.
Functional/declarative-ish features are becoming more and more mainstream, however. Lambdas in C++, LINQ in .NET, and SQL are good examples of this trend. Over time, languages are moving along the path towards "here's your input, here's the output I want, I don't care how you get from A to B", but even with all the advancements we've made, we're still relatively near to where we started.
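As a small illustration of that trend, here is the same task written as an explicit loop and as a composition of higher-order functions. Python is used purely for brevity; the LINQ or C++ lambda versions would read much the same.

data = [3, -1, 4, -1, 5, -9, 2, 6]

# Imperative: spell out how to walk the list and accumulate the result.
total = 0
for x in data:
    if x > 0:
        total += x * x
print(total)                                                     # 90

# Functional-ish: describe the transformation, not the loop.
print(sum(map(lambda x: x * x, filter(lambda x: x > 0, data))))  # 90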
throw table_exception("(╯°□°)╯ ︵ ┻━┻");