Your View on Programming
@squared'D: J.A.R.V.I.S. is what the acronym stands for, and it's not a programming language. I know they haven't gotten that close to imitating the movie, but the fact that they've gone that far on their own funding is amazing.
I clearly stated I am still learning, so I could not do much about this yet.
@chaosengine: if (when) I want to do this, I probably won't be doing it alone.
UNREAL ENGINE 4:
Total LOC: ~3M Lines
Total Languages: ~32
--
GREAT QUOTES:
I can do ALL things through Christ - Jesus Christ
--
Logic will get you from A-Z, imagination gets you everywhere - Albert Einstein
--
The problems of the world cannot be solved by skeptics or cynics whose horizons are limited by the obvious realities. - John F. Kennedy
With almost 7 billion people living on Earth at this moment, I think pretty much every viable idea has already been taken, even before it is born in your head.
I had the idea that we could use evolutionary algorithms in machine design, where the shape itself is generated, or at least guided, by evolution. Already done before my birth.
I had some other ideas that are already taken, but I can't recall them at the moment.
I know too well the feeling of being high on an idea of my own. So high that I feel the world is mine, I will be on TV, I will be the next [insert popular scientist here]. Then the realization comes that it was already done, or that it's not that good an idea at all.
If you are making threads like this on a forum like this, chances are very high that you are not the next [insert genius programmer/inventor name here].
If you were the ONE, you would be too busy to spend any time reading and writing BS on forums.
Thanks.
I believe the technical term for ideas of this ilk is "pipe dream".
Not a pipe dream but a kindergarten dream; I feel a sudden (and rapid) 10x downgrade in my personal intelligence when reading this.
I remember my first post on this site. It was about modifying a game engine. I needed it for a game I wanted to make, but I changed my mind and decided to build it from scratch. I knew that the game assets would increase the time spent making a game, and I got an idea: a tool that enabled you to create any asset with mouse clicks. I protected this idea and saw it as great. I posted something about it on this forum but didn't go into detail on its functionality. It was going to be the best tool for 3D work, and nothing like it existed anywhere. People here told me I was being unrealistic and that it wasn't possible. That was until I was given a link to 3sweep and, to my greatest shock, saw my pipe dream in full detail. It did exactly what I had in mind.
I've had ideas, many ideas. That's why I asked the question about the possibility of J.A.R.V.I.S. Fir said it is a kindergarten dream, but Google has proven every "not possible" pipe dream wrong. J.A.R.V.I.S. isn't just possible; it is in version 0.2. The team of fewer than 20 people, mostly below the age of 23, making Jarvis OS proves that. Yet another pipe dream made reality. Their team isn't nearly as large as Microsoft's, but they've done what Microsoft hasn't and couldn't, and they weren't discouraged by the fact that no tech giant, including Microsoft and Apple, had done it, that it was seen as a pipe dream, or that their team was small. They designed a whole new OS architecture, different from what exists. Jarvis Corp.
I wish I had started programming earlier; I would have been the one who made those. This idea is one of the only ones I've had that Google hasn't shown already exists. I don't see my ideas as hard; in fact, they are easy, because people with more experience are making my ideas real. A month ago I would not have shared this much detail about an idea, because I didn't want it to be stolen, but now I am. I'm waiting for the article or link showing this current idea in production, and hoping no one is doing it in secrecy.
IMO the trouble here is that you behave like a 13-year-old (the conceited kind).
You don't know anything, yet you like to talk all the time about the great things (1000x greater than the real people here) you can do.
- IMO you are not primarily interested in programming but in boasting about how good you 'can' be (anyone can do that, but most people find it too silly to actually do it).
- The problem is not that you dream about how good you can be, but that you publish (and try to impose) this highly questionable 'truth' as if it were probable.
If something is strictly possible but highly improbable (let's say the juice carton on my table will become president of the USA), discussing it the way you do is silly; more than silly, it is some kind of logical crime.
What if you waste many hours of people's time trying to put highly questionable things into their heads (making fools of them), and then it becomes clear that none of it happened? Will you pay them back for the unkept promises and all the time wasted?
Right now you're only good at one thing: turning this forum from real coding topics (where you need experience and intelligence) into some kind of trash talking.
As diverse as all of the programming languages are, they all have one thing in common: They're all here to solve a particular problem.
In other words, they'll always be structured to abstract what the underlying "machine" is capable of doing, be it some form of virtual machine or hardware.
I'd look forward to a language that implicitly handles threads: something that compiles your program and figures out, at an intermediate stage, the best way to distribute the workload across multiple cores, so you don't have to break your head over threading models and can just focus on your program.
On that note, I'd like to see a language that's more multitasking-oriented. We are, after all, in a world where hardware is rapidly expanding the number of cores and threads, rather than increasing linear processing speed as it did 10 years ago.
I can also imagine a variety of new languages popping up to handle quantum computer hardware, when that finally makes it to the commercial scene.
I think the asynchronous approach used by C# and a few other languages (async/await) might actually be a first step towards this. What it does is basically let the programmer specify data dependencies (in the sense that the programmer *has to* because the syntax makes it quite obvious that you have to wait for the results of a request or computation to actually arrive before using them), and when it encounters one, it backtracks up the call stack recursively finding any remaining work that can be done without pending dependencies. Then as soon as a task (or a "future" in some languages, if you prefer) is complete, execution down that code path can continue. If properly coded it would seem like a program would be limited only by waiting for any of a set of individual tasks to complete, some of which will be I/O bound and others CPU-bound - and they can be concurrently executed in a scalable fashion. As long as the program doesn't have insane dependency relationships, that is. This is not new, of course, but I've always found support for this kind of data processing paradigm lacking in traditional imperative or object-oriented languages, where you sort of have to roll your own without a proper framework to build on top of.
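The data-dependency style described above can be sketched with Python's asyncio rather than C#'s async/await. The function names (fetch_user, fetch_orders, render_page) and the simulated delays are invented purely for illustration:

```python
import asyncio

async def fetch_user(user_id: int) -> dict:
    await asyncio.sleep(0.1)          # stands in for an I/O-bound request
    return {"id": user_id, "name": "Ada"}

async def fetch_orders(user_id: int) -> list:
    await asyncio.sleep(0.1)          # an independent request; can overlap
    return [{"order": 1}, {"order": 2}]

async def render_page(user_id: int) -> str:
    # The two coroutines below are independent, so gather() runs them
    # concurrently; execution resumes only once both results arrive.
    # The await is exactly the explicit data dependency described above.
    user, orders = await asyncio.gather(
        fetch_user(user_id),
        fetch_orders(user_id),
    )
    return f"{user['name']} has {len(orders)} orders"

print(asyncio.run(render_page(42)))   # -> Ada has 2 orders
```

The two fetches complete in roughly one sleep interval rather than two, because nothing forces them to wait on each other until the gather() point.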
The Go concurrency model (and related) is interesting as well, but I still find those rather unwieldy and arcane (though I should probably just use them more instead of complaining).
“If I understand the standard right it is legal and safe to do this but the resulting value could be anything.”
LabVIEW has already been mentioned. Though it won't prevent you from making all the typical lame threading mistakes, it does make multithreading much easier. Threading is something I wouldn't even have considered learning before getting into LabVIEW, but multithreading is pretty much a default thing in the applications I have to make (autonomous high-frequency tests with a pretty heavy eye-candy GUI on top and very important logging underneath, all run on a regular PC, not a real-time PC).
LabVIEW is pretty good at that. It sucks in other important fields, though.
As I said, I'm not an expert in multithreading, but the importance of data dependencies has been mentioned in this thread.
Since LabVIEW is a dataflow-oriented "language", it is all about data dependency. A program block is not executed until all of the block's inputs have been calculated/"appear", but as soon as the data is available, the block is executed "asynchronously" (not in a particular order). Of course, the block diagram is compiled and the blocks are executed in some order, but the possibility is there to force particular blocks to run in different threads. Loops that don't depend on each other's dataflow run in different threads by default (and can be forced to run on different cores). Communication between the parallel loops is done with the common methods, with their common pitfalls, but it's much easier to handle. I usually just use notifiers and queues in my current projects, which are very easy to use and work pretty well.
I hope I didn't make too many mistakes in my quick explanation.
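The fire-when-inputs-are-ready rule described above can be imitated outside LabVIEW. Here is a toy Python sketch; the scheduler, the graph format, and the node names are all made up for illustration and are not LabVIEW's actual semantics:

```python
from concurrent.futures import ThreadPoolExecutor

def run_dataflow(graph, sources):
    """graph: {node: (func, [input node names])}; sources: {node: value}."""
    values = dict(sources)
    pending = dict(graph)
    with ThreadPoolExecutor() as pool:
        while pending:
            # every block whose inputs have all "appeared" fires now
            ready = {n: (f, deps) for n, (f, deps) in pending.items()
                     if all(d in values for d in deps)}
            futures = {n: pool.submit(f, *[values[d] for d in deps])
                       for n, (f, deps) in ready.items()}
            for n, fut in futures.items():
                values[n] = fut.result()
                del pending[n]
    return values

# "double" and "square" have no dataflow between them, so they run in
# parallel threads; "total" waits on both, like wires joining in a diagram.
result = run_dataflow(
    {"double": (lambda x: 2 * x, ["a"]),
     "square": (lambda x: x * x, ["a"]),
     "total":  (lambda d, s: d + s, ["double", "square"])},
    {"a": 3},
)
print(result["total"])  # -> 15 (2*3 + 3*3)
```

The sketch assumes an acyclic graph; a real engine would also detect cycles and schedule across cores, which this deliberately omits.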
I can envision touch and speech being used more and more; it makes sense. For example, let's say you could just say into your mic, 'I want a 512x512 dialog box with xx buttons, and I want you to create an MS SQL db and connect it to that.'
Well, you get the idea. Speech is a very weak area as far as I can see. There is something in this; somebody will work it out.
Yes! That is the keyword "connect." That is why I find gamepress so brilliant. It makes logical sense, and you can see the flow of the program quite literally.
Perhaps such a system would have some language like C++ underneath at first, and then someone might come along and create a whole new node-based language.
You can see how the data flows (I guess that's the reason they call it flow-based).
Input-output. On-off. True-false. Those are the basic terms of logic, and node-based programming, like a logic board, is, I think, the way programming should have been, and perhaps will be.
They call me the Tutorial Doctor.
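The logic-board idea above can be sketched in a few lines of Python. The Node class and gate helpers are hypothetical, just to show how wiring nodes together reads like tracing a circuit:

```python
class Node:
    """A node whose output is computed from the outputs of its input nodes."""
    def __init__(self, fn, *inputs):
        self.fn, self.inputs = fn, inputs
    def value(self):
        return self.fn(*(n.value() for n in self.inputs))

def const(v):          # a source node: On/Off, True/False
    return Node(lambda: v)

AND = lambda a, b: Node(lambda x, y: x and y, a, b)
OR  = lambda a, b: Node(lambda x, y: x or y, a, b)
NOT = lambda a:    Node(lambda x: not x, a)

# Wire up (True AND (NOT False)) OR False; the construction itself
# shows the flow, like following wires on a logic board.
board = OR(AND(const(True), NOT(const(False))), const(False))
print(board.value())   # -> True
```

A visual node editor would essentially draw this same object graph and let you rewire it with the mouse instead of with code.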
(++2*2)^2 to both of you.
That's what I meant.
Programming with complete and utter ease. (I prefer visual/visual-based to node/node-based.)
Speech version: Jim, perform a system update, make me a 3D game engine, and while you're at it, hack my phone and present all loopholes in tabular format.
Touch version: you move the parts of the underlying game engine around.
Just imagining :).
I've been a professional developer since I left university 11 years ago. Back then, people were asking "what will programming look like in 10 years?" And in those 10 years, programming hasn't changed much. Game development is still largely done in C++; application development may have shifted a lot more to .NET/Java, and there is much more focus on web-based applications, but the actual coding is very similar. We haven't all moved to using visual tools instead of writing code; hell, we still write raw SQL queries rather than having it all abstracted out of sight.
So I'd be pretty cautious about making any predictions of big changes in the next ten years.
If you are using .NET then you are unlikely to have a need for raw SQL queries.
I'd say programming is changing quite a bit, especially in my field of scientific computing. Statically compiled libraries are quickly losing ground to specialized JIT compilers for various programming domains (there's lots of activity there in the Python ecosystem). Language interop is becoming the norm, even for small projects.
I once believed in 'one language to rule them all', but now I think seamless interop is much more important for the future. In some contexts, functional semantics are great; in others, nothing but a pain in the ass. In some contexts, dynamic programming techniques are a ridiculous slowdown; in others, a very useful feature.
Rather than cramming all that into one language, which is never going to happen elegantly, I'd say it's all about the interop. Python and .NET have the right idea here.
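As a small illustration of that kind of interop, here is Python driving a C library through the standard ctypes module (.NET's analogue would be P/Invoke). This assumes a Unix-like system where find_library can locate libc; it is a sketch of the idea, not any specific stack from the thread:

```python
import ctypes
import ctypes.util

# Load the C standard library and call one of its functions directly.
libc = ctypes.CDLL(ctypes.util.find_library("c"))

# Declare the C signature so ctypes marshals arguments correctly:
# size_t strlen(const char *s);
libc.strlen.argtypes = [ctypes.c_char_p]
libc.strlen.restype = ctypes.c_size_t

print(libc.strlen(b"interop"))  # -> 7
```

The point is that neither side had to be rewritten: the dynamic language borrows the compiled library's speed, and each context keeps the semantics that suit it.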
Is "never" a possibility in tech? ("Rather than cramming that all into one language, which is never going to happen elegantly")