What if the vast majority of computers in the world did not run Microsoft products? Oh, that's already the case? Hmm, who knew my phone was incapable of running Word?
If DirectX becomes the only standard then we'll be looking at phones that require a gig of memory, a few hundred gigs of mechanical hard drive and a 17" display so they can accommodate Vista.
If Microsoft adopted OpenGL as its new standard it would be MS OpenGL, which would be incompatible with the Khronos Group's standard OpenGL in the same way that no two versions of the Khronos Group's OpenGL are compatible with each other.
Committees. I hate those guys.
API War
Quote: Original post by Jroggy
They drop Direct3D for something better every version. A game is not a D3D game as much as it's a D3D8, D3D7, D3D9, or D3D10 game.
I don't see Microsoft ever dropping DirectX for something "better". So many games use DirectX (specifically Direct3D), they'd be shooting themselves in the foot.
The versions are separate entities, as opposed to a tangled mess of add-ons. Functions are changed, and obsolete features are dropped like a bad habit.
Direct3D remains, but several other components of DirectX have absolutely been dropped. DirectPlay and DirectDraw are gone in modern DX, IIRC. And isn't DirectSound either gone or being phased out?
Quote: Original post by Sappharos
Quote: Original post by Codeka
You may as well wonder whether Microsoft would ever abandon Windows for something "better".
There are actually rumours along those lines, but I think they're baseless. Apparently Microsoft is developing a new OS called "Singularity", and some people think it will eventually replace Windows.
Anyway, does this mean that both OpenGL and DirectX will still be running strong in 15 or so years? If so, then that's good to know, because I'm a bit afraid of spending years learning one of them and then it becoming obsolete...
Singularity will not replace Windows. It is a research operating system, which means new ideas will be tested in Singularity, and maybe incorporated in a more mature form into Windows.
[Formerly "capn_midnight". See some of my projects. Find me on twitter tumblr G+ Github.]
Quote: Original post by Sappharos
1. Microsoft decides to move over to something new, and discontinues DirectX. The new API fails to get as much support as expected, and OpenGL eventually takes over completely.
2. Microsoft includes a new feature in DirectX which becomes very popular (something like SSAO, but even more spectacular). OpenGL can't compete, and falls behind.
3. Next-generation video cards turn off the CPU, load game data from the hard disk (or rather a flash disk) into their own memory, capture keyboard, mouse and joystick input...
What the hell difference does it make to the user how it works? [smile]
I understand the name of the Microsoft project is Midori and it was in the news last week because Microsoft hired an expert away from Sun (Microkernel expert Shapiro to join Microsoft Midori effort).
"I thought what I'd do was, I'd pretend I was one of those deaf-mutes." - the Laughing Man
Quote: Original post by Krokhin
3. Next-generation video cards turn off the CPU, load game data from the hard disk (or rather a flash disk) into their own memory, capture keyboard, mouse and joystick input...
What the hell difference does it make to the user how it works? [smile]
Aren't CPUs better for most tasks?
How about this for next-gen...
1. 3D goggles that display two slightly different images, one for each eye (a rough sketch of the idea follows this list).
2. A resolution so high you can't see the individual pixels.
3. A frame rate so high it doesn't cause eye-strain.
4. 64-bit colour, so that you can see a better range of colours.
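To put item 1 in concrete terms, here is a minimal sketch in plain C++ of how a renderer typically offsets the camera for each eye; the struct, the variable names, and the 64 mm eye separation are all illustrative assumptions, not from any particular API.

```cpp
#include <cstdio>

// Stereo rendering in a nutshell: render the scene twice with the camera
// shifted half the eye separation to each side. Everything below is a
// made-up illustration, not code from OpenGL or Direct3D.
struct Vec3 { float x, y, z; };

int main()
{
    Vec3 camera        = { 0.0f, 1.7f, 5.0f };  // hypothetical head position
    Vec3 rightAxis     = { 1.0f, 0.0f, 0.0f };  // camera's local "right" direction
    float eyeSeparation = 0.064f;               // ~64 mm, a typical adult value

    Vec3 leftEye  = { camera.x - rightAxis.x * eyeSeparation * 0.5f,
                      camera.y - rightAxis.y * eyeSeparation * 0.5f,
                      camera.z - rightAxis.z * eyeSeparation * 0.5f };
    Vec3 rightEye = { camera.x + rightAxis.x * eyeSeparation * 0.5f,
                      camera.y + rightAxis.y * eyeSeparation * 0.5f,
                      camera.z + rightAxis.z * eyeSeparation * 0.5f };

    // A real renderer would now build a view matrix for each eye and draw the
    // frame twice, once into each half of the goggles' display.
    std::printf("left eye  at (%.3f, %.3f, %.3f)\n", leftEye.x, leftEye.y, leftEye.z);
    std::printf("right eye at (%.3f, %.3f, %.3f)\n", rightEye.x, rightEye.y, rightEye.z);
    return 0;
}
```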
Quote: Original post by Monder
The important thing is learning the underlying concepts; once you understand those you can quite happily jump between APIs (I started off learning GL and later Cg for shaders; when I decided to do something in DirectX it took me an afternoon to learn what I needed to know to make the jump), and you'll also be able to keep up to date with the latest developments.
An afternoon? If it's that easy to learn a new API, then learning a new DX version can't be that hard.
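To make that point about transferable concepts concrete, here is a rough sketch in plain C++ of the transform chain both APIs share; the Mat4 type and the function names are made up for illustration, and it glosses over the row-versus-column-vector conventions that differ between GL and D3D.

```cpp
#include <cstdio>

// The concept that carries over: composing the world/view/projection transform.
// Direct3D exposes three matrices (world, view, projection); classic OpenGL
// folds the first two into a single "modelview" matrix. The maths is the same.
struct Mat4 { float m[4][4]; };

Mat4 identity()
{
    Mat4 r = {};
    for (int i = 0; i < 4; ++i) r.m[i][i] = 1.0f;
    return r;
}

Mat4 multiply(const Mat4& a, const Mat4& b)
{
    Mat4 r = {};
    for (int row = 0; row < 4; ++row)
        for (int col = 0; col < 4; ++col)
            for (int k = 0; k < 4; ++k)
                r.m[row][col] += a.m[row][k] * b.m[k][col];
    return r;
}

int main()
{
    Mat4 world = identity(), view = identity(), projection = identity();

    // D3D-style naming: world * view * projection.
    // GL-style naming: the same thing, with world*view treated as one modelview.
    Mat4 modelview     = multiply(world, view);
    Mat4 worldViewProj = multiply(modelview, projection);

    std::printf("worldViewProj[0][0] = %f\n", worldViewProj.m[0][0]);
    return 0;
}
```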
Quote: Original post by kunos
you're right.. much better to spend 15 years at the beach and then see what happens.
I get the point. No I'm not waiting around like that, don't worry. [smile]
[Edited by - Sappharos on April 15, 2009 5:27:26 AM]
Matthew Rule
http://matthyr.wordpress.com/projects/
Quote: Original post by Sappharos
Quote: Original post by Krokhin
3. Next-generation video cards turn off the CPU, load game data from the hard disk (or rather a flash disk) into their own memory, capture keyboard, mouse and joystick input...
What the hell difference does it make to the user how it works? [smile]
Aren't CPUs better for most tasks?
As much as NV don't want this to be true, that is the case.
GPUs are great for massively parallel operations because that grows naturally out of their target domain. However, as soon as you take that form of processing away, things aren't anywhere near as quick.
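A rough illustration of that difference in plain C++ (the sizes and data below are made up):

```cpp
#include <cstdio>
#include <vector>

// The first loop is data-parallel: every "pixel" is independent, which is
// exactly the shape of work that maps onto thousands of GPU threads.
// The second loop walks a linked list, where each step has to wait for the
// previous pointer load; that kind of serial, dependent work is where a fast
// CPU core still wins.
struct Node { float value; Node* next; };

int main()
{
    // GPU-friendly: a million independent elements, no dependencies between them.
    std::vector<float> pixels(1 << 20, 0.5f);
    for (std::size_t i = 0; i < pixels.size(); ++i)
        pixels[i] = pixels[i] * 1.1f + 0.01f;

    // GPU-unfriendly: build a linked list, then traverse it node by node.
    std::vector<Node> nodes(1000);
    for (std::size_t i = 0; i < nodes.size(); ++i) {
        nodes[i].value = static_cast<float>(i);
        nodes[i].next  = (i + 1 < nodes.size()) ? &nodes[i + 1] : nullptr;
    }

    float total = 0.0f;
    for (Node* n = &nodes[0]; n != nullptr; n = n->next)
        total += n->value;  // each iteration depends on the previous one

    std::printf("pixel[0] = %f, list total = %f\n", pixels[0], total);
    return 0;
}
```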
Quote: Original post by Sappharos
Aren't CPUs better for most tasks?
It depends on the further fate of the APIs, in particular on how ray tracing support in GPUs develops.
Quote: Original post by LessBread
I understand the name of the Microsoft project is Midori and it was in the news last week because Microsoft hired an expert away from Sun (Microkernel expert Shapiro to join Microsoft Midori effort).
I have a friend on the Midori team, and actually Shapiro was the professor for my operating systems course. I'm actually sorta tempted to apply myself next year. It seems like really cool stuff, even though I'm pretty vague on what their final goal is wrt target market etc.
Quote: 1. 3D goggles that display two slightly different images, one for each eye.
This has been around for like thirty years, in various forms. Sony's been showing off a brand new version for the PS3 recently, in fact.
Quote: 2. A resolution so high you can't see the individual pixels.
Have you seen a 22 inch 1080p display? The DPI is rather high already (around 100 pixels per inch).
Quote: 3. A frame rate so high it doesn't cause eye-strain.
Would be an awful lot more relevant if we hadn't moved over to LCDs, which don't generate frequency-related eyestrain because they don't actually flicker or even ghost anymore.
Quote: 4. 64-bit colour, so that you can see a better range of colours.
We're using 128-bit color for most things now, actually. The problem is that we have to tonemap back into a range monitors can handle. Considering most monitors now are 18-bit with dithering, apparently most people don't care that much about color accuracy.
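For anyone curious what that tonemapping step looks like, here is a minimal sketch using the simple Reinhard curve; real engines use fancier operators, and the gamma value and sample colour below are just assumptions for illustration.

```cpp
#include <algorithm>
#include <cmath>
#include <cstdio>

// Rendering happens in high-precision floating-point colour, which then has to
// be tonemapped back into the 0-255 range a monitor can display. This uses the
// basic Reinhard curve (colour / (1 + colour)) plus a rough display gamma.
unsigned char tonemapChannel(float hdrValue)
{
    float mapped         = hdrValue / (1.0f + hdrValue);      // compress [0, inf) into [0, 1)
    float gammaCorrected = std::pow(mapped, 1.0f / 2.2f);      // assumed display gamma of 2.2
    float clamped        = std::min(std::max(gammaCorrected, 0.0f), 1.0f);
    return static_cast<unsigned char>(clamped * 255.0f + 0.5f);
}

int main()
{
    float hdrPixel[3] = { 4.0f, 1.0f, 0.25f };  // an over-bright HDR colour, made up
    std::printf("8-bit output: %u %u %u\n",
                tonemapChannel(hdrPixel[0]),
                tonemapChannel(hdrPixel[1]),
                tonemapChannel(hdrPixel[2]));
    return 0;
}
```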
SlimDX | Ventspace Blog | Twitter | Diverse teams make better games. I am currently hiring capable C++ engine developers in Baltimore, MD.