Next-Generation OpenGL design survey

They have 3 gfx APIs to look at (PS4, Mantle, DX12) if they can't cook up something good this time then they should probably all be shot.. from a cannon... into the sun.

"I AM ZE EMPRAH OPENGL 3.3 THE CORE, I DEMAND FROM THEE ZE SHADERZ AND MATRIXEZ"
My journals: dustArtemis ECS framework and Making a Terrain Generator

That's some powerful cannon you got there.
> They have 3 gfx APIs to look at (PS4, Mantle, DX12) if they can't cook up something good this time then they should probably all be shot.. from a cannon... into the sun.

Dude, do you realize just how much delta-v that actually is? Not to mention the whole aiming and accuracy thing. If you're off even just a little then you're going to miss, and that's a bloody expensive miss.
If your signature on a web forum takes up more space than your average post, then you are doing things wrong.
> Dude, do you realize just how much delta-v that actually is? Not to mention the whole aiming and accuracy thing. If you're off even just a little then you're going to miss, and that's a bloody expensive miss.

He could try to aim at something bigger than the sun so there is less chance to miss, something really big, something like... YO MAMMA.

"I AM ZE EMPRAH OPENGL 3.3 THE CORE, I DEMAND FROM THEE ZE SHADERZ AND MATRIXEZ"
My journals: dustArtemis ECS framework and Making a Terrain Generator
> He could try to aim at something bigger than the sun so there is less chance to miss, something really big, something like... YO MAMMA.
![5RbjYyx.gif](http://i.imgur.com/5RbjYyx.gif)
...
Ok, you got the name, now start to redesign that damn API from scratch. Some suggestions:

- no more finite state machine
- better-structured API
- shader compiler (yes!)
- no more extension hell
- feature levels! (without breaking them after a couple of updates like MS did..)
- copy the rest from AMD Mantle

Am I asking too much?
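For illustration, here's roughly what the first two points mean in code. The GL half uses real GL calls; everything prefixed `ng` is hypothetical, just a sketch of the explicit-object style that Mantle and D3D12 adopted:

```c
#include <stdint.h>
#include <GL/glew.h>   /* any GL loader works; GLEW is just for concreteness */

/* --- Today: GL is a global state machine --------------------------- */
void draw_gl(GLuint prog, GLuint vbo, GLsizei count)
{
    glUseProgram(prog);                   /* sets hidden context state   */
    glBindBuffer(GL_ARRAY_BUFFER, vbo);   /* more hidden context state   */
    glDrawArrays(GL_TRIANGLES, 0, count); /* result depends on all of it */
}

/* --- The wishlist: explicit objects, no globals --------------------- */
/* Everything below is hypothetical ("ng" = next-gen), loosely modelled
   on the Mantle/D3D12 style: state is baked into objects up front, and
   draws are recorded into a command buffer instead of touching globals. */
typedef struct ngCmdBuffer ngCmdBuffer;
typedef struct ngPipeline  ngPipeline;   /* shaders + fixed-function state */
typedef struct ngBuffer    ngBuffer;

void ngCmdBindPipeline(ngCmdBuffer *cmd, ngPipeline *pso);
void ngCmdBindVertexBuffer(ngCmdBuffer *cmd, ngBuffer *vbo);
void ngCmdDraw(ngCmdBuffer *cmd, uint32_t first, uint32_t count);

void draw_explicit(ngCmdBuffer *cmd, ngPipeline *pso,
                   ngBuffer *vbo, uint32_t count)
{
    ngCmdBindPipeline(cmd, pso);          /* validated at creation time  */
    ngCmdBindVertexBuffer(cmd, vbo);
    ngCmdDraw(cmd, 0, count);             /* recorded, not executed yet  */
}
```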
Direct3D 12 quick reference: https://github.com/alessiot89/D3D12QuickRef/
Is there any sensible reason why HLSL can't be used? The world surely doesn't need yet another shading language.
Direct3D has need of instancing, but we do not. We have plenty of glVertexAttrib calls.
> - no more extension hell

Apparently Mantle is still advocating extensions as the answer to API forward-compatibility.
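For anyone who hasn't lived it, the "extension hell" in question looks something like this in GL today. The query calls are real (GL 3.0+, assuming a loader); the feature-picking at the end is just a sketch:

```c
#include <string.h>
#include <GL/glew.h>   /* glGetStringi requires GL 3.0+; any loader will do */

/* Runtime probing: the price of the extension model. Every optional
   feature needs a check like this, plus a fallback code path. */
static int has_extension(const char *name)
{
    GLint i, n = 0;
    glGetIntegerv(GL_NUM_EXTENSIONS, &n);
    for (i = 0; i < n; ++i) {
        const char *ext = (const char *)glGetStringi(GL_EXTENSIONS, i);
        if (ext && strcmp(ext, name) == 0)
            return 1;
    }
    return 0;
}

static void upload_vertices(void)
{
    if (has_extension("GL_ARB_buffer_storage")) {
        /* fast path: persistently-mapped buffers */
    } else {
        /* fallback path: plain glBufferData, kept around forever */
    }
}
```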
> Is there any sensible reason why HLSL can't be used? The world surely doesn't need yet another shading language.
There's no sensible reason. "HLSL(tm)" itself might be owned by Microsoft (hence why nVidia Cg(tm) isn't called HLSL), but Khronos can clone the syntax and use a different name.
Off on a tangent -- HLSL/Cg were developed simultaneously, in cooperation between Microsoft and nVidia, so they were originally the same language, with HLSL(tm) and Cg(tm) being the two companies' names for their different implementations of this shared language. It's public knowledge that the Xbox 360 and Xbox One use HLSL, that the PS3 uses Cg, and that the PS4 uses "PSSL", which they basically describe as a blatant HLSL clone. That's all 4 major consoles basically using the same shader language, even though they use 4 extremely different graphics APIs.
HLSL is the C of GPU programming. Every API except for GL uses an HLSL derivative (something close enough to HLSL that it may as well be HLSL) -- every modern console/PC game engine I've worked on has written its shaders in HLSL, with a small "portability header" containing some #defines to mask the API/platform differences involved in supporting 8 APIs... Then for the Mac/Linux ports (the 9th API: OpenGL), we have to use a dodgy HLSL<->GLSL translator.
If Khronos are smart, they'll define a portable shader bytecode format, and then allow the creation of HLSL and GLSL front-end compilers which turn our source code into this portable bytecode. Then they can keep all their GLSL work and satisfy backwards compatibility for old GL users, while still allowing them to move into the modern world.
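That portability header is roughly this sort of thing -- a minimal sketch, where the `COMPILING_AS_GLSL` flag and the exact macro set are invented for illustration:

```c
/* portability.h -- a minimal sketch of the #define trick. Shaders are
   written once in HLSL-flavoured syntax; when targeting GL, the build
   defines COMPILING_AS_GLSL (a made-up flag) and this header maps the
   HLSL names onto their GLSL equivalents. It works because both
   languages share a C-style preprocessor. */
#if defined(COMPILING_AS_GLSL)
    #define float2          vec2
    #define float3          vec3
    #define float4          vec4
    #define float4x4        mat4
    #define lerp(a, b, t)   mix(a, b, t)
    #define saturate(x)     clamp(x, 0.0, 1.0)
    #define frac(x)         fract(x)
    #define mul(m, v)       ((m) * (v))
#endif
```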
. 22 Racing Series .
> Apparently Mantle is still advocating extensions as the answer to API forward-compatibility.

Didn't know @AMD liked extensions so much (I do not have access to the Mantle Beta SDK).
> Is there any sensible reason why HLSL can't be used? The world surely doesn't need yet another shading language.
Any resemblance to real shading languages, living or dead, is purely coincidental.
Direct3D 12 quick reference: https://github.com/alessiot89/D3D12QuickRef/
> There's no sensible reason. "HLSL(tm)" itself might be owned by Microsoft (hence why nVidia Cg(tm) isn't called HLSL), but Khronos can clone the syntax and use a different name. [...]
Ta, I'd suspected this was the case but my (admittedly rushed and half-baked) research didn't turn up anything.
On a slightly related tangent, and I'm not sure if this has percolated into the general awareness yet, but Windows 7 came out of mainstream support on Tuesday of last week (13th January). What this means is that security updates and bugfixes will continue, but no more new functionality. In turn, this is probably the clearest confirmation we have so far that D3D 12 will not be coming to Windows 7, and so it looks as though Microsoft are going to consciously and deliberately do the wrong thing for marketing reasons rather than technical ones. So it's kinda even more important that the ARB make some sensible choices and don't screw up again this time.
Direct3D has need of instancing, but we do not. We have plenty of glVertexAttrib calls.