He wants the basic version to have 16 nodes running at 2 gigahertz, producing 3 teraflops.
3 teraflops? Perhaps I'm just being ignorant, but a single nVidia GTX 590 already has that processing power.
I may have got the numbers wrong; it's been a while since I talked to him.
He had a sort of developer breakdown: quit his job, spent some time making bowls out of wood, moved to Leeds...
I was more interested in the inter-node communication he was designing than the overall speed. It optimises the network topology on the fly.
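For anyone curious what "optimising the topology on the fly" could look like in the simplest possible case, here's a toy sketch: each node periodically measures link latency to its peers and re-picks the lowest-latency next hop. This is purely illustrative and not his actual design; all the names and numbers are made up.

```cpp
#include <cstdio>
#include <limits>
#include <vector>

// Toy model: a node keeps a measured latency (in microseconds)
// for each link to a peer and periodically re-selects its next
// hop. A real adaptive-topology design would be far more involved.
struct Link {
    int peer;          // index of the neighbouring node
    double latency_us; // latest measured round-trip latency
};

// Pick the lowest-latency neighbour as the next hop.
int pickNextHop(const std::vector<Link>& links) {
    int best = -1;
    double bestLatency = std::numeric_limits<double>::max();
    for (const Link& l : links) {
        if (l.latency_us < bestLatency) {
            bestLatency = l.latency_us;
            best = l.peer;
        }
    }
    return best;
}

int main() {
    // Pretend node 0 measured these links this tick.
    std::vector<Link> links = {{1, 40.0}, {2, 12.5}, {3, 90.0}};
    std::printf("route via node %d\n", pickNextHop(links)); // route via node 2
    return 0;
}
```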
Still interesting to have that kind of horsepower from a different architecture. Like phantom was pointing out: GPUs are great if you've got 64 items that you want to run the exact same code on, but having something as powerful as a GPU that acts halfway more like a CPU would be very handy.
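To make that GPU-versus-CPU point concrete, here's a hedged little sketch in plain C++: the first loop applies the exact same operation to every element (the pattern GPUs excel at), while the second takes a different branch per element, which on a GPU forces the lockstep groups to execute both sides with lanes masked off. The function and values are invented for illustration.

```cpp
#include <cmath>
#include <cstdio>
#include <vector>

int main() {
    std::vector<float> data(64, 2.0f);

    // GPU-friendly: every element runs the exact same code, so
    // 64 lanes can execute this in lockstep with no divergence.
    for (float& x : data)
        x = std::sqrt(x) * 0.5f;

    // GPU-unfriendly: each element may take a different branch.
    // A lockstep group on a GPU ends up executing BOTH sides of
    // the branch with inactive lanes masked off; a CPU core just
    // predicts the branch and runs one side.
    for (std::size_t i = 0; i < data.size(); ++i) {
        if (i % 2 == 0)
            data[i] = std::exp(data[i]);
        else
            data[i] = std::log(data[i] + 1.0f);
    }

    std::printf("data[0] = %f\n", data[0]);
    return 0;
}
```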
. 22 Racing Series .
So does anyone here think we'll be using quantum computers on a useful level in our lifetimes? Ex: supercomputers, mainframes, a NASA/ISS/Mars Colony central computer... anything. And by lifetimes, I assume everyone on this thread is living until 100, at least.
2) Virtual transistors - Why can't coding emulate the digital transistor better than it does?
A transistor is just a switch, on or off, so I suppose that software already does this, converting binary into readable decimal number output.
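Taking the question at face value, emulating a transistor as an ideal on/off switch is trivial in software. Here's a toy sketch that wires two "switches" into a NAND gate and then reads a group of bits back as a decimal number; everything in it is invented for illustration, and real transistors behave nothing like this (see the reply about transients further down).

```cpp
#include <cstdio>

// Idealised transistor-as-switch logic: two series "switches"
// pull the output low only when both inputs are on, i.e. a NAND.
bool nandGate(bool a, bool b) {
    return !(a && b);
}

int main() {
    // Produce 4 bits from NAND logic, then convert the binary
    // word into "readable decimal number output".
    bool bits[4] = {
        nandGate(true, false),  // 1
        nandGate(true, true),   // 0
        nandGate(false, false), // 1
        nandGate(true, true),   // 0
    };

    int decimal = 0;
    for (bool b : bits)
        decimal = decimal * 2 + (b ? 1 : 0);

    std::printf("binary 1010 -> decimal %d\n", decimal); // 10
    return 0;
}
```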
And even in games and 3D modeling/CAD there hasn't really been much in the past 5-8 years that has really required a massive increase in power.
Talking about CAD and engineering specifically, nothing truly revolutionary has come out because nobody wants to rewrite the old code. The geometry kernels that CAD packages use were basically created in the 1970s and have been updated and modified periodically ever since. Some finite element solvers have existed in one form or another since the 1960s. The task of rewriting them from Fortran into something like C++ is so daunting that no one really wants to do it, let alone reconstruct the algorithms to take advantage of the GPU. Engineers are all about getting solutions, and they don't really care if it takes a long time.
Games have been pushing the envelope of technology to its limits, but there are problems in engineering that require a massive increase in power, and games could benefit from that work too. Personally, I'd like to be able to create a 3D model, apply loads and deform it, and get real engineering results back in real-time.
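Real-time structural response is a big ask (real FEA assembles and solves large sparse systems), but to give a flavour of the real-time direction, here's a minimal mass-spring sketch stepped with explicit Euler. It's a toy stand-in for a finite element solve, and all the values are invented.

```cpp
#include <cstdio>

// Toy 1-D "structure": a damped point mass on a spring, stepped
// with explicit Euler. Only a flavour of the real-time idea, not
// anything like a production FEA solver.
int main() {
    const double k = 50.0;   // spring stiffness (N/m), invented
    const double m = 1.0;    // mass (kg)
    const double c = 0.8;    // damping coefficient
    const double F = 10.0;   // applied load (N)
    const double dt = 0.001; // time step (s)

    double x = 0.0; // displacement
    double v = 0.0; // velocity

    // Step 2 simulated seconds; in a real-time app this loop
    // would run once per frame with the model's current loads.
    for (int i = 0; i < 2000; ++i) {
        double a = (F - k * x - c * v) / m; // Newton's second law
        v += a * dt;
        x += v * dt;
    }

    // Static answer for comparison: x = F / k = 0.2 m.
    std::printf("displacement after 2 s: %.4f m (static: %.4f m)\n",
                x, F / k);
    return 0;
}
```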
2) Virtual transistors - Why can't coding emulate the digital transistor better than it does?
A transistor is just a switch, on or off, so I suppose that software already does this, converting binary into readable decimal number output.
Not really just a switch; I could lecture you for hours about switching transients and gain factors. The question is unclear to begin with.
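For a taste of why "just a switch" undersells it: to first order, a transistor's output doesn't snap between levels but charges through an RC time constant, and in the active region the collector current is a gain factor times the base current. Here's a toy first-order sketch; all the component values are invented.

```cpp
#include <cmath>
#include <cstdio>

int main() {
    // First-order switching transient: the output node charges
    // towards Vdd through an effective resistance and capacitance,
    // V(t) = Vdd * (1 - exp(-t / (R * C))).
    const double Vdd = 5.0;   // supply voltage (V)
    const double R = 1e3;     // effective resistance (ohms)
    const double C = 10e-12;  // node capacitance (farads)
    const double tau = R * C; // time constant: 10 ns

    for (int n = 1; n <= 3; ++n) {
        double t = n * tau;
        double v = Vdd * (1.0 - std::exp(-t / tau));
        std::printf("t = %2d ns: V = %.2f V\n", int(t * 1e9), v);
    }

    // Gain factor: in the active region, the collector current is
    // roughly beta times the base current, I_C = beta * I_B.
    const double beta = 100.0; // typical small-signal current gain
    const double Ib = 20e-6;   // base current: 20 microamps
    std::printf("I_C = %.1f mA for I_B = 20 uA\n", beta * Ib * 1e3);
    return 0;
}
```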
They call me the Tutorial Doctor.
A lot of engineers are using Sketchup Pro, which uses Layout for design sheets. The old CAD method can already be scrapped, and Sketchup can be expanded with custom plugins. I've done a bunch of research on it.
The problem with engineering is what happens when things go wrong and the bridge collapses and the lawsuits start flying.
Engineers don't have the luxury of basically absolving themselves of all responsibility in an EULA.
"Oh, you designed it with Sketchup Pro. Why weren't you using industry standards? I'm sure if you were the design wouldn't have been flawed..."
"The multitudes see death as tragic. If this were true, so then would be birth"
- Pisha, Vampire: The Masquerade - Bloodlines
They call me the Tutorial Doctor.