Quote: Original post by Tunah
Yeah, there's always the cost of the card... unless, of course, it picks up and gets really popular. I think you'd need to upgrade graphics cards more often than the PPU, but that remains to be seen, I guess.
If it becomes popular then cheaper models will indeed follow. Unfortunately, it takes years to optimize hardware in terms of performance/cost, and there are many hurdles. Just to illustrate today's efficiency: they're using a fan to cool the PhysX P1, while my GeForce 6600 is passively cooled. By the time they've made it affordable for the mainstream gamer, NVIDIA/ATI/Intel/AMD will have the G90/R700/P9/K8L ready.
Quote: But they (probably) won't require the card in the first place, since Ageia has a software fallback. And as for the risk involved, do you mean they're risking something by supporting Ageia? Ghost Recon: Advanced Warfighter has support for it, and I hadn't heard of Ageia for long before I saw GR:AW. I think the developers just took a look at the benefits: by using Ageia's physics, they also support dedicated physics hardware. If the user doesn't have the card, it falls back on their software. Is there really that much risk involved in that?
Good question. What they risk is that people expect the advanced physics to run on their powerful system even if they don't have a PhysX card. Just have a look at this screenshot: Without/with PPU. I don't think it's unreasonable to expect those few extra particles in software mode as well. The NovodeX Rocket demo shows several hundred objects at very high framerates in software, without SSE optimizations!
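The idea being debated here, one physics interface where the PPU is just an optional accelerator behind it, can be sketched roughly like this. All names below are hypothetical for illustration; this is not the actual NovodeX/PhysX API:

```python
# Hypothetical sketch of a hardware/software physics fallback.
# None of these names come from the real NovodeX/PhysX SDK.

class SoftwareSolver:
    """CPU fallback: same interface, smaller simulation budget."""
    name = "software"
    max_debris = 200  # illustrative cap chosen by the game

class HardwareSolver:
    """PPU-accelerated path: same interface, larger budget."""
    name = "hardware"
    max_debris = 5000  # illustrative

def create_solver(ppu_present: bool):
    # The game asks for one solver at startup; the rest of the
    # code never needs to know which backend it actually got.
    return HardwareSolver() if ppu_present else SoftwareSolver()

solver = create_solver(ppu_present=False)
print(solver.name, solver.max_debris)  # software 200
```

The complaint about GR:AW then amounts to the software path's budget being set far lower than the CPU could actually handle, not to anything the fallback pattern itself forbids.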
For GR:AW I must add that the true risk is probably low, because they're so early and the difference between software and hardware physics in this game is frankly not that big. It's not 'gameplay' physics. But for, say, Unreal Tournament 2007, I don't think it will be tolerated if they don't offer the best performance and quality combination for every system configuration. If another game shows that it's possible to have advanced physics on a next-generation GPU or CPU, that becomes a real problem once it's mentioned on each and every review site.
Quote: I've only seen one Ageia demo movie, the CellFactor one. The action displayed in that movie alone makes me want to shell out the cash for a card. It says it is multiplayer, and it has A LOT of debris and objects flying around that (at least seemingly) affect the other players. How they solved the synchronization issues, I have no idea. But if this trend continues, lots of physics that matter even in multiplayer, thanks to PhysX, I'm all for it.
If it never gets better than that movie, then I'm not so sure I would buy it.
Fair enough. Here's one interesting quote from VR-Zone about CellFactor though: "However, this realism also means one thing: It is very taxing on your graphics card. Running at 1280x1024 without Anti-Aliasing or Anisotropic Filtering, our 7900GTX setup only managed an average of 37 frames per second, seeing a minimum of 8fps as recorded by FRAPS". That's a very powerful graphics card, a modest resolution and low quality settings, yet the framerate is very disappointing. On the same site, this graphics card renders Half-Life 2: Lost Coast at 1600x1200 4xAA 16xAF at 66 FPS! This makes me wonder whether it's actually limited by the PhysX card. Unfortunately, I couldn't find any CellFactor review using a less powerful graphics card...