Found my account; the three most recent AP (Anonymous Poster) replies are mine.
Quote:
Original post by C0D1F1ED
Quote:
Original post by Anonymous Poster
Rendering HDR in HL2 had quite a hit on my 6800 GT. So if HL2 did its physics calculations on the GPU as well, what would happen?
Why would you do that in the first place? Half-Life 2 typically has no more than a dozen moving objects.
Very true. I was just giving an example of something that puts a strain on the graphics card, and how the physics wouldn't exactly benefit from that.
Quote:
Original post by C0D1F1ED
Anyway, if you'd do it on the GPU it would lower the framerate a bit, obviously. But the real question is: what's most cost-effective? If you sold your GeForce 6800 GT today for 150$ and added 250$ (the price of a PhysX card), you'd be able to buy a GeForce 7800 GTX, GeForce 7900 GT, or a Radeon X1900XT. Or, for the price of a PhysX card, you could buy a second GeForce 6800 GT for an SLI configuration. I'm sure it would be able to handle the physics processing and get you much higher graphics performance.
Yeah, there's always the cost of the card... unless, of course, it picks up and gets really popular. I think you would need to upgrade graphics cards more often than the PPU, but that remains to be seen, I guess.
Quote:
Original post by C0D1F1ED
Quote:
Do you really expect next-gen games, which need to look next-gen to get sales, to be willing to downgrade how they look in order to process the physics on the GPU, so that their next-gen game is actually playable on next-gen graphics cards?
Next-generation games will no doubt have more advanced physics, but they can't expect mainstream gamers to pay 250$ just to enjoy them. GPU- and/or CPU-based solutions will be necessary to make the games popular. If Unreal Tournament 2007 demands a 250$ PhysX card for advanced physics and Quake Wars runs fine without it, which do you think will be the more popular game? Actually... that's hard to answer, because physics alone doesn't sell games. And that's exactly why they won't take the risk.
But they (probably) won't require the card in the first place, since Ageia has a software fallback. And as for the risk involved, do you mean they're risking something by supporting Ageia? Ghost Recon: Advanced Warfighter supports it, and I hadn't heard of Ageia until shortly before I saw GR:AW. I think the developers just looked at the benefits: by using Ageia's physics, they also support dedicated physics hardware, and if the user doesn't have the card, it falls back to software. Is there really that much risk involved in that?
Edit:
Quote:
Original post by C0D1F1ED
How would they tell the gamers with a high-end Direct3D 10 card and/or a Core 2 Duo that the game can't run at high quality because they didn't bother using the capabilities of this GPU/CPU to the fullest? Rephrased: What would they tell their customers when other games do manage to display superb physics and graphics without PhysX?
What I doubt, and I would love for anyone to tell me how this will turn out, is whether another game can display better physics and graphics than an equal graphics card paired with a PPU. That's really the only argument I have.
I've only seen one Ageia demo movie, the CellFactor one. The action in that movie alone makes me want to shell out the cash for a card. It says it's multiplayer, and it has A LOT of debris and objects flying around that (at least seemingly) affect the other players. How they solved the synchronization issues, I have no idea. But if this trend continues, with lots of physics that matter even in multiplayer thanks to PhysX, I'm all for it.
If it never gets better than that movie, then I'm not so sure I would buy it.
(really struggling to update my post, getting server errors all the time :\)
[Edited by - Tunah on May 26, 2006 4:30:28 AM]