PhysX is getting a lot of attention right now, but the reasons vary wildly. Since we haven’t taken a look at the technology in a while, this article’s goal is to see where things stand. We’ll also be taking an in-depth look at GPU PhysX performance, using both 3DMark Vantage and UT III.
In the three full years we’ve been following PhysX, progress has been slow. We kept hearing promises of PhysX-equipped PC games, but the only ones to actually see the light of day were either canceled (Auto Assault) or were more tech demo than full game (Warmonger, Cell Factor). Beyond that, most of the support has appeared in console gaming.
AGEIA needed PhysX to be picked up by a company with deep pockets in order for the technology to truly succeed, and I think we are on the right road now.
As it stands, Unreal Tournament III is the only game that supports PhysX with the help of NVIDIA’s latest GPU and PhysX driver. We’ve been told that more support is on the way, but I’m not sure which games will be included. As mentioned before, most of what we’ve seen so far has been little more than tech demos, so I’m unsure whether Warmonger and Cell Factor will be supported. It wouldn’t be much of a loss, though. Ghost Recon is one title confirmed to be gaining support, however, so fans of that series can rest easy.
As for upcoming titles, the names are far from plentiful, but now that PhysX is fully supported on the GPU, adoption is sure to look more appealing to developers everywhere. Backbreaker is one title in particular I’ve seen from AGEIA in the past, which is, simply put, a PhysX-enabled football game. From what I remember, the physics were really amped up, and colliding players looked incredibly realistic.
So while we don’t have the content right now, it seems likely that more developers will pick up the technology now that support is more widespread. Before, you needed an add-in PPU to experience the benefits, and no gamer (understandably so) wanted to pay $100 for that alone. With PhysX capable of running off your GPU, the gates are blown open for widespread adoption.
NVIDIA will be holding its first ‘visual computing’ convention this coming August, NVISION 08, where we will hopefully learn a lot more about the technology and its support. It would be the perfect venue to announce developers who’ve signed up to support PhysX, and it might very well happen there.
With these new drivers, PhysX support is added for NVIDIA’s three top-end GPUs, the 9800 GTX and GTX 260 / 280, but where’s the rest of the lineup? For whatever reason, support for other models took a little more effort, but they haven’t been forgotten. We should see new drivers next month that will open up support for all 8-series and 9-series GPUs.
But another predicament arises. Because PhysX requires a fair amount of your GPU’s power to run, and can slow gameplay as a result, do you really want to enable the feature on lower-end models? It may all depend on the game and resolution, but if support is built in, it costs you nothing to test it and see whether it works for you. As with gaming in general, the faster the GPU, the better it will handle physics calculations without much of an FPS loss.
In my tests, the 9800 GTX lost 10 FPS compared to what the PPU delivered, but the GTX 260 or GTX 280 might well show different results. There are no two ways about it… those cards are powerful, so I have a feeling that running PhysX off them would affect your overall FPS far less than what we saw here.
But even as it stands, on our GPU, which retails for around $230, the game ran 66% faster with the calculations running on the GPU rather than the CPU. These numbers should only continue to improve as GPUs become more powerful and the number of stream processors increases.
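For readers curious how such a percentage is derived, here is a quick sketch of the math. The frame rates below are purely illustrative placeholders, not our actual benchmark numbers:

```python
# Hypothetical frame rates, for illustration only -- not our benchmark data.
fps_cpu = 40.0   # physics calculated on the CPU
fps_gpu = 66.4   # physics offloaded to the GPU

# Percentage speedup of GPU-accelerated physics over CPU physics
speedup_pct = (fps_gpu - fps_cpu) / fps_cpu * 100
print(f"GPU physics is {speedup_pct:.0f}% faster")
```

With those placeholder figures, the formula works out to a 66% gain, matching the kind of improvement we measured.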
How about support on ATI cards, or others? I contacted NVIDIA’s Platform Products PR, Bryan Del Rizzo, on this and was told:
“PhysX is an open platform and supports both the CPU and GPU across a wide variety of platforms. This is why developers are choosing PhysX as their physics development platform. You would have to ask our competitors about their plans for GPU physics as well as CPU physics.”
I contacted AMD and Intel regarding their thoughts on PhysX, but both preferred not to comment.
What’s next for PhysX remains to be seen, but with the recent deployment of GPU support, the outlook is good. PhysX hasn’t been that impressive up to this point, but now, game developers might take the technology a lot more seriously. If they add support, a huge number of people will be able to take advantage of it, rather than the five or six people (sarcasm intended) with the PPU card.
As for ATI support, it’s a tough one to speculate on. If PhysX can run off NVIDIA’s GPUs, there is no reason it couldn’t run off ATI’s. Whether ATI will consider the technology is unknown, but the question is made tougher by the fact that the company is now working with Intel on Havok. If Intel and AMD pursue accelerated physics with Havok, as NVIDIA is doing with PhysX, we may very well find ourselves in the middle of another odd technology war. Both are likely to co-exist for a while, but it will take a few years before we see which one picks up the most steam.
It’s something we’ll be keeping a close eye on, though. One thing is certain… the next few months should be quite interesting as far as gaming and physics are concerned.
If you have a comment you wish to make on this review, feel free to head on into our forums! There is no need to register in order to reply to such threads.