If there was ever a time that PhysX was exciting, it'd be right now. Never before have I felt so confident in the technology, because it's obvious NVIDIA is taking it very seriously, as evidenced by the many new tech demos and upcoming games illustrated throughout this article. From here on out, things should only get better as adoption by game developers heats up.
The problem right now is the same one AGEIA faced a while ago: the main benefit of the technology can only really be seen in tech demos. A few games out there support the technology, but as far as I'm aware, no game supports the physics the entire way through, except for Cell Factor and other small 'games'.
On the other hand, games like GRAW 2 and UT III show exactly what PhysX is capable of, in addition to the tech demos that are far too much fun to goof around with. After playing with the new demos, I'd be surprised if adoption of the technology didn't pick up far more readily in the months ahead.
Of course, it's easy to get excited when everything is currently free. The 'PhysX Pack #1' features lots of goodies to get you started, and since many of you out there already have an NVIDIA 8, 9 or GTX series card, there's little reason not to download it once it becomes available (unless you have a stingy ISP). Even though some of the included items have been available for a while, the new tech demos are worth the download by themselves.
Upcoming support is a bit slim, but as I mentioned, I wouldn't be surprised to see things speed up. There is Metal Knight Zero, which is sure to be boring (judging from what I've seen in the tech demo), but it will still deliver the same goods that Cell Factor and Warmonger did. Then there is Backbreaker, an upcoming American football game that goes beyond using PhysX for graphics and applies it to the player AI, and from what I saw at CES 2007 (seriously), it could very well be the PhysX killer app.
Beyond that, there are Empire: Total War and Mirror's Edge, two games that actually look quite promising, so we can only hope they live up to the hype. With NVISION '08 right around the corner, I wouldn't be surprised if we learned of more developers who've jumped on the bandwagon. At least we can hope.
So where's ATI in all this? Nowhere right now, although NVIDIA has told me before that nothing stops other GPU vendors from supporting the technology. Of course, the problem isn't so much deciding to support it as the fact that they've already signed on with Intel to support Havok. It's all a very sticky situation. Intel doesn't currently have Havok running via hardware acceleration, but I'd be surprised if that weren't one of the features of Larrabee when it's released in 2009/10.
Is PhysX now worth getting excited about? Of course, especially given that everything up to this point has been free. If you already own a capable card, there's little reason not to give the new demos a try, and if you aren't a gamer, how in the world did you make it this far into the article?!
I mentioned briefly on the first page of this article that there will be a part three in this series, partly because this particular article was put together in a very short time. There are a few things I'd like to investigate further, such as memory use (and whether more memory will be of any benefit), CPU usage with various GPUs, more real-world benchmarking and a few other minor things. PhysX is hopefully here to stay, so such testing should be put to good use.
NVIDIA's first conference, NVISION '08, takes place in two weeks in San Jose. If you are interested in attending, tickets are available on their web site (prices differ depending on what you want to do). We'll be there reporting on whatever is of interest, so definitely stay tuned. If there are any PhysX updates, we'll make sure you are the first to know about them.
If you have a comment you wish to make on this review, feel free to head into our forums! There's no need to register in order to reply to such threads.