by Rob Williams on December 8, 2008 in Graphics & Displays
Two weeks ago, we published a performance comparison between NVIDIA’s GTX 260/216 and ATI’s HD 4870 1GB. What we found was that NVIDIA had the upper hand in both performance and efficiency. Today, we’re re-testing ATI’s card with the new Catalyst 8.12 driver to see if it boosts performance enough to sway our verdict on which is the better card.
While some popular game franchises are struggling to stay healthy, Call of Duty doesn’t have much to worry about. This is Treyarch’s third game in the series, and its first to be featured on the PC. Any worries leading up to this title were for naught, though, as Treyarch delivered on all of its promises.
To help keep things fresh, CoD: World at War focuses on battles not exhaustively explored in previous WWII-inspired games, including those fought in the Pacific, in Russia and in Berlin. Variety is definitely something this game pulls off well, so it’s likely to keep you on your toes right up to the end.
For our testing, we use a level called “Relentless”, as it’s easily one of the most intensive levels in the game. It features tanks, a large forest environment and even a few explosions. This level depicts the Battle of Peleliu, where American soldiers advance to capture an airstrip from the Japanese. It’s a level that’s both exciting to play and one that can bring even high-end systems to their knees.
Looking at these results, you’d imagine this was a “Meant to be Played” title, but that’s not the case at all. NVIDIA won this round fair and square. Their card showed rather significant gains, especially at our 2560×1600 resolution. The difference was only 8 FPS, but that’s noticeable when you’re dealing with framerates below 35 FPS.
As PC enthusiasts, we tend to be drawn to games that offer spectacular graphics… titles that reaffirm your belief that shelling out lots of cash for that high-end monitor and PC was well worth it. But it’s rare for a game to come along that is so visually demanding, it’s unable to run fully maxed out even on the highest-end systems on the market. In the case of the original Crysis, it’s easy to see that’s exactly what Crytek was going for.
Funnily enough, even though Crysis was released close to a year ago, the game still struggles to run at 2560×1600 with full detail settings today – and that’s without even factoring in anti-aliasing! Luckily, Warhead is better optimized and will run more smoothly on almost any GPU, despite looking just as gorgeous as its predecessor, as you can see in the screenshot below.
The game includes four basic profiles to help you adjust the settings to your system’s capabilities: Entry, Mainstream, Gamer and Enthusiast. The last of these is reserved for the beefiest rigs out there – unless you have a potent graphics card and are only running 1680×1050. We run our tests at the Gamer setting, as it’s very demanding on any current GPU and serves as a proper baseline for the level of detail hardcore gamers would expect from the game.
Here’s a good example of a game where NVIDIA’s badge on the box doesn’t mean everything. Although the GTX 260/216 beat out the ATI card in our last article, ATI’s driver improvements made a difference here, delivering higher results at our top two resolutions.