Date: December 8, 2008
Author(s): Rob Williams
Two weeks ago, we published a performance comparison between NVIDIA’s GTX 260/216 and ATI’s HD 4870 1GB. What we found was that NVIDIA had the upper hand, in both performance and efficiency. Today, we’re re-testing ATI’s card with their new 8.12 driver, to see if it can increase performance enough to sway our decision as to which is the better card.
Two weeks ago, we were challenged by NVIDIA to take a hard look at their GeForce GTX 260/216 card, and pit it against its leading competitor, ATI’s Radeon HD 4870 1GB. Their claim was that the GTX 260/216 would beat ATI’s card in every single test, in the games they chose. Given that the titles they chose are also the top hits for the season, their challenge intrigued me. It ended up becoming a piece of content I had a lot of fun with, and if you read the article, then you already know what conclusions were drawn.
As much as ATI fans would have liked to see the HD 4870 1GB perform a bit better, NVIDIA proved to have the better hand in almost every test conducted, including non-performance ones such as power efficiency, temperatures and, of course, price. I mentioned in that article that I’d follow up once ATI shipped their 8.12 Catalyst drivers to reviewers, and after spending half of the weekend on testing, all of the updated results can be found right here.
One reason NVIDIA dominated the last article was their 180 driver. That particular driver carried performance improvements that pushed their cards ahead of the ATI equivalents, and that’s exactly what happened in our results. Due to time constraints, though, I wasn’t able to test many games beyond the five that NVIDIA requested; Crysis Warhead was the only addition. Time was still a factor with this article, but I did manage to test both cards with two more games: S.T.A.L.K.E.R.: Clear Sky and Unreal Tournament III.
Neither of those two games is new, but we wanted to open the field a bit and see whether ATI’s HD 4870 is generally going to lag behind across most games available. It’s important to note also that four of the games being tested carry NVIDIA’s “The Way It’s Meant To Be Played” logo: Crysis Warhead, Dead Space, Far Cry 2 and Fallout 3. On ATI’s side, the only game to carry their logo is S.T.A.L.K.E.R.: Clear Sky.
When we see this happen, it’s usually safe to say that the game will favor that particular manufacturer’s card. As we’ll see in our results today, that’s going to be the case more often than not. While I don’t believe that either ATI or NVIDIA pays developers to optimize for their cards, they do offer help, and from what we see in recent marketing, NVIDIA is clearly more proactive at this than ATI.
I already covered all the bases in the last article, so I won’t repeat myself except to say that we have some pretty equal competition here. As we saw in the last article, during our price comparison, NVIDIA won consistently, with their card averaging $30 less than ATI’s. It’s hard to ignore facts like that, so it’s no surprise that so many ATI fans were interested in seeing how the 8.12 drivers improved the situation. It became obvious that ATI had to do something, and the most logical thing would be to adjust prices. ATI did confirm to us that price cuts are en route, but that’s about all the information we could be given.
Oddly enough, the article from two weeks ago is one that caused a lot of debate around the web, in all places except our forum, where the discussion was rather tame. There’s an obvious passion hovering around any subject that involves both ATI and NVIDIA, and it’s amusing to read all of the varying opinions. In one particular forum, we were accused of not doing enough testing, and I hope we can remedy that by including the two extra titles here today.
The fact is, we could exhaust every possibility, but that doesn’t serve much purpose, except to take more of our time. In order to give a general overview of which card is better, we chose eight games to use, along with 3DMark Vantage, and I truly believe that even that much is overkill, but it’s definitely a good way to better draw conclusions.
Let’s get this show on the road, then, and take a look at ATI’s latest drivers, to see if real improvements will be seen. As always, our testing methodology and system setup are listed on the following page, and I highly recommend you read through it, if you haven’t before. Direct screenshots of the setting screens of each title used are also shown there, so you can see exactly what we are running with.
At Techgage, we strive to make sure our results are as accurate as possible. Our testing is rigorous and time-consuming, but we feel the effort is worth it. In an attempt to leave no question unanswered, this page contains not only our testbed specifications, but also a fully-detailed look at how we conduct our testing.
If there is a bit of information that we’ve omitted, or you wish to offer thoughts or suggest changes, please feel free to shoot us an e-mail or post in our forums.
The table below lists our testing machine’s hardware, which remains unchanged throughout all GPU testing, aside from the graphics card. Each card used for comparison is also listed here, along with the driver version used. Each URL in this table can be clicked to view the respective review of that product, or, if a review doesn’t exist, it will bring you to the product page on the manufacturer’s website.
For each card, we used the absolute latest driver version possible. For NVIDIA, we used 180.48, which became a significant release for the company. It brought multi-monitor SLI, dedicated GPU PhysX and general performance enhancements to the table. ATI’s 8.11 Catalyst driver wasn’t as significant a release, but their upcoming 8.12 aims to deliver important performance increases, similar to NVIDIA’s release.
Intel Core 2 Extreme QX9770 – Quad-Core, 3.6GHz (Overclocked), 1.35v
ASUS Rampage Extreme – X48-based, 0501 BIOS (08/28/08)
Corsair XMS3 DHX 2x2GB – DDR3-1333 7-7-7-15-1T, 1.91v
Diamond Radeon HD 4870 1GB (Catalyst 8.12)
XFX GeForce GTX 260 / 216 896MB (GeForce 180.48)
When preparing our testbeds for any type of performance testing, we follow these guidelines:
To aid with the goal of keeping results accurate and repeatable, we prevent certain services in Windows Vista from starting up at boot. This is because these services have a tendency to start up in the background without notice, potentially causing slightly inaccurate results. Disabling “Windows Search”, for example, turns off the OS’ indexing, which can at times utilize the hard drive and memory more than we’d like.
For graphics card reviews involving a mid-range card or higher, we test at three popular resolutions that span the mid-range to high-end ground, corresponding to monitor sizes of 20″ (1680×1050), 24″ (1920×1200) and 30″ (2560×1600).
In an attempt to offer “real-world” results, we do not utilize timedemos in our graphic card reviews, with the exception of Futuremark’s automated 3DMark Vantage. Each game in our test suite is benchmarked manually, with the minimum and average frames-per-second (FPS) captured with the help of FRAPS 2.9.6.
To deliver the best overall results, each title we use is exhaustively explored in order to find the best possible level in terms of intensiveness and replayability. Once a level is chosen, we play through it repeatedly to find the best possible route, and then, in our official benchmarking, we stick to that route as closely as possible. Since we are not robots and the game can throw in minor twists with each run, no two runs are identical to the pixel.
Although for most of our GPU content we run through a given setting twice, we upped that number to three here, since we are doing a direct head-to-head and want to fine-tune our results as much as possible.
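Since FRAPS reports a minimum and average FPS for each manual run, the three runs per setting are boiled down into a single pair of numbers. The sketch below illustrates that aggregation step; the function name and run values are our own hypothetical example, not FRAPS output:

```python
# Sketch: aggregate FRAPS-style results from three manual runs of one setting.
# The run data here is hypothetical; FRAPS itself only logs the raw figures.

def aggregate_runs(runs):
    """Each run is a (min_fps, avg_fps) tuple; return the per-metric mean."""
    n = len(runs)
    mean_min = sum(r[0] for r in runs) / n
    mean_avg = sum(r[1] for r in runs) / n
    return round(mean_min, 1), round(mean_avg, 1)

# Three hypothetical runs at 2560x1600:
runs = [(31.0, 44.2), (29.5, 43.8), (30.5, 44.6)]
print(aggregate_runs(runs))  # -> (30.3, 44.2)
```

Averaging three runs rather than taking the best one smooths out the minor route-to-route variation mentioned above.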
The eight games we chose for this article are shown below, with direct screenshots of their respective setting screens.
While some popular game franchises are struggling to keep themselves healthy, Call of Duty doesn’t have much to worry about. This is Treyarch’s third go at a game in the series, and the first of theirs to be featured on the PC. All worries leading up to this title were for naught, though, as Treyarch delivered on all promises.
To help keep things fresh, CoD: World at War focuses on battles not exhaustively explored in previous WWII-inspired games. These include battles which take place in the Pacific region, Russia and Berlin, and variety is definitely something this game pulls off well, so it’s likely to keep you on your toes until the end of the game.
For our testing, we use a level called “Relentless”, as it’s easily one of the most intensive levels in the game. It features tanks, a large forest environment and even a few explosions. This level depicts the Battle of Peleliu, where American soldiers advance to capture an airstrip from the Japanese. It’s a level that’s both exciting to play and one that can bring even high-end systems to their knees.
By looking at these results, you’d imagine that this was a “Meant to be Played” game, but that’s not the case at all. NVIDIA won this round, fair and square. Their card showed rather significant gains, especially at our 2560×1600 resolution. The difference was only 8 FPS, but that’s noticeable when you are dealing with framerates in the region of 35 FPS.
As PC enthusiasts, we tend to be drawn to games that offer spectacular graphics… titles that help reaffirm your belief that shelling out lots of cash for that high-end monitor and PC was well worth it. But it’s rare when a game comes along that is so visually-demanding, it’s unable to run fully maxed out on even the highest-end systems on the market. In the case of the original Crysis, it’s easy to see that’s what Crytek was going for.
Funnily enough, even though Crysis was released close to a year ago, the game still has difficulty running at 2560×1600 with full detail settings today – and that’s without even factoring in anti-aliasing! Luckily, Warhead is better optimized and will run more smoothly on almost any GPU, despite looking just as gorgeous as its predecessor, as you can see in the screenshot below.
The game includes four basic profiles to help you adjust the settings based on how good your system is. These include Entry, Mainstream, Gamer and Enthusiast – the latter of which is for the biggest of systems out there, unless you have a sweet graphics card and are only running 1680×1050. We run our tests at the Gamer setting as it’s very demanding on any current GPU and is a proper baseline of the level of detail that hardcore gamers would demand from the game.
Here’s one good example of a game where NVIDIA’s badge on the box doesn’t mean everything. Although their GTX 260/216 beat out the ATI card in our last article, ATI’s driver improvements helped improve things a bit here, achieving higher results in our top two resolutions.
Not to be confused with the general mass that floats around inside my head, Dead Space is a cross between Quake IV and Resident Evil, offering survival horror action that’s surprisingly great. In fact, almost all of the game’s initial reviews have praised the title for its story (rare to see), gameplay and overall experience. It might be a Resident Evil in space, but EA showed that they could still impress even in a relatively over-saturated genre.
The game is innovative in more than one way, the main one being the unusual player view. Rather than being situated directly behind the player, the camera is pushed slightly to the left, for more of an over-the-shoulder view. It’s odd at first, but after a while of playing, it grows on you. Another notable innovation is the near-total lack of a HUD, which appears only when needed. It seems minor, but the experience looks far better without a HUD filling up a quarter of the screen.
The level we use for testing is “New Arrivals”, which is Chapter 1 of the game. Our run is based on a saved-game file from NVIDIA that starts us 50 minutes into the game, and it gives us a perfect opportunity to do a quick run through a certain area of the ship. No battles take place, but we do shoot a gas canister along the way, because after all, explosions are fun.
In the last article, NVIDIA dominated ATI in Dead Space, and nothing has changed here. It isn’t just that NVIDIA had better results; the word “domination” comes to mind. As I mentioned in the previous article, some oddity in the combination of our test monitor (Gateway XHD3000) and NVIDIA’s card caused problems. It’s not that 2560×1600 didn’t work on NVIDIA’s card, it’s that it wasn’t even a selectable option. Why it works with ATI’s card, I have no idea. I’m still researching the issue.
If there is one game that’s been hyped up this year for the PC, it would have to be Fallout 3. Building on the foundation that has captivated countless fans, the third game in the series instantly became a well-respected action role-playing game… a genre we rarely ever see great games from. Bethesda did a masterful job here.
It’s not often that a PC game gets released that delivers over 50 hours of gameplay, but this is one of them. In fact, if you beat the game in 30 hours, you probably didn’t play it right. I’ve even seen accounts of people who’ve played it for 80 hours. Now that’s value! The game takes place in 2277, with you as a survivor of a nuclear ‘fallout’. Your father mysteriously disappears, and it’s up to you to find him. Along the way, you meet many friends, kick a lot of ass, and take in a world that’s very dirty, but still fun to explore at the same time.
The area in the game we use for testing is Dupont Circle, and the test consists of a general walk through the area, with light combat along the way, in addition to a few good explosions… something there is a lot of in this game. You can see the heart of the area in the screenshot above. To make sure our character doesn’t die during the play through (there are a lot of landmines), we use God mode.
ATI’s driver improvements shine through here, but that in itself isn’t what’s so interesting about these results. Here, we see an odd effect where ATI’s card performed worse than the opposing card at the lower resolutions, but better at the high end. When I noticed this oddity while preparing the results, I hopped back on the testing machine to retest both cards at each resolution, and the results didn’t budge.
Sequels are common, and most of the games used here prove it. What’s different with Far Cry 2, though, is that while the other sequels here don’t throw you for a loop when you first load them up, and generally give you what you’d expect to see, this game does the absolute opposite. We knew for months that Far Cry 2 wasn’t going to be a direct continuation of the original, but for the most part, this game could have gone by any other name and no one would have made a connection. Luckily for Ubisoft, though, the game can still be great fun.
Like the original, this game is a first-person shooter that offers open-ended gameplay, similar to S.T.A.L.K.E.R. You’ll be able to roam the huge map (50 km²) of a central African state, which will mostly be traversed by vehicle, as walking even a small fraction of the map in any direction gets very tedious after a while. This game is a perfect GPU benchmark simply because the graphics are better than average, with huge draw distances, realistic nature and even a slew of animals to pass by (and kill, if you are evil enough).
Our run through takes place in the Shwasana region, and consists of leaving a small hut and walking towards four people prepared to kill me for no apparent reason (except that this is a game). After the opponents are eliminated, a walk along the dirt road continues for another twenty seconds until we reach a small hut with supplies.
Nothing changed with NVIDIA’s card here, and it’s still the better of the two, although ATI’s card did do a great job of catching up.
Not too many game publishers can brag about a track record like Valve’s. None of their major games has ever launched to anything but praise, which goes to show that not rushing a release to please investors can make a huge difference. Take Half-Life 2, Team Fortress 2 and Portal, for example.
Left 4 Dead is one game I didn’t take seriously up until its launch. After playing it, though, my opinions changed drastically, and even as I type this, I feel like saving the document and going to play. But I’m also scared of zombies, so continue writing I shall. Like Dead Space, this game is a survival shooter, but unlike that game, this title focuses completely on co-op. For the most part, the game is dull in single-player, but team up with three of your friends and let the laughs and excitement begin.
The portion of the level we use for testing is contained within the No Mercy campaign. The ultimate goal of the entire campaign is to make it to the top of a hospital in order to be picked up and brought off to safety. Our run through takes place in the final part of the campaign, which leads up towards the rooftops. If one thing can be said about this title, it’s that causing a Boomer to explode (as seen in the above screenshot) proves to be one of the most satisfying things to do in any game I’ve played in a while.
NVIDIA’s card here once again proves to be the superior one, helped by the fact that ATI’s card saw virtually no improvement whatsoever with the new driver. So how about the other games we mentioned on the front page? Those are up next.
When it comes to first-person shooters, post-apocalyptic adventures are a dime a dozen. But when S.T.A.L.K.E.R. was first released in the spring of 2007, it dared to be different. How? By basing the game on a real-world tragedy, the Chernobyl nuclear disaster, which occurred back in 1986 near the city of Prypiat, Ukraine. Despite the disaster happening so long ago, people are still unable to live in the surrounding area, and will be unable to for at least another 150 years.
In addition to the game’s real-world ties, S.T.A.L.K.E.R. happened to be one of the grittiest, most realistic (aside from the problematic AI) and most expansive games we’d seen on the PC in a while. Having the ability to roam as you like is a huge benefit and really helped make the game feel real. Clear Sky further delivers on what made the original so great, but at the same time, adds support for DX10.
It might be difficult to judge from the screenshot, but Clear Sky (like the original) is one of the most demanding games on the PC today, especially if you wish to play using DX10. To help push all of our GPUs to their breaking-point, we stick to that mode while using the “High” quality setting.
Although I believe Clear Sky to be one of the most unoptimized games available (it’s a problem when it runs like frozen tar at virtually any resolution in DX10 mode), ATI’s badge belongs on the box. NVIDIA took the reins in most of the tests here, but ATI deserves this one.
As odd as it may seem, every single game we currently use for our graphics card benchmarking is a sequel or an entry in a series, including this one. The original Unreal Tournament launched in late 1999, and since then, it has become a staple of GPU benchmarking. Similar to Call of Duty, the UT series is one that manages to deliver spectacular graphics, but doesn’t require a bleeding-edge machine to see them.
UTIII offers a variety of modes and levels, and has some of the most interesting and lush environments ever seen in a video game. If I could choose where I wanted to die, it would most likely be in the Gateway level, which you can see in the screenshot below. This level is one of the most interesting in the game as it’s essentially three levels in one, linked together with portals – and it’s hard to beat the feeling of scoring a portal frag.
The game might be one of the best-looking currently on the PC, but it doesn’t offer robust in-game settings like some others in our suite. Because of this, we are forced to enable anti-aliasing in the control panel of the current graphics card. Both ATI’s and NVIDIA’s drivers allow us to choose 4xAA, so that’s what we stick with throughout all of our testing.
Here, we saw a similar effect as we did with Fallout 3. ATI’s card didn’t come out on top at the lower resolutions, but at 2560×1600, the winner is clear. Also like Fallout 3, these results struck me as odd, simply because I wasn’t expecting them, but the game was retested on both configurations, and the results stuck.
Although we generally shun automated gaming benchmarks, we do like to run at least one to see how our GPUs scale when used in a ‘timedemo’-type scenario. Futuremark’s 3DMark Vantage is without question the best such test on the market, and it’s a joy to use, and watch. The folks at Futuremark are experts in what they do, and they really know how to push that hardware of yours to its limit.
The company first started out as MadOnion and released a GPU-benchmarking tool called XLR8R, which was soon replaced with 3DMark 99. Since that time, we’ve seen seven different versions of the software, including two major updates (3DMark 99 Max, 3DMark 2001 SE). With each new release, the graphics get better, the capabilities get better and the sudden hit of ambition to get down and dirty with overclocking comes at you fast.
Similar to a real game, 3DMark Vantage offers many configuration options, although many (including us) prefer to stick to the profiles which include Performance, High and Extreme. Depending on which one you choose, the graphic options are tweaked accordingly, as well as the resolution. As you’d expect, the better the profile, the more intensive the test.
Performance is the stock mode that most use when benchmarking, but it only uses a resolution of 1280×1024, which isn’t representative of today’s gamers. Extreme is more appropriate, as it runs at 1920×1200 and does well to push any single or multi-GPU configuration currently on the market – and will do so for some time to come.
Although ATI’s drivers did manage to help the performance a little bit, 3DMark shows NVIDIA’s card to be at the top, especially as the resolutions increase.
Since we’ve had so many graphs already, the power consumption and temperature results are condensed into this final section. For power consumption, we use a Kill-a-Watt with nothing but the PC plugged in, while for GPU temperature tracking, we use GPU-Z, which does a fantastic job of exporting various bits of information to a clean text file.
To begin, the PC is left turned off for at least five minutes, and is then booted up and left to sit at the Windows desktop for another five minutes. At that point, both the idle wattage and temperature are recorded. To stress the GPU for load information, 3DMark Vantage’s “Extreme” test is executed at 2560×1600. The space flight test is used exclusively here, looped three times, with the results being recorded during a specific sequence in the run that seems to stress the GPU the most.
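Because GPU-Z writes its sensor readings to a plain text log, pulling the peak load temperature out of it is a simple parsing job. The sketch below assumes a comma-separated log with a "GPU Temperature [C]" column; the column name and sample readings are illustrative, and a real GPU-Z log will contain more columns:

```python
# Sketch: pull the peak GPU temperature out of a GPU-Z-style text log.
# The sample lines and column name are assumptions for illustration.

import csv
import io

SAMPLE_LOG = """Date , GPU Temperature [C] ,
2008-12-08 14:01:00 , 78.0 ,
2008-12-08 14:01:01 , 81.5 ,
2008-12-08 14:01:02 , 80.0 ,
"""

def peak_temperature(log_text, column="GPU Temperature [C]"):
    reader = csv.DictReader(io.StringIO(log_text), skipinitialspace=True)
    temps = []
    for row in reader:
        # Header fields keep trailing spaces; strip keys so lookups match.
        cleaned = {k.strip(): v for k, v in row.items() if k}
        temps.append(float(cleaned[column]))
    return max(temps)

print(peak_temperature(SAMPLE_LOG))  # -> 81.5
```

The same approach works for the idle reading by slicing the log to the five-minute desktop window before taking the maximum.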
Overall, it seems NVIDIA’s card doesn’t just excel in performance, but in both power consumption and temperatures as well. It loses at full load compared to the HD 4870 1GB, but given that its performance generally scaled beyond that 2%, it seems a fair trade. Not to mention that the idle wattage (where the PC will likely sit most of the time) is 14.5W lower.
In the last article, we took a look at current pricing, because in the end, that alone can make a world of difference. At that time, NVIDIA’s GTX 260/216 proved to be $30 less expensive on average, even before a mail-in rebate. So let’s take a look at current figures:
(Prices as of Dec 8, 2008)
Since our last look at these two cards, the landscape has changed dramatically. While NVIDIA was clearly the least-expensive offering just two weeks ago, that’s no longer the case. On ATI’s side, the current median is $263.32, while NVIDIA’s is $283.32. If mail-in rebates are taken into consideration, ATI’s revised median becomes $252.49, while NVIDIA’s is, oddly enough, identical to ATI’s without the rebates, at $263.32.
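To illustrate the arithmetic, here is a small sketch of how a median-price comparison with rebates works. The individual listing prices and the rebate amount below are hypothetical, chosen only so the medians line up with the figures quoted above:

```python
# Sketch: median-price comparison with mail-in rebates.
# Listing prices and the rebate amount are hypothetical examples.

from statistics import median

ati_prices    = [254.99, 263.32, 274.99]   # hypothetical ATI listings
nvidia_prices = [269.99, 283.32, 299.99]   # hypothetical NVIDIA listings

print(median(ati_prices))     # -> 263.32
print(median(nvidia_prices))  # -> 283.32

# A mail-in rebate simply lowers the effective listing price:
ati_after_rebate = [p - 10.83 for p in ati_prices]  # hypothetical $10.83 rebate
print(round(median(ati_after_rebate), 2))           # -> 252.49
```

We use the median rather than the mean so that a single outlier listing doesn’t skew the comparison.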
It’s rather incredible what stark changes are seen here. Despite the fact that NVIDIA’s offering was $30 less expensive on average as of two weeks ago, they’re now $20 more expensive. That’s an adjustment of $50! It’s as though the prices hiked as soon as we published just how great of a card the GTX 260/216 was. Whew, I didn’t know we had that kind of power!
When writing a head-to-head article and we see one of the models dominating the other, it’s not so difficult to come up with a conclusion. That was the case with our article two weeks ago. NVIDIA’s offering proved to be faster, more power efficient, and cheaper. It’s impossible to misinterpret that. NVIDIA clearly had the better card. It’s simply something that couldn’t be argued.
But, things have changed so much since then, that the decision of which card is better is far more difficult. Performance aside, NVIDIA’s card is still more efficient overall. It draws less power, and runs cooler. Those are two nice pluses if you are looking to be energy efficient and want your PC to run as cool as possible, while still packing the power you crave.
From the performance aspect, ATI made some nice improvements with their 8.12 driver, and they’re reflected here. Although the HD 4870 1GB didn’t manage to overtake NVIDIA in every test, it did catch up in some, and even pulled ahead in a couple, namely Crysis Warhead and Fallout 3. We also saw some great performance in the non-holiday titles, S.T.A.L.K.E.R.: Clear Sky and Unreal Tournament III. In fact, we found that ATI performed better in all the non-holiday titles used here.
To fine-tune our results a little bit more, let’s take our highest-available resolution for each game, 2560×1600. This resolution is the most intensive available right now, and as a result, it’s a good one for seeing where one GPU excels over another. If we take a look at performance data at that resolution for each game (1920×1200 for Dead Space), we see that ATI performed the best in four titles (Crysis Warhead, Fallout 3, Clear Sky and UT III), while NVIDIA led the other four (Call of Duty: WaW, Dead Space, Far Cry 2 and Left 4 Dead). From that standpoint, the cards appear to be almost identical, each having its own set of games in which it excels.
That wouldn’t have been the case two weeks ago, since ATI’s updated driver did improve performance in certain games, especially Fallout 3 and Crysis Warhead. Thanks to the driver, and the price fluctuation, ATI’s card is now a much more attractive offering. Why NVIDIA’s card rose in price is pretty obvious. NVIDIA had many editors take a look at the performance of their card, and when it became public knowledge just how great it was, the prices were jacked up, while ATI’s were lowered.
The conclusion? There is no conclusion. Given the pricing information above, I think both cards come out equal. ATI’s card costs $20 less, but isn’t quite as powerful as NVIDIA’s card in certain games (most notably, Call of Duty: World at War). On the other hand, NVIDIA’s card costs $20 more, but it runs a bit cooler, is more power efficient, and supports PhysX, which may be a big thing next year. It’s really difficult to conclude on this one, so it’s a matter of choosing what’s more important, money saved now, or the certain perks that NVIDIA’s card carries (namely PhysX). The good thing? It’s difficult to go wrong with either.
Have a comment you wish to make on this article? Recommendations? Criticism? Feel free to head over to our related thread and put your words to our virtual paper! There is no requirement to register in order to respond to these threads, but it sure doesn’t hurt!
Copyright © 2005-2020 Techgage Networks Inc. - All Rights Reserved.