Date: April 21, 2010
Author(s): Rob Williams
NVIDIA might have hoped for otherwise, but even after the GTX 480’s launch, AMD’s Radeon HD 5870 still proves to be an excellent choice for the price-point. We’re taking a look at PowerColor’s PCS+ version here, which includes a robust cooler, quieter operation, a slight overclock, and a complete copy of Call of Duty: Modern Warfare 2.
Late last month, NVIDIA unveiled its long-awaited GeForce GTX 480 graphics card. At the same time, it answered a lot of our questions, primarily how it would compare against ATI’s highest-end offering, the Radeon HD 5870. As we discovered, while the GTX 480 does indeed outperform the HD 5870 in most tests (~5 – 10%), the HD 5870 was arguably still the more attractive offering.
It seems odd to state such a thing, but the process of deciding between the two cards boiled down to a couple of things. The most important was the price. As the GTX 480 is priced at $100 above the HD 5870, it’s a tough sell for a 5 – 10% performance bump. To make the decision even harder, NVIDIA’s launch cards suffer from horrible temperatures and loud fan noise, and draw far more power from the socket than AMD’s offering.
From a personal standpoint, I sided with AMD’s card at the end of that article, because it beat the GTX 480 in almost everything aside from raw performance. Where the GTX 480 shines is with PhysX and 3D Vision support, which AMD’s cards obviously lack. These two things aren’t deal-breakers for me, but they are for many. Two personal friends of mine opted for NVIDIA’s latest offerings (GTX 470 and GTX 480) primarily because of the PhysX support. So, NVIDIA’s certainly doing something right there.
Right before the launch of the GTX 480, AMD seemed to know it had nothing to worry about, as it sent us a card via PowerColor that aims to reaffirm AMD’s excellent performance/$ ratio. The card in particular we’re looking at is the PCS+ version, which uses a special cooler, features slightly higher clocks and also includes a full copy of Call of Duty: Modern Warfare 2.
By this point in time, you’re all likely very familiar with the Radeon HD 5870. PowerColor’s model does little more than up the clocks (850MHz > 875MHz Core and 1200MHz > 1225MHz Memory) and swaps out the reference cooler for another that’s a bit quieter and more efficient. And of course as mentioned above, this particular version also currently includes a free copy of Modern Warfare 2.
|Radeon HD 5970| 1600 x 2 |
|Radeon HD 5870 Eyefinity 6| |
|Radeon HD 5870| |
|Radeon HD 5850| |
|Radeon HD 5830| |
|Radeon HD 5770| |
|Radeon HD 5750| 512MB – 1GB |
|Radeon HD 5670| 512MB – 1GB |
|Radeon HD 5570| 512MB – 1GB |
|Radeon HD 5450| 512MB – 1GB |
As it stands, the HD 5870 is still AMD’s highest-end single-GPU offering, with the dual-GPU HD 5970 sitting above it. For those craving a lot of screen real-estate, there’s the Eyefinity 6 version of the HD 5870 available as well. We’re in the process of preparing an in-depth look at that card and the six-display configuration, but are running far behind. Please stay tuned though as the article will be well worth it!
Below you can see PowerColor’s special cooler design. The shroud is a light plastic and allows a lot of air to make its way through. The fan is also large, and very quiet. As you can see, there are heatpipes protruding through the back, somewhat similar to NVIDIA’s GeForce GTX 480 design.
As you’d expect, the back of PowerColor’s card features the same video outputs we’ve come to expect from AMD’s HD 5000 series. This includes 2 x DVI, 1 x DisplayPort and 1 x HDMI. If you wanted, you could hook up four displays from this single card. In the same photo, you can see the CrossFireX bridge.
Here you can get a better view of the heatpipe design, and for the most part, it’s not far different from many other designs on the market. The heatsink itself is finned in its entirety, with the heatpipes coming straight from above the GPU core itself. The design seems simple, but it’s efficient.
That about wraps up the look at the card, but as a reminder, PowerColor is currently bundling in Modern Warfare 2 with this PCS+ version, so if that’s a game you’ve been on the lookout for, you get a good deal when picking up this particular model.
At Techgage, we strive to make sure our results are as accurate as possible. Our testing is rigorous and time-consuming, but we feel the effort is worth it. In an attempt to leave no question unanswered, this page contains not only our testbed specifications, but also a fully-detailed look at how we conduct our testing. For an exhaustive look at our methodologies, even down to the Windows Vista installation, please refer to this article.
The below table lists our testing machine’s hardware, which remains unchanged throughout all GPU testing, minus the graphics card. Each card used for comparison is also listed here, along with the driver version used. Each one of the URLs in this table can be clicked to view the respective review of that product, or if a review doesn’t exist, it will bring you to the product on the manufacturer’s website.
Intel Core i7-975 Extreme Edition – Quad-Core, 3.33GHz, 1.33v
Gigabyte GA-EX58-EXTREME – X58-based, F7 BIOS (05/11/09)
Corsair DOMINATOR – DDR3-1333 7-7-7-24-1T, 1.60v
|ATI Graphics|| Radeon HD 5870 1GB (PowerColor PCS+) – Catalyst 10.3|
Radeon HD 5870 1GB (Reference) – Catalyst 10.3
Radeon HD 5850 1GB (Sapphire Toxic) – Catalyst 10.2
Radeon HD 5850 1GB (ASUS) – Catalyst 9.10
Radeon HD 5830 1GB (Reference) – Beta Catalyst (02/10/10)
Radeon HD 5770 1GB (Reference) – Beta Catalyst (10/06/09)
Radeon HD 5750 1GB (Sapphire) – Catalyst 9.11
Radeon HD 5670 512MB (Reference) – Beta Catalyst (12/16/09)
Radeon HD 5570 1GB (Sapphire) – Beta Catalyst (12/11/09)
|NVIDIA Graphics|| GeForce GTX 480 1536MB (Reference) – GeForce 197.17|
GeForce GTX 295 1792MB (Reference) – GeForce 186.18
GeForce GTX 285 1GB (EVGA) – GeForce 186.18
GeForce GTX 275 896MB (Reference) – GeForce 186.18
GeForce GTX 260 896MB (XFX) – GeForce 186.18
GeForce GTS 250 1GB (EVGA) – GeForce 186.18
GeForce GT 240 512MB (ASUS) – GeForce 196.21
When preparing our testbeds for any type of performance testing, we follow these guidelines:
To aid with the goal of keeping accurate and repeatable results, we prevent certain services in Windows Vista from starting up at boot. This is because these services have a tendency to start up in the background without notice, potentially causing slightly inaccurate results. Disabling “Windows Search”, for example, turns off the OS’ indexing, which can at times utilize the hard drive and memory more than we’d like.
For more robust information on how we tweak Windows, please refer once again to this article.
At this time, we benchmark all of our games using three popular resolutions: 1680×1050, 1920×1080 and 2560×1600. 1680×1050 was chosen as it’s one of the most popular resolutions for gamers sporting ~20″ displays. 1920×1080 might stand out, since we’ve always used 1920×1200 in the past, but we didn’t make this change without some serious thought. After taking a look at the current landscape for desktop monitors around ~24″, we noticed that 1920×1200 is definitely on the way out, as more and more models are shipping with native 1080p panels, and it’s for that reason we made the switch. Finally, for high-end gamers, we also benchmark at 2560×1600, a resolution with just about twice the pixel count of 1080p.
For graphics cards that include less than 1GB of GDDR, we omit Grand Theft Auto IV from our testing, as our chosen detail settings require at least 800MB of available graphics memory. Also, if the card we’re benchmarking doesn’t offer the performance to handle 2560×1600 across most of our titles reliably, only 1680×1050 and 1920×1080 will be utilized.
Because we value results generated by real-world testing, we don’t utilize timedemos whatsoever. The possible exception might be Futuremark’s 3DMark Vantage. Though it’s not a game, it essentially acts as a robust timedemo. We choose to use it as it’s a standard where GPU reviews are concerned, and we don’t want to rid our readers of results they expect to see.
All of our results are captured with the help of Beepa’s FRAPS 2.98, while stress-testing and temperature-monitoring is handled by OCCT 3.1.0 and GPU-Z, respectively.
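Since FRAPS can log cumulative per-frame timestamps alongside its summary numbers, the average and minimum framerates we report boil down to simple arithmetic over those logs. Below is a minimal sketch of that math; the function name and the sample timestamps are our own illustration, not actual FRAPS output:

```python
def fps_stats(frame_times_ms):
    """Compute average and minimum FPS from cumulative per-frame
    timestamps in milliseconds, as logged by a tool such as FRAPS."""
    # Per-frame durations: difference between consecutive timestamps.
    durations = [b - a for a, b in zip(frame_times_ms, frame_times_ms[1:])]
    total_s = (frame_times_ms[-1] - frame_times_ms[0]) / 1000.0
    avg_fps = len(durations) / total_s
    # The slowest single frame gives the lowest instantaneous FPS.
    min_fps = 1000.0 / max(durations)
    return avg_fps, min_fps

# Example: frames ~16.7ms apart (~60 FPS) with one 50ms hitch.
avg, low = fps_stats([0.0, 16.7, 33.4, 83.4, 100.1])
print(round(avg, 1), round(low, 1))
```

The hitch drags the minimum far below the average, which is exactly why we look at both numbers rather than the average alone.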
When the original Call of Duty game launched in 2003, Infinity Ward was an unknown. Naturally… it was the company’s first title. But since then, the series and company alike have become household names. Not only has the series delivered consistently incredible gameplay, it’s pushed the graphics envelope with each successive release, and where Modern Warfare is concerned, it’s also had a rich storyline.
The first two titles might have been built on the already-outdated Quake III engine, but since then, the games have been built with improved graphical features, capable of pushing the highest-end PCs out there. Modern Warfare 2 is the first exception to that trend, as it’s more of a console port than a true PC title. Therefore, the game doesn’t push PC hardware as much as we’d like to see, but despite that, it still looks great, and lacks little in the graphics department. You can read our review of the game here.
Manual Run-through: The level chosen is the 10th mission in the game, “The Gulag”. Our teams fly in helicopters up to an old prison with the intention of getting closer to finding the game’s villain, Vladimir Makarov. Our saved game file begins us at the point when the level name comes on the screen, right before we reach the prison, and it ends after one minute of landing, following the normal progression of the level. The entire run takes around two-and-a-half minutes.
Because PowerColor’s card isn’t overclocked that far beyond the reference, we don’t expect to see major gains throughout our results, and for that reason, we’ve also decided to forgo our “Best Playable” testing, as it would be identical to our original HD 5870 results.
Thanks to its minor overclock, PowerColor’s card managed to surpass the GTX 480 by just a smidgen. In the real-world, though, the entire top-rung of cards (GTX 275+ and HD 5850+) all offer a very good experience even at maxed detail settings.
When the original Call of Juarez was released, it brought forth something unique… a western-styled first-person shooter. That’s simply not something we see too often, so for fans of the genre, its release was a real treat. Although it didn’t really offer the best gameplay we’ve seen from a recent FPS title, its storyline and unique style made it well-worth testing.
After we retired the original title from our suite, we anxiously awaited the sequel, Bound in Blood, in hopes that the series could be re-introduced into our testing once again. Thankfully, it could, thanks in part to its fantastic graphics, built around the Chrome Engine 4, and gameplay improved over the original. It was also well-received by game reviewers, which is always a good sign.
Manual Run-through: The level chosen here is Chapter I, and our starting point is about 15 minutes into the mission, where we stand atop a hill that overlooks a large river. We make our way across the hill and ultimately through a large trench, and we stop our benchmarking run shortly after we blow up a gas-filled barrel.
As we saw with the HD 5870 in our GTX 480 review, AMD really holds the crown where Bound in Blood is concerned, and things only improve if you boost the clocks even a little bit.
Like Call of Duty, Crysis is another series that doesn’t need much of an introduction. Thanks to the fact that almost any comments section for a PC performance-related article asks, “Can it run Crysis?”, even those who don’t play computer games no doubt know what Crysis is. When Crytek first released Far Cry, it delivered an incredible game engine with huge capabilities, and Crysis simply took things to the next level.
Although the sequel, Warhead, has been available for just about a year, it still manages to push the highest-end systems to their breaking-point. It wasn’t until this past January that we finally found a graphics solution to handle the game at 2560×1600 at its Enthusiast level, but even that was without AA! Something tells me Crysis will be the de facto standard for GPU benchmarking for a while yet.
Manual Run-through: Whenever we have a new game in-hand for benchmarking, we make every attempt to explore each level of the game to find out which is the most brutal towards our hardware. Ironically, after spending hours exploring this game’s levels, we found the first level in the game, “Ambush”, to be the hardest on the GPU, so we stuck with it for our testing. Our run starts from the beginning of the level and stops shortly after we reach the first bridge.
Crysis Warhead runs like a slug on almost any hardware, and despite the higher clocks of the PCS+, there is virtually no difference. On all cards, with the possible exception of the GTX 295, you’ll likely want to downgrade the detail to Mainstream to get truly playable framerates here.
Five out of the seven current games we use for testing are either sequels, or titles in an established series. F.E.A.R. 2 is one of the former, following up on the very popular First Encounter Assault Recon, released in fall of 2005. This horror-based first-person shooter brought to the table fantastic graphics, ultra-smooth gameplay, the ability to blow massive chunks out of anything, and also a very fun multi-player mode.
Three-and-a-half years later, we saw the introduction of the game’s sequel, Project Origin. As we had hoped, this title improved on the original where gameplay and graphics were concerned, and it was a no-brainer to want to begin including it in our testing. The game is gorgeous, and there’s much destruction to be had (who doesn’t love blowing expensive vases to pieces?). The game is also rather heavily scripted, which aids in producing repeatable results in our benchmarking.
Manual Run-through: The level used for our testing here is the first in the game, about ten minutes in. The scene begins with a travel up an elevator, with a robust city landscape behind us. Our run-through begins with a quick look at this cityscape, and then we proceed through the level until the point when we reach the far door as seen in the above screenshot.
The trend continues here with another good showing. AMD once again beat NVIDIA, although since we’re at a point where mainstream and higher cards all offer great performance, it’s hard to say that you’ll get a better experience on either the GTX 480 or HD 5870.
If you look up the definition for “controversy”, Grand Theft Auto should be listed. If it’s not, then that should be a crime, because throughout GTA’s many titles, there’s been more of that than you can shake your fist at. At the series’ beginning, the games were rather simple, and didn’t stir up too much anger among critics. But once GTA III and its successors came along, its developers enjoyed all the controversy that came their way, and why not? It helped spur incredible sales numbers.
Grand Theft Auto IV is yet another continuation in the series, though it follows no storyline from the previous titles. Liberty City, loosely based on New York City, is absolutely huge, with much to explore. So much so that you could literally spend hours just wandering around, ignoring the game’s missions, if you wanted to. It also happens to be incredibly stressful on today’s computer hardware, similar to Crysis.
Manual Run-through: After the first minor mission in the game, you reach an apartment. Our benchmarking run starts from within this room. From here, we run out the door, down the stairs and into an awaiting car. We then follow a specific path through the city, driving for about three minutes total.
Like Crysis Warhead, the slight boost in GPU clocks doesn’t make much of a difference with GTA IV, simply because the game is a hardware glutton if there ever was one.
If you primarily play games on a console, your choices for quality racing games are plenty. On the PC, that’s not so much the case. While there are a good number, there aren’t enough for a given type of racing game, from sim, to arcade. So when Race Driver: GRID first saw its release, many gamers were excited, and for good reason. It’s not a sim in the truest sense of the word, but it’s certainly not arcade, either. It’s somewhere in between.
The game happens to be great fun, though, and similar to console games like Project Gotham Racing, you need a lot of skill to succeed at the game’s default difficulty level. And like most great racing games, GRID happens to look absolutely stellar, and each of the game’s locations look very similar to their real-world counterparts. All in all, no racing fan should ignore this one.
Manual Run-through: For our testing here, we choose the city where both Snoop Dogg and Sublime hit their fame, the LBC, also known as Long Beach City. We choose this level because it’s not overly difficult, and also because it’s simply nice to look at. Our run consists of an entire 2-lap race, with the cars behind us for almost the entire race.
Similar to Bound in Blood, GRID is a game that can benefit from even minor clock boosts, and that’s seen here for the most part. The difference is small, and rather expected.
I admit that I’m not a huge fan of RTS titles, but World in Conflict intrigued me from the get go. After all, so many war-based games continue to follow the same story-lines we already know, and WiC was different. It sidesteps the actual political and economic collapse of the Soviet Union in the late 80’s, and instead provides a storyline as if the USSR had survived by going to war in order to remain in power.
Many RTS games, with their advanced AI, tend to favor the CPU in order to deliver smooth gameplay, but WiC leans on both the CPU and GPU, and the graphics prove it. Throughout the game’s missions, you’ll see gorgeous vistas and explore areas from deserts and snow-packed lands, to fields and cities. Overall, it’s a real treat for the eyes – especially since you’re able to zoom to the ground and see the action up-close.
Manual Run-through: The level we use for testing is the 7th campaign of the game, called Insurgents. Our saved game plants us towards the beginning of the mission with two squads of five, and two snipers. The run consists of bringing our men to action, and hovering the camera around throughout the duration. The entire run lasts between three and four minutes.
Up to this point, the GTX 480 hasn’t been able to definitively prove that it’s the better card when compared to the HD 5870, but Soviet Assault does a great job in helping its case. At all three resolutions, the GTX 480 clearly handles the game better than ATI’s single-GPU best.
Although we generally shun automated gaming benchmarks, we do like to run at least one to see how our GPUs scale when used in a ‘timedemo’-type scenario. Futuremark’s 3DMark Vantage is without question the best such test on the market, and it’s a joy to use, and watch. The folks at Futuremark are experts in what they do, and they really know how to push that hardware of yours to its limit.
The company first started out as MadOnion and released a GPU-benchmarking tool called XLR8R, which was soon replaced with 3DMark 99. Since that time, we’ve seen seven different versions of the software, including two major updates (3DMark 99 Max, 3DMark 2001 SE). With each new release, the graphics get better, the capabilities get better and the sudden hit of ambition to get down and dirty with overclocking comes at you fast.
Similar to a real game, 3DMark Vantage offers many configuration options, although many (including us) prefer to stick to the profiles which include Performance, High and Extreme. Depending on which one you choose, the graphic options are tweaked accordingly, as well as the resolution. As you’d expect, the better the profile, the more intensive the test.
Performance is the stock mode that most use when benchmarking, but it only uses a resolution of 1280×1024, which isn’t representative of today’s gamers. Extreme is more appropriate, as it runs at 1920×1200 and does well to push any single or multi-GPU configuration currently on the market – and will do so for some time to come.
For our last test, the PCS+ matches the reference card almost exactly. Its results are slightly behind the reference at 2560×1600, but that’s nothing to put much stock in. For all intents and purposes, the performance is the same.
Before tackling our overclocking results, let’s first clear up what we consider to be a real overclock and how we go about achieving it. If you read our processor reviews, you might already be aware that we don’t care too much for an unstable overclock. It might look good on paper, but if it’s not stable, then it won’t be used. Very few people purchase a new GPU for the sole purpose of finding the maximum overclock, which is why we focus on finding what’s stable and usable.
To find the max stable overclock on an ATI card, we stick to using ATI’s Catalyst Overdrive tool. Compared to what’s available on the NVIDIA side, it’s quite limited in the top-end, but it’s the most robust and simplest solution to use. For NVIDIA, we use EVGA’s Precision, which allows us to reach heights that are in no way sane – a good thing.
Once we find what we believe might be a stable overclock, the card is put through 30 minutes of torture with the help of OCCT 3.0’s GPU stress-test, which we find to push any graphics card harder than any other stress-tester we’ve ever used. If the card passes there, we then further verify by running the card through a 2x run of 3DMark Vantage’s Extreme setting. Finally, games are quickly loaded and tested out to assure we haven’t introduced any side-effects.
If all these tests pass without issue, we consider the overclock to be stable.
The reference clocks for the HD 5870 are once again 850MHz for the Core and 1200MHz for the memory. On the PCS+ card, those clocks are boosted to 875MHz for the Core and 1225MHz for the memory. The latter of the two is without question the hardest to increase further, as GDDR is simply finicky at times, especially without the ability to adjust voltages.
I tried to hit 1300MHz on the Memory, but it just wasn’t going to happen. 1275MHz did, however, and proved to be completely stable. For the Core, I was able to hit 940MHz, which is 65MHz above the PCS+ standard and 90MHz above reference. Going higher on either the Core or Memory, even when one or the other wasn’t overclocked at all, didn’t prove fruitful. So in the end, I stuck with 940MHz/1275MHz. Can we see a benefit?
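For reference, the percentage gains behind those clocks are simple to work out; here is a quick sketch of the arithmetic (the function name is our own):

```python
def oc_gain(reference_mhz, achieved_mhz):
    """Percent increase of an achieved clock over a reference clock."""
    return (achieved_mhz - reference_mhz) / reference_mhz * 100.0

# Reference HD 5870 clocks vs. the stable overclock found above.
core = oc_gain(850, 940)    # 90MHz over the reference Core
mem = oc_gain(1200, 1275)   # 75MHz over the reference Memory
print(f"Core +{core:.2f}%, Memory +{mem:.2f}%")
```

In other words, a ~10.6% Core boost and a ~6.3% Memory boost, which lines up with the modest framerate differences seen below.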
As usual, the differences seen from overclocking are minimal at best, but if you believe 3DMark Vantage, there is some nice gain to be seen, with almost a 1,000 point increase in the Extreme test. Again, I never recommend overclocking a GPU given the results are typically minimal, but there is some improvement if you wish to take that route.
To test our graphics cards for both temperatures and power consumption, we utilize OCCT for the stress-testing, GPU-Z for the temperature monitoring, and a Kill-a-Watt for power monitoring. The Kill-a-Watt is plugged into its own socket, with only the PC connected to it.
As per our guidelines when benchmarking with Windows, when the room temperature is stable (and reasonable), the test machine is booted up and left to sit at the Windows desktop until things are completely idle. Once things are good to go, the idle wattage is noted, GPU-Z is started up to begin monitoring card temperatures, and OCCT is set up to begin stress-testing.
To push the cards we test to their absolute limit, we use OCCT in full-screen 2560×1600 mode, and allow it to run for 30 minutes, which includes a one minute lull at the start, and a three minute lull at the end. After about 10 minutes, we begin to monitor our Kill-a-Watt to record the max wattage.
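Since the Kill-a-Watt reports whole-system draw at the wall, the number that matters for card-to-card comparisons is the jump from idle to load. A minimal sketch of that calculation follows; the readings are made up purely for illustration, not measured values:

```python
def gpu_load_delta(idle_w, load_w):
    """Approximate extra draw under GPU load, from whole-system
    wall readings in watts. Wall numbers include PSU inefficiency,
    so this only bounds the card's own draw, it doesn't isolate it."""
    return load_w - idle_w

# Hypothetical idle and full-load wall readings.
print(gpu_load_delta(160, 340))
```

This is also why two cards are only fairly compared on the same testbed: the idle baseline cancels out of the delta, but everything else in the system stays constant.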
Please note that both the reference card listed above and the one highlighted are the same card (we ran into an issue with our real reference card), so the temperatures aren’t as important as we’d like them to be. However, you can see the difference in temperature compared to, say, the HD 5850. The PCS+ is quite a bit faster, but barely any hotter.
Can the HD 5870, being a much faster card than the GTX 275, draw less power? The proof is in the pudding!
One thing is for certain… it’s hard to write a conclusion for a graphics card that’s not much different than a model you’ve taken a look at before. Despite the fact that I took a look at the reference HD 5870 last fall, I still couldn’t help but be interested in taking PowerColor’s PCS+ for a spin, and it’s mainly for the reasons that I’ve mentioned before, over and over.
AMD has a great thing going with the HD 5000 series, and in some regards, NVIDIA seems to be light-years behind the company in terms of power consumption and temperatures… two things that happen to mean a lot to me (few people will opt for a GPU that draws an additional 100W, after all). The room I do all my testing in is less-than-ideal temperature-wise, so when I use hardware that’s overly hot, it’s impossible to ignore.
Like the GTX 480, the HD 5000 series as a whole fully supports DirectX 11 and Eyefinity, and the main lacking feature compared to NVIDIA’s offerings is PhysX, which is understandable as NVIDIA doesn’t license it out to AMD (it claims that it could or would, but nothing’s certain).
Personally, I like PhysX, but it’s not something I’d weigh too heavily in a purchasing decision. I have a couple of friends as mentioned earlier who are the opposite, and despite the heat and power, both opted for NVIDIA’s latest offerings just to secure that support.
I still love the HD 5870, and if I had to purchase a card today, I’d have no hesitation in picking up this model. It’s a tad expensive, at $400, but even then, it’s still $100 less expensive than the GTX 480, and offers nearly the same performance (the GTX 480 is ~5 – 10% faster). For me, AMD’s superior power efficiency and lower temperatures seal the deal. You may weigh these two factors far differently than I do, however.
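That value argument can be made concrete with a little back-of-the-envelope math. The sketch below assumes the $400/$500 prices above and splits the quoted 5 – 10% performance gap down the middle at 7.5%; the figures are illustrative, not measured:

```python
def perf_per_dollar(relative_perf, price_usd):
    """Relative performance units per dollar (higher is better)."""
    return relative_perf / price_usd

# Normalize the HD 5870 to 1.0 and assume the GTX 480 is ~7.5%
# faster, the midpoint of the 5 - 10% range cited above.
hd5870 = perf_per_dollar(1.000, 400)
gtx480 = perf_per_dollar(1.075, 500)
print(f"HD 5870 advantage: {hd5870 / gtx480 - 1:.1%} more performance per dollar")
```

Even granting the GTX 480 its full performance lead, the HD 5870 comes out well ahead on a per-dollar basis, which is the crux of the conclusion here.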
If you opt for PowerColor’s PCS+ version, you’ll be paying about $20 more than for most other models. Is it worth it? I’d have to say it is, as long as you A) appreciate a slightly quieter cooler and improved temperatures and B) want a copy of Modern Warfare 2. Sapphire’s HD 5870 Vapor-X costs $20 more than the PCS+, so overall, PowerColor’s card looks to be quite a good deal when all aspects are taken into consideration.
Have a comment you wish to make on this article? Recommendations? Criticism? Feel free to head over to our related thread and put your words to our virtual paper! There is no requirement to register in order to respond to these threads, but it sure doesn’t hurt!
Copyright © 2005-2019 Techgage Networks Inc. - All Rights Reserved.