
ATI’s Radeon HD 5670 – DirectX 11 for $100

Date: January 15, 2010
Author(s): Rob Williams

AMD has delivered a couple of firsts over the past few months, and it’s keeping the tradition going with its release of the market’s first $100 DirectX 11-capable graphics card. Despite its budget status, the HD 5670 retains the HD 5000-series’ impressive power consumption and low idle temperatures, along with AMD’s Eyefinity support.



Introduction

Over the course of the past four months, AMD has hit a couple of firsts. On the CPU front, it delivered the industry’s first $100 quad-core, the Athlon II X4 620, and shortly before that, it released the market’s first DirectX 11 graphics cards. Today, the company once again succeeds in hitting another first, bringing the first $100 DirectX 11-capable GPU to market. This comes in the form of the modest-looking Radeon HD 5670.

With a $99 price tag for the 512MB model, the HD 5670 is a bit of an odd beast. The reason lies with the HD 5750, whose 512MB version retails for just $10 more, at least according to AMD’s suggested retail price. In the real world, pricing is closer to ~$130 for the 512MB version and ~$145 for the 1GB. With that being the case, a $99 offering (which actually is the current going price) makes much more sense. It’s still unfortunate that no e-tailer seems to be selling the step-up for its SRP, but at the same time, it’s no surprise.

I shouldn’t get too far ahead of myself, though, because a sub-$100 DirectX 11 graphics card is something that should be ushered in with a cheer. NVIDIA has yet to deliver a single DX11 offering, but AMD continues to execute its full-fledged release cycle without skipping a beat. If things go according to plan, AMD will launch its entire gamut of DX11 cards a month or two before NVIDIA’s Fermi hits the street, which is estimated to happen in March.

The HD 5670’s goals are what you’d expect for a $100 card (let’s not kid ourselves, $99.99 is $100). It’s designed to offer mainstream consumers a quality graphics offering that can handle all of today’s games at decent resolutions, while remaining power-efficient, and quiet. With what’s essentially 1/4th of an HD 5870, and a power rating of 14W/61W (idle/load), I think it’s safe to say that AMD’s latest card has as good a chance as any of hitting this mark.

ATI Radeon HD 5670 512MB and 1GB

Above is a shot of AMD’s reference design, along with Sapphire’s 1GB model, which includes an Arctic Cooling GPU cooler that, as we’ll find out later, is an incredible upgrade over what AMD slaps on there. Like its older siblings, the HD 5670 lacks nothing when it comes to its overall feature-set, so in addition to DirectX 11 support, we also have the ability to take advantage of Eyefinity, AMD’s robust multi-monitor technology.

Speaking of Eyefinity, one thing AMD wants us all to know is that support for it will only be increasing in 2010. In 2009, supported titles were DiRT 2, S.T.A.L.K.E.R.: Call of Pripyat (still unreleased in North America), BattleForge, Dragon Age: Origins and Tom Clancy’s H.A.W.X. In 2010, we’re due to see support for Aliens vs. Predator, Battlefield: Bad Company 2, Command & Conquer 4, Dawn of War, Dungeons & Dragons Online, Lord of the Rings Online, Mass Effect 2, Modern Warfare 2 and Supreme Commander 2. If I had to guess, this is just the tip of the iceberg, as multi-monitor gaming is most certainly bound to take off soon enough.

Model          | Core MHz | Mem MHz | Memory       | Bus Width | Processors
Radeon HD 5970 | 725      | 1000    | 2048MB       | 256-bit   | 1600 x 2
Radeon HD 5870 | 850      | 1200    | 1024MB       | 256-bit   | 1600
Radeon HD 5850 | 725      | 1000    | 1024MB       | 256-bit   | 1440
Radeon HD 5770 | 850      | 1200    | 1024MB       | 128-bit   | 800
Radeon HD 5750 | 700      | 1150    | 512 – 1024MB | 128-bit   | 720
Radeon HD 5670 | 775      | 1000    | 512 – 1024MB | 128-bit   | 400
Radeon HD 5500 | ???      | ???     | ???          | ???       | ???
Radeon HD 5450 | ???      | ???     | ???          | ???       | ???

As mentioned earlier, the HD 5670 is about 1/4th of the HD 5870 specs-wise, with the one additional cut being the memory bus, scaled back from 256-bit to 128-bit. In addition, because this card requires so little power, no PCI-E power connector is needed. You might notice a lot of question marks at the bottom of the chart, and that’s because AMD unveiled its plans to release the HD 5500 series and HD 5450 card within the next month, but neglected to give us specifications. We’ll just have to wait.

Most comparable to the HD 5670 we’re looking at today is NVIDIA’s GeForce GT 240, which costs about the same, or closer to $90 for most models if you play your “cards” right. Unfortunately, we haven’t benchmarked the GT 240 up to this point, but from talking to vendors, we expect the HD 5670 to surpass it in all tests, though not so starkly as to become the definitive winner. Still, in addition to faster performance, the HD 5670 also adds far improved power consumption and DirectX 11 support, plus Eyefinity if that’s a possibility down the road for you.

Sapphire Radeon HD 5670 1GB - Up-Close Fan

In the top photo for this article, you can see the rather simplistic cooler AMD has equipped its reference card with. The design is understandable, as this is not a high-end offering, and slim is definitely going to be the chosen path for those who want to use the card in an HTPC. But for those who don’t mind a little extra room being taken up, Sapphire’s cooling solution is more ideal, as it retains quiet operation and cools much, much better.

ATI Radeon HD 5670 Display Connectors

Both of the cards we received for testing have HDMI, DisplayPort and dual-link DVI ports, and with the included DVI-to-VGA adapter, no matter what your monitor wants to take, you’ll be good to go right out of the box. In addition, should you want to use Eyefinity, Sapphire also includes an HDMI-to-DVI cable, so you’ll essentially have two DVI ports and one DisplayPort at your disposal.

We’ve covered pretty well everything about the HD 5670, so let’s move into our test results… right after a quick peek at our test machine and methodologies.

Test System & Methodology

At Techgage, we strive to make sure our results are as accurate as possible. Our testing is rigorous and time-consuming, but we feel the effort is worth it. In an attempt to leave no question unanswered, this page contains not only our testbed specifications, but also a fully-detailed look at how we conduct our testing. For an exhaustive look at our methodologies, even down to the Windows Vista installation, please refer to this article.

Test Machine

The below table lists our testing machine’s hardware, which remains unchanged throughout all GPU testing, minus the graphics card. Each card used for comparison is also listed here, along with the driver version used. Each one of the URLs in this table can be clicked to view the respective review of that product, or if a review doesn’t exist, it will bring you to the product on the manufacturer’s website.

Component       | Model
Processor       | Intel Core i7-975 Extreme Edition – Quad-Core, 3.33GHz, 1.33v
Motherboard     | Gigabyte GA-EX58-EXTREME – X58-based, F7 BIOS (05/11/09)
Memory          | Corsair DOMINATOR – DDR3-1333 7-7-7-24-1T, 1.60
ATI Graphics    | Radeon HD 5870 1GB (Sapphire) – Catalyst 9.10
                | Radeon HD 5850 1GB (ASUS) – Catalyst 9.10
                | Radeon HD 5770 1GB (Sapphire Vapor-X) – Catalyst 9.11
                | Radeon HD 5770 1GB (Reference) – Beta Catalyst (10/06/09)
                | Radeon HD 5670 1GB (Sapphire) – Beta Catalyst (12/16/09)
                | Radeon HD 5670 512MB (Reference) – Beta Catalyst (12/16/09)
                | Radeon HD 4890 1GB (Sapphire) – Catalyst 9.8
                | Radeon HD 4870 1GB (Reference) – Catalyst 9.8
                | Radeon HD 4770 512MB (Gigabyte) – Catalyst 9.8
NVIDIA Graphics | GeForce GTX 295 1792MB (Reference) – GeForce 186.18
                | GeForce GTX 285 1GB (EVGA) – GeForce 186.18
                | GeForce GTX 275 896MB (Reference) – GeForce 186.18
                | GeForce GTX 260 896MB (XFX) – GeForce 186.18
                | GeForce GTS 250 1GB (EVGA) – GeForce 186.18
Audio           | On-Board Audio
Storage         | Seagate Barracuda 500GB 7200.11
Power Supply    | Corsair HX1000W
Chassis         | SilverStone TJ10 Full-Tower
Display         | Gateway XHD3000 30″
Cooling         | Thermalright TRUE Black 120
Et cetera       | Windows Vista Ultimate 64-bit

When preparing our testbeds for any type of performance testing, we follow these guidelines:

To aid in keeping results accurate and repeatable, we prevent certain Windows Vista services from starting up at boot. These services have a tendency to start up in the background without notice, potentially skewing results. Disabling “Windows Search”, for example, turns off the OS’ indexing, which can at times utilize the hard drive and memory more than we’d like.

For more robust information on how we tweak Windows, please refer once again to this article.

Game Titles

At this time, we benchmark all of our games using three popular resolutions: 1680×1050, 1920×1080 and 2560×1600. 1680×1050 was chosen as it’s one of the most popular resolutions for gamers sporting ~20″ displays. 1920×1080 might stand out, since we’ve always used 1920×1200 in the past, but we didn’t make this change without some serious thought. After taking a look at the current landscape for desktop monitors around ~24″, we noticed that 1920×1200 is definitely on the way out, as more and more models are shipping with native 1080p panels, which is why we made the switch. Finally, for high-end gamers, we also benchmark at 2560×1600, a resolution with just about 2x the pixels of 1080p.
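As a quick sanity check of that “about 2x 1080p” claim, the pixel arithmetic works out like this (a sketch of ours, not part of the original methodology):

```python
# Pixel counts for the three benchmarking resolutions used in this article.
resolutions = {
    "1680x1050": 1680 * 1050,  # ~1.76 megapixels
    "1920x1080": 1920 * 1080,  # ~2.07 megapixels
    "2560x1600": 2560 * 1600,  # ~4.10 megapixels
}

# 2560x1600 pushes just about twice the pixels of 1080p.
ratio = resolutions["2560x1600"] / resolutions["1920x1080"]
print(round(ratio, 2))  # → 1.98
```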

For graphics cards that include less than 1GB of GDDR, we omit Grand Theft Auto IV from our testing, as our chosen detail settings require at least 800MB of available graphics memory. Also, if the card we’re benchmarking doesn’t offer the performance to handle 2560×1600 across most of our titles reliably, only 1680×1050 and 1920×1080 will be utilized.

Because we value results generated by real-world testing, we don’t utilize timedemos. The one exception is Futuremark’s 3DMark Vantage; though it’s not a game, it essentially acts as a robust timedemo. We use it because it’s a standard where GPU reviews are concerned, and we don’t want to deprive our readers of results they expect to see.

All of our results are captured with the help of Beepa’s FRAPS 2.98, while stress-testing and temperature-monitoring are handled by OCCT 3.1.0 and GPU-Z, respectively.
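For the curious, here is roughly how minimum and average FPS figures fall out of a per-frame capture like the one FRAPS records. This is an illustrative sketch of ours (the function name and the synthetic data are made up, not FRAPS’ actual internals):

```python
from collections import Counter

def fps_stats(timestamps_ms):
    """Given per-frame timestamps in milliseconds, return (min_fps, avg_fps).

    Average FPS is frames rendered divided by total run time; minimum FPS
    is the lowest frame count observed in any full one-second bucket.
    """
    duration_s = (timestamps_ms[-1] - timestamps_ms[0]) / 1000.0
    avg_fps = (len(timestamps_ms) - 1) / duration_s

    # Count how many frames landed in each whole second of the run.
    buckets = Counter(int(t // 1000) for t in timestamps_ms)
    # Drop the partial first and last seconds so they don't skew the minimum.
    full_seconds = [buckets[s] for s in range(min(buckets) + 1, max(buckets))]
    min_fps = min(full_seconds) if full_seconds else avg_fps
    return min_fps, avg_fps

# A synthetic 3.5-second run at a steady 50 FPS (one frame every 20 ms):
times = [i * 20.0 for i in range(176)]
minimum, average = fps_stats(times)  # → 50, 50.0
```

The published numbers in our tables come from FRAPS itself; the sketch only shows the math behind a min/avg pair.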

Call of Duty: Modern Warfare 2

Call of Juarez: Bound in Blood

Crysis Warhead

F.E.A.R. 2: Project Origin

Race Driver: GRID

World in Conflict: Soviet Assault

Call of Duty: Modern Warfare 2

When the original Call of Duty game launched in 2003, Infinity Ward was an unknown. Naturally… it was the company’s first title. But since then, the series and company alike have become household names. Not only has the series delivered consistently incredible gameplay, it’s pushed the graphics envelope with each successive release, and where Modern Warfare is concerned, it’s also had a rich storyline.

The first two titles might have been built on the already-outdated Quake III engine, but since then, the games have shipped with improved graphical features capable of pushing the highest-end PCs out there. Modern Warfare 2 is the exception, as it’s more of a console port than a true PC title. As a result, the game doesn’t push PC hardware as much as we’d like to see, but it still looks great, and lacks little in the graphics department. You can read our review of the game here.

Manual Run-through: The level chosen is the 10th mission in the game, “The Gulag”. Our teams fly in helicopters up to an old prison with the intention of getting closer to finding the game’s villain, Vladimir Makarov. Our saved game file begins at the point when the level name comes on the screen, right before we reach the prison, and ends one minute after landing, following the normal progression of the level. The entire run takes around two-and-a-half minutes.

Throughout all of our results, it will be a common theme to see the HD 5670 place last, because from a specs standpoint, it’s without question the slowest card. That doesn’t make it a bad offering, though, as it’s also the least expensive. Given that the HD 5750 512MB retails for $130 at most e-tailers, both HD 5670s performed quite well. Not so surprisingly, the 1GB model pulled ahead just a wee bit.

Graphics Card                    | Best Playable                 | Min FPS | Avg. FPS
ATI HD 5770 1GB CrossFireX       | 2560×1600 – Max Detail, 4xAA  | 40      | 81.311
ATI HD 5870 1GB (Sapphire)       | 2560×1600 – Max Detail, 4xAA  | 46      | 79.838
ATI HD 5850 1GB (ASUS)           | 2560×1600 – Max Detail, 4xAA  | 37      | 68.563
NVIDIA GTX 285 1GB (EVGA)        | 2560×1600 – Max Detail, 4xAA  | 41      | 66.527
NVIDIA GTX 275 896MB (Reference) | 2560×1600 – Max Detail, 4xAA  | 37      | 61.937
NVIDIA GTX 260 896MB (XFX)       | 2560×1600 – Max Detail, 4xAA  | 33      | 53.314
ATI HD 5770 1GB (Vapor-X)        | 2560×1600 – Max Detail, 0xAA  | 38      | 61.907
ATI HD 5770 1GB (Reference)      | 2560×1600 – Max Detail, 0xAA  | 36      | 60.337
NVIDIA GTS 250 1GB (EVGA)        | 2560×1600 – Max Detail, 0xAA  | 30      | 53.253
ATI HD 5750 1GB (Sapphire)       | 2560×1600 – Max Detail, 0xAA  | 28      | 50.727
ATI HD 5670 1GB (Sapphire)       | 1920×1080 – Max Detail, 4xAA  | 28      | 46.291
ATI HD 5670 512MB (Reference)    | 1920×1080 – Max Detail, 4xAA  | 24      | 43.96

The performance from both cards proved limiting, but as long as you’re not ultra-fussy about the smoothest gameplay on the planet, running the game at 1080p with 4xAA should be just fine. If it isn’t, dropping AA will still deliver great-looking gameplay, along with a nice average FPS boost.

Call of Juarez: Bound in Blood

When the original Call of Juarez was released, it brought forth something unique… a western-styled first-person shooter. That’s simply not something we see too often, so for fans of the genre, its release was a real treat. Although it didn’t offer the best gameplay we’ve seen from a recent FPS title, its storyline and unique style made it well worth testing.

After we retired the original title from our suite, we anxiously awaited the sequel, Bound in Blood, in hopes that the series could be re-introduced into our testing. Thankfully, it could, thanks in part to its fantastic graphics, built around the Chrome Engine 4, and gameplay improved over the original. It was also well-received by game reviewers, which is always a good sign.

Manual Run-through: The level chosen here is Chapter I, and our starting point is about 15 minutes into the mission, where we stand atop a hill that overlooks a large river. We make our way across the hill and ultimately through a large trench, and we stop our benchmarking run shortly after we blow up a gas-filled barrel.

Unlike our Modern Warfare 2 test, we’re seeing no discernible difference between the 512MB and 1GB versions of the card; if anything, the 512MB version delivered higher minimum FPS on average. I’m sure this is just chance, as a 1GB card shouldn’t be slower in any regard. Aside from that, the performance is solid all-around.

Graphics Card                    | Best Playable           | Min FPS | Avg. FPS
ATI HD 5870 1GB (Reference)      | 2560×1600 – Max Detail  | 58      | 81.945
NVIDIA GTX 295 1792MB (Reference)| 2560×1600 – Max Detail  | 37      | 80.339
ATI HD 5850 1GB (ASUS)           | 2560×1600 – Max Detail  | 51      | 69.165
NVIDIA GTX 285 1GB (EVGA)        | 2560×1600 – Max Detail  | 45      | 54.428
NVIDIA GTX 275 896MB (Reference) | 2560×1600 – Max Detail  | 41      | 51.393
ATI HD 4890 1GB (Sapphire)       | 2560×1600 – Max Detail  | 36      | 51.334
ATI HD 4870 1GB (Reference)      | 2560×1600 – Max Detail  | 31      | 46.259
ATI HD 5770 1GB (Reference)      | 2560×1600 – Max Detail  | 28      | 45.028
ATI HD 5770 1GB (Vapor-X)        | 2560×1600 – Max Detail  | 30      | 44.98
NVIDIA GTX 260 896MB (XFX)       | 2560×1600 – Max Detail  | 35      | 44.023
ATI HD 5750 1GB (Sapphire)       | 2560×1600 – Max Detail  | 27      | 38.686
NVIDIA GTS 250 1GB (EVGA)        | 2560×1600 – Max Detail  | 25      | 33.751
ATI HD 5670 512MB (Reference)    | 1920×1080 – Max Detail  | 38      | 47.23
ATI HD 5670 1GB (Sapphire)       | 1920×1080 – Max Detail  | 35      | 47.14

This may be a $100 card, but it can still handle Bound in Blood at 1080p with maxed-out detail settings. Like so many PC games on the market, this one happens to look great and run great on a variety of configurations. With that said, let’s move on to a game that’s the complete opposite… Crysis Warhead.

Crysis Warhead

Like Call of Duty, Crysis is another series that doesn’t need much of an introduction. Thanks to the fact that almost any comments section for a PC performance-related article asks, “Can it run Crysis?”, even those who don’t play computer games no doubt know what Crysis is. When Crytek first released Far Cry, it delivered an incredible game engine with huge capabilities, and Crysis simply took things to the next level.

Although the sequel, Warhead, has been available for over a year, it still manages to push the highest-end systems to their breaking point. It wasn’t until this past January that we finally found a graphics solution to handle the game at 2560×1600 at its Enthusiast level, and even that was without AA! Something tells me Crysis will remain the de facto standard for GPU benchmarking for a while yet.

Manual Run-through: Whenever we have a new game in-hand for benchmarking, we make every attempt to explore each level of the game to find out which is the most brutal on our hardware. As it turns out, after spending hours exploring this game’s levels, we found the first level in the game, “Ambush”, to be the hardest on the GPU, so we stuck with it for our testing. Our run starts from the beginning of the level and stops shortly after we reach the first bridge.

These results aren’t too much of a surprise, given that Crysis is one of the most gluttonous games on the market. Even at 1680×1050, the performance was tar-like. Given that, what’s our best playable?

Graphics Card                    | Best Playable                     | Min FPS | Avg. FPS
NVIDIA GTX 295 1792MB (Reference)| 2560×1600 – Gamer, 0xAA           | 19      | 40.381
ATI HD 5870 1GB (Reference)      | 2560×1600 – Gamer, 0xAA           | 20      | 32.955
ATI HD 5850 1GB (ASUS)           | 2560×1600 – Mainstream, 0xAA      | 28      | 52.105
NVIDIA GTX 285 1GB (EVGA)        | 2560×1600 – Mainstream, 0xAA      | 27      | 50.073
NVIDIA GTX 275 896MB (Reference) | 2560×1600 – Mainstream, 0xAA      | 24      | 47.758
NVIDIA GTX 260 896MB (XFX)       | 2560×1600 – Mainstream, 0xAA      | 21      | 40.501
ATI HD 4890 1GB (Sapphire)       | 2560×1600 – Mainstream, 0xAA      | 19      | 39.096
ATI HD 4870 1GB (Reference)      | 2560×1600 – Mainstream, 0xAA      | 20      | 35.257
ATI HD 5770 1GB (Vapor-X)        | 2560×1600 – Mainstream, 0xAA      | 19      | 35.923
ATI HD 5770 1GB (Reference)      | 2560×1600 – Mainstream, 0xAA      | 20      | 35.256
NVIDIA GTS 250 1GB (EVGA)        | 2560×1600 – Mainstream, 0xAA      | 18      | 34.475
ATI HD 5750 1GB (Sapphire)       | 1920×1080 – Mainstream, 0xAA      | 21      | 47.545
ATI HD 5670 1GB (Sapphire)       | 1920×1080 – Mainstream, 0xAA      | 20      | 35.367
ATI HD 5670 512MB (Reference)    | 1920×1080 – Mainstream, 0xAA      | 20      | 35.103

Simply downgrading the Gamer profile to Mainstream made all the difference in the world, enabling us to game on both versions of the card at 1080p with 35 FPS on average. The 20 FPS minimum seems a tad low, but as you can see, the FPS in this title dips low on any card, and it’s really not all that noticeable during gameplay. Warhead is one of the few games where 20 FPS is actually playable, so anything toward 30 FPS and up is great.

F.E.A.R. 2: Project Origin

Five out of the seven current games we use for testing are either sequels, or titles in an established series. F.E.A.R. 2 is one of the former, following up on the very popular First Encounter Assault Recon, released in fall of 2005. This horror-based first-person shooter brought to the table fantastic graphics, ultra-smooth gameplay, the ability to blow massive chunks out of anything, and also a very fun multi-player mode.

Three-and-a-half years later, we saw the introduction of the game’s sequel, Project Origin. As we had hoped, this title improved on the original where gameplay and graphics were concerned, and it was a no-brainer to begin including it in our testing. The game is gorgeous, and there’s much destruction to be had (who doesn’t love blowing expensive vases to pieces?). The game is also rather heavily scripted, which aids in producing repeatable results in our benchmarking.

Manual Run-through: The level used for our testing here is the first in the game, about ten minutes in. The scene begins with a travel up an elevator, with a robust city landscape behind us. Our run-through begins with a quick look at this cityscape, and then we proceed through the level until the point when we reach the far door as seen in the above screenshot.

Continuing a theme, neither the 512MB nor the 1GB card proves to be the dominant model, with virtually no difference in performance whatsoever.

Graphics Card                    | Best Playable                         | Min FPS | Avg. FPS
NVIDIA GTX 295 1792MB (Reference)| 2560×1600 – Max Detail, 4xAA, 16xAF   | 45      | 95.767
ATI HD 5870 1GB (Reference)      | 2560×1600 – Max Detail, 4xAA, 16xAF   | 65      | 91.34
ATI HD 5850 1GB (ASUS)           | 2560×1600 – Max Detail, 4xAA, 16xAF   | 51      | 73.647
NVIDIA GTX 285 1GB (EVGA)        | 2560×1600 – Max Detail, 4xAA, 16xAF   | 39      | 62.014
NVIDIA GTX 275 896MB (Reference) | 2560×1600 – Max Detail, 4xAA, 16xAF   | 37      | 57.266
ATI HD 4890 1GB (Sapphire)       | 2560×1600 – Max Detail, 4xAA, 16xAF   | 38      | 56.726
ATI HD 4870 1GB (Reference)      | 2560×1600 – Max Detail, 4xAA, 16xAF   | 34      | 50.555
ATI HD 5770 1GB (Vapor-X)        | 2560×1600 – Max Detail, 4xAA, 16xAF   | 33      | 48.356
NVIDIA GTX 260 896MB (XFX)       | 2560×1600 – Max Detail, 4xAA, 16xAF   | 29      | 48.110
ATI HD 5770 1GB (Reference)      | 2560×1600 – Max Detail, 4xAA, 16xAF   | 31      | 47.411
ATI HD 5750 1GB (Sapphire)       | 2560×1600 – Max Detail, 0xAA, 16xAF   | 27      | 39.563
NVIDIA GTS 250 1GB (EVGA)        | 2560×1600 – Max Detail, 4xAA, 16xAF   | 24      | 36.331
ATI HD 5670 512MB (Reference)    | 1920×1080 – Max Detail, 4xAA, 16xAF   | 31      | 46.87
ATI HD 5670 1GB (Sapphire)       | 1920×1080 – Max Detail, 4xAA, 16xAF   | 31      | 46.433

As hoped, the game ran fine on the HD 5670 at 1080p, even with 4xAA intact.

Race Driver: GRID

If you primarily play games on a console, your choices for quality racing games are plenty. On the PC, that’s not so much the case. While there are a good number, there aren’t many of any given type, from sim to arcade. So when Race Driver: GRID first saw its release, many gamers were excited, and for good reason. It’s not a sim in the truest sense of the word, but it’s certainly not arcade, either. It’s somewhere in between.

The game happens to be great fun, though, and similar to console games like Project Gotham Racing, you need a lot of skill to succeed at the game’s default difficulty level. And like most great racing games, GRID happens to look absolutely stellar, and each of the game’s locations look very similar to their real-world counterparts. All in all, no racing fan should ignore this one.

Manual Run-through: For our testing here, we chose the city where both Snoop Dogg and Sublime hit their fame: the LBC, Long Beach, California. We chose this track because it’s not overly difficult, and also because it’s simply nice to look at. Our run consists of an entire 2-lap race, with the cars behind us for almost the entire race.

The HD 5670 continues to deliver about 30% less performance than the HD 5750, which in most cases tracks the price scaling fairly accurately.

Graphics Card                    | Best Playable                 | Min FPS | Avg. FPS
ATI HD 5870 1GB (Reference)      | 2560×1600 – Max Detail, 4xAA  | 87      | 106.43
NVIDIA GTX 295 1792MB (Reference)| 2560×1600 – Max Detail, 4xAA  | 84      | 103.958
ATI HD 5850 1GB (ASUS)           | 2560×1600 – Max Detail, 4xAA  | 68      | 84.732
ATI HD 4890 1GB (Sapphire)       | 2560×1600 – Max Detail, 4xAA  | 57      | 70.797
NVIDIA GTX 285 1GB (EVGA)        | 2560×1600 – Max Detail, 4xAA  | 54      | 66.042
NVIDIA GTX 275 896MB (Reference) | 2560×1600 – Max Detail, 4xAA  | 52      | 63.617
ATI HD 4870 1GB (Reference)      | 2560×1600 – Max Detail, 4xAA  | 51      | 63.412
ATI HD 5770 1GB (Reference)      | 2560×1600 – Max Detail, 4xAA  | 45      | 56.980
ATI HD 5770 1GB (Vapor-X)        | 2560×1600 – Max Detail, 4xAA  | 42      | 56.665
NVIDIA GTX 260 896MB (XFX)       | 2560×1600 – Max Detail, 4xAA  | 45      | 54.809
ATI HD 5750 1GB (Sapphire)       | 2560×1600 – Max Detail, 4xAA  | 39      | 47.05
NVIDIA GTS 250 1GB (EVGA)        | 2560×1600 – Max Detail, 4xAA  | 35      | 43.663
ATI HD 5670 1GB (Sapphire)       | 1920×1080 – Max Detail, 4xAA  | 39      | 47.679
ATI HD 5670 512MB (Reference)    | 1920×1080 – Max Detail, 4xAA  | 36      | 47.36

GRID is yet another game that runs just fine at max detail with a 1080p resolution and 4xAA.
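To make the ~30% scaling claim concrete, the gap arithmetic looks like this. The FPS pair below is hypothetical, chosen only to illustrate the figure; the dollar amounts are the street prices quoted in the intro:

```python
def percent_behind(slower, faster):
    """Percentage by which `slower` trails `faster`."""
    return (faster - slower) / faster * 100.0

# Hypothetical average-FPS pair illustrating a ~30% performance gap:
perf_gap = percent_behind(35.0, 50.0)     # → 30.0

# Street prices from the intro: HD 5670 at ~$100, HD 5750 512MB at ~$130.
price_gap = percent_behind(100.0, 130.0)  # → ~23.1
```

Roughly 30% less performance for roughly a quarter less money is the kind of price-to-performance scaling described above.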

World in Conflict: Soviet Assault

I admit that I’m not a huge fan of RTS titles, but World in Conflict intrigued me from the get-go. After all, so many war-based games continue to follow the same storylines we already know, and WiC was different. It sidesteps the real-world political and economic collapse of the Soviet Union in the late ’80s, and instead provides a storyline in which the USSR remains in power by going to war.

Many RTS games, with their advanced AI, tend to lean on the CPU to deliver smooth gameplay, but WiC stresses both the CPU and GPU, and the graphics prove it. Throughout the game’s missions, you’ll see gorgeous vistas and explore areas from deserts and snow-packed lands to fields and cities. Overall, it’s a real visual treat, especially since you’re able to zoom down to the ground and see the action up close.

Manual Run-through: The level we use for testing is the 7th campaign of the game, called Insurgents. Our saved game plants us towards the beginning of the mission with two squads of five, and two snipers. The run consists of bringing our men to action, and hovering the camera around throughout the duration. The entire run lasts between three and four minutes.

This game is a little misleading, as you wouldn’t expect it to be so strenuous on a GPU. But thanks to it having so much going on at once, it is, and neither of the two resolutions we ran delivered what I’d hope to be a playable FPS.

Graphics Card                    | Best Playable                         | Min FPS | Avg. FPS
NVIDIA GTX 295 1792MB (Reference)| 2560×1600 – Max Detail, 8xAA, 16xAF   | 40      | 55.819
ATI HD 5870 1GB (Reference)      | 2560×1600 – Max Detail, 4xAA, 16xAF   | 35      | 47.195
ATI HD 5850 1GB (ASUS)           | 2560×1600 – Max Detail, 4xAA, 16xAF   | 29      | 40.581
NVIDIA GTX 285 1GB (EVGA)        | 2560×1600 – Max Detail, 0xAA, 16xAF   | 34      | 49.514
NVIDIA GTX 275 896MB (Reference) | 2560×1600 – Max Detail, 0xAA, 16xAF   | 36      | 46.186
ATI HD 4890 1GB (Sapphire)       | 2560×1600 – Max Detail, 0xAA, 16xAF   | 31      | 46.175
ATI HD 4870 1GB (Reference)      | 2560×1600 – Max Detail, 0xAA, 16xAF   | 28      | 40.660
NVIDIA GTX 260 896MB (XFX)       | 2560×1600 – Max Detail, 0xAA, 16xAF   | 23      | 39.365
ATI HD 5770 1GB (Vapor-X)        | 2560×1600 – Max Detail, 0xAA, 16xAF   | 28      | 37.511
ATI HD 5770 1GB (Reference)      | 2560×1600 – Max Detail, 0xAA, 16xAF   | 28      | 37.389
NVIDIA GTS 250 1GB (EVGA)        | 2560×1600 – Max Detail, 0xAA, 4xAF    | 24      | 32.453
ATI HD 5750 1GB (Sapphire)       | 2560×1600 – Max Detail, 0xAA, 4xAF    | 23      | 31.769
ATI HD 5670 1GB (Sapphire)       | 1920×1080 – Max Detail, 0xAA, 16xAF   | 22      | 32.292
ATI HD 5670 512MB (Reference)    | 1920×1080 – Max Detail, 0xAA, 16xAF   | 21      | 31.872

Like in some of our previous games, anti-aliasing had to be dropped to raise the FPS to a level we’d consider playable. Even then, it wasn’t totally ideal, but I’m confident most gamers would be happy with a ~30 FPS average here rather than downgrade their resolution.

Futuremark 3DMark Vantage

Although we generally shun automated gaming benchmarks, we do like to run at least one to see how our GPUs scale when used in a ‘timedemo’-type scenario. Futuremark’s 3DMark Vantage is without question the best such test on the market, and it’s a joy to use, and watch. The folks at Futuremark are experts in what they do, and they really know how to push that hardware of yours to its limit.

The company first started out as MadOnion and released a GPU-benchmarking tool called XLR8R, which was soon replaced with 3DMark 99. Since that time, we’ve seen seven different versions of the software, including two major updates (3DMark 99 Max, 3DMark 2001 SE). With each new release, the graphics get better, the capabilities get better and the sudden hit of ambition to get down and dirty with overclocking comes at you fast.

Similar to a real game, 3DMark Vantage offers many configuration options, although many (including us) prefer to stick to the profiles which include Performance, High and Extreme. Depending on which one you choose, the graphic options are tweaked accordingly, as well as the resolution. As you’d expect, the better the profile, the more intensive the test.

Performance is the stock mode that most use when benchmarking, but it only uses a resolution of 1280×1024, which isn’t representative of today’s gamers. Extreme is more appropriate, as it runs at 1920×1200 and does well to push any single or multi-GPU configuration currently on the market – and will do so for some time to come.

3DMark Vantage is a highly scalable benchmark, taking full advantage of all available shader processors and GPU cores, along with copious amounts of memory. Given that, our results above scale each card fairly accurately against its real-world performance.

Power & Temperatures

To test our graphics cards for both temperatures and power consumption, we utilize OCCT for stress-testing, GPU-Z for temperature monitoring, and a Kill-A-Watt for power monitoring. The Kill-A-Watt is plugged into its own socket, with only the PC connected to it.

As per our benchmarking guidelines, once the room temperature is stable (and reasonable), the test machine is booted up and left to sit at the Windows desktop until things are completely idle. Once things are good to go, the idle wattage is noted, GPU-Z is started up to begin monitoring card temperatures, and OCCT is set up to begin stress-testing.

To push the cards we test to their absolute limit, we use OCCT in full-screen 2560×1600 mode, and allow it to run for 30 minutes, which includes a one minute lull at the start, and a three minute lull at the end. After about 10 minutes, we begin to monitor our Kill-a-Watt to record the max wattage.
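GPU-Z can log its sensor readings to a file, and pulling a peak load temperature out of such a log is simple. Here is a minimal sketch assuming a CSV-style log; the column name and sample readings are made up for illustration, not taken from our actual runs:

```python
import csv
import io

# A made-up, GPU-Z-style sensor log (real logs carry many more columns).
SAMPLE_LOG = """Date,GPU Temperature [C]
2010-01-15 10:00:01,42.0
2010-01-15 10:15:02,55.5
2010-01-15 10:30:03,59.0
"""

def max_temperature(log_text, column="GPU Temperature [C]"):
    """Return the highest reading found in the given column."""
    reader = csv.DictReader(io.StringIO(log_text))
    return max(float(row[column]) for row in reader)

peak = max_temperature(SAMPLE_LOG)  # → 59.0
```

In practice we simply read the maximum GPU-Z reports after the 30-minute OCCT run; the sketch just shows how a logged run could be summarized.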

The power consumption difference between the two cards is almost non-existent, but as I mentioned in the intro, the GPU cooler that Sapphire uses on its card is a big improvement over the reference, and the top graph shows just how much. There’s a truly staggering 25°C drop at full load, and an 11°C drop at idle. That’s probably the greatest decrease we’ve ever seen from an upgraded pre-installed GPU cooler, so that’s quite impressive.

Final Thoughts

AMD has pulled off some great things lately. It not only delivered the HD 5000 series, with its unparalleled performance and appealing feature-set, and not only the world’s first $100 DirectX 11 card, but by the looks of things, the company will complete its entire HD 5000 line-up across all price-points before NVIDIA gets out the door with its first Fermi card, which we’re hoping isn’t more than two months away.

Following the HD 5670’s release, AMD will launch the HD 5500 series (exact models unknown, but probably an HD 5570) and the HD 5450. The former will likely sell for around ~$70 and offer half the performance of the HD 5750, while the HD 5450 should see a ~$50 price-point and a quarter of the performance of the HD 5670. These are all personal estimations, but given the trend so far, I’d consider them fairly good assumptions.

AMD hasn’t announced an exact release date for these upcoming cards, but they will see the light of day sometime next month, and very early at that (second week at the latest, by the looks of things). Should you be excited? It’s really hard to say. Surely, both the HD 5500 and HD 5400 series will be far improved solutions over an IGP, and at their price-points, they might very well appeal to those looking to build an SFF PC or an HTPC. That could especially go for the HD 5450, as it will feature a passive design and an ultra-thrifty price-point.

Sapphire Radeon HD 5670 1GB

But what about the card on the table, the HD 5670? I mentioned in the intro that it was an odd beast, and I still feel that way. When AMD launched the HD 5750, it said that the 512MB version would sell for an SRP of $109. Today, that’s not the case; all e-tailers I’ve checked have mostly focused on the 1GB version, with the 512MB selling for around $130. The 1GB versions sell for about $10 – $20 more (which is a little insane, given that at that point, the HD 5770 doesn’t cost much more, but offers far improved performance).

If e-tailers stuck to the $109 price tag for the HD 5750 512MB, this would be a very difficult conclusion to write. The reasons are obvious throughout our results. The HD 5750’s performance is much improved over the HD 5670’s, yet its SRP is a mere $10 more. For that $10, there’d be no reason to go with the HD 5670, unless you simply aren’t much of a gamer and want a decent GPU to get by with. But at that point, it’d be wiser in my opinion to wait for the upcoming HD 5500/5400 series, since they will be less expensive and probably still live up to your expectations.

As the situation stands today, though, the HD 5750 512MB isn’t $109 but $130, making the HD 5670 $30 cheaper. Throughout most of our performance results, we’ve seen performance scaling roughly equal to price scaling, which is good to see. It means that with this GPU, you’re really getting the right bang for your buck, so at $100, the HD 5670 can be seen as a fantastic GPU solution. It allowed us to play all of our games at 1080p, which in itself is great, in addition to its other perks.

What perks? How about the superb power consumption and temperatures? Even with the reference cooler, the card idled at 42°C, lower than many of the other cards on our list. Then there’s the power consumption, which currently has no rival (until we test NVIDIA’s budget cards, at least). If you want the ultimate cool-running card, Sapphire’s offering looks quite appealing, as it capped at 59°C during the ultra-stressful OCCT test. What’s that mean? This is a fantastic HTPC card.

Of course, Sapphire’s 1GB card doesn’t cost $99.99, but rather $120 at one e-tailer I checked. Given that, it’s hard to outright recommend this particular memory capacity, since we saw next to nothing in performance gains (the lone exception being Modern Warfare 2). Fortunately, Sapphire’s 512MB version includes the exact same cooler, and is selling for $99.99. For its price-point, the HD 5670 is a solid card as is, but with its effective cooler, Sapphire’s card easily earns our Editor’s Choice.

Sapphire Radeon HD 5670 512MB
Sapphire’s Radeon HD 5670 512MB

Discuss this article in our forums!

Have a comment you wish to make on this article? Recommendations? Criticism? Feel free to head over to our related thread and put your words to our virtual paper! There is no requirement to register in order to respond to these threads, but it sure doesn’t hurt!

Copyright © 2005-2019 Techgage Networks Inc. - All Rights Reserved.