
Sapphire Radeon HD 5750

Date: November 23, 2009
Author(s): Rob Williams

For a graphics card that retails for a suggested $130, the Radeon HD 5750 sure packs in a lot of features. In addition to its solid performance and superb power consumption, the card supports multiple monitor outputs, DirectX 11, Eyefinity and more. To top it all off, Sapphire includes a voucher for a free copy of Dirt 2 right in the box.



Introduction

Over the course of the past two months, AMD has launched five different models in its HD 5000 series. The company kicked things off with its higher-end HD 5850 and HD 5870 cards, which, simply put, proved that the roaring success of the HD 4000 series was no fluke. Just last week, AMD ushered in its first dual-GPU card of the HD 5000 series, the HD 5970, and it’s mind-bogglingly fast, especially when compared to NVIDIA’s current offerings.

While high-end cards are all fine and good for those who need that kind of power, there exists an even stronger market for lower-end components, and that’s where the HD 5770 and HD 5750 come into play. I took a look at the former about a month ago, and was impressed with the overall value. No matter how you looked at it, the card offered fantastic performance, lower power consumption (and lower temperatures as a result), along with such perks as DirectX 11. There wasn’t a single aspect not to like – well, except for the overclocking potential.

With this article, we’re rounding out our coverage of AMD’s latest single-GPU cards; a review of the dual-GPU HD 5970 will be posted within the next couple of weeks. As mentioned before, the HD 5750 launched alongside the HD 5770, and both target those who want acceptable gaming performance without breaking their budget. At the suggested retail price of $129 ($109 for the 512MB version), it’s undeniably affordable, but do better options exist?

Closer Look

Like the other members of the HD 5000 series, the HD 5750 is built upon a 40nm process, which is a nice improvement over the 55nm process of the previous generation. As a result of this shrink, and other architecture enhancements, AMD has seemingly mastered the art of building graphics cards that are much more power efficient, and we’ve seen this with each HD 5000 series card we’ve taken a look at up to this point.

In addition, the HD 5750, though a “budget” offering, lacks nothing of what makes the HD 5000 series so great. That means there’s full support for DirectX 11, Eyefinity (multi-monitor), multiple video outputs and so forth. From a features standpoint, it’s all here. What’s lacking, of course, is the incredible performance of the larger cards. Both the HD 5750 and HD 5770 have had their memory bus downgraded to 128-bit, and at the same time, they lose a substantial number of shader cores (exactly half of the respective HD 58x0 model).

Model           Core MHz    Mem MHz     Memory         Bus Width   Processors
Radeon HD 5970  725         1000        2048MB         256-bit     1600 x 2
Radeon HD 5870  850         1200        1024MB         256-bit     1600
Radeon HD 5850  725         1000        1024MB         256-bit     1440
Radeon HD 5770  850         1200        1024MB         128-bit     800
Radeon HD 5750  700         1150        512 – 1024MB   128-bit     720
Radeon HD 4890  850 – 900   975         1024MB         256-bit     800
Radeon HD 4870  750         900         512 – 2048MB   256-bit     800
Radeon HD 4850  625         993         512 – 1024MB   256-bit     800
Radeon HD 4770  750         800         512MB          128-bit     640
Radeon HD 4670  750         900 – 1100  512 – 1024MB   128-bit     320
Radeon HD 4650  600         400 – 500   512 – 1024MB   128-bit     320
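That 128-bit bus matters more than the clocks alone suggest. As a rough back-of-the-envelope sketch – assuming the listed “Mem MHz” figures are GDDR5 base clocks with a 4x effective data rate, which is our assumption and not something the table states – peak memory bandwidth can be estimated like so:

```python
# Rough peak-bandwidth estimate from the spec table above.
# Assumption: "Mem MHz" is the GDDR5 base clock, transferring
# data at 4x that rate (vendors quote memory clocks differently,
# so treat these as illustrative numbers only).

def bandwidth_gb_s(mem_mhz: int, bus_bits: int, data_rate: int = 4) -> float:
    """Peak memory bandwidth in GB/s."""
    bytes_per_transfer = bus_bits / 8            # bus width in bytes
    transfers_per_sec = mem_mhz * 1e6 * data_rate
    return transfers_per_sec * bytes_per_transfer / 1e9

print(bandwidth_gb_s(1150, 128))  # HD 5750 (128-bit): 73.6 GB/s
print(bandwidth_gb_s(1000, 256))  # HD 5850 (256-bit): 128.0 GB/s
```

Under those assumptions, the HD 5750’s narrower bus gives it a little more than half the HD 5850’s peak bandwidth, which goes some way toward explaining the gap at high resolutions.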

At $109 for the 512MB version, and $129 for the 1GB version, the HD 5750 is without a doubt a card designed for those who want performance far beyond what an integrated chip could offer, but don’t want to pay an arm and a leg for it. The HD 5750 is capable of delivering on all fronts in that regard. As you can see below, Sapphire changes things up from the reference design just a wee bit. The board itself is identical, but the cooler is a little more robust, with a larger heatsink base.

One common feature you can expect to see across almost all of the HD 5000 series cards is the set of connections found at the back: 2x DVI, 1x DisplayPort and 1x HDMI. The only thing lacking is VGA (understandably so), but if you are stuck with that ancient option, an adapter is included in the box.

The HD 5750, regardless of which vendor produces it, is simple in almost all regards. It has a simple cooler, and a single PCI-E power connector. Still, it’s a card of this generation, which means it boasts some great things, all of which we covered above. Sapphire’s card, like many others, includes a voucher for a free copy of Dirt 2 in the box. So, if that’s a title you were planning to pick up anyway, the GPU will cost less in the end (if you want to look at it that way).

Test System & Methodology

At Techgage, we strive to make sure our results are as accurate as possible. Our testing is rigorous and time-consuming, but we feel the effort is worth it. In an attempt to leave no question unanswered, this page contains not only our testbed specifications, but also a fully-detailed look at how we conduct our testing. For an exhaustive look at our methodologies, even down to the Windows Vista installation, please refer to this article.

Test Machine

The table below lists our testing machine’s hardware, which remains unchanged throughout all GPU testing, minus the graphics card. Each card used for comparison is also listed here, along with the driver version used.

Component        Model
Processor        Intel Core i7-975 Extreme Edition – Quad-Core, 3.33GHz, 1.33V
Motherboard      Gigabyte GA-EX58-EXTREME – X58-based, F7 BIOS (05/11/09)
Memory           Corsair DOMINATOR – DDR3-1333 7-7-7-24-1T, 1.60V
ATI Graphics     Radeon HD 5870 1GB (Sapphire) – Catalyst 9.10
                 Radeon HD 5850 1GB (ASUS) – Catalyst 9.10
                 Radeon HD 5770 1GB (Reference) – Beta Catalyst (10/06/09)
                 Radeon HD 5750 1GB (Reference) – Catalyst 9.11
                 Radeon HD 4890 1GB (Sapphire) – Catalyst 9.8
                 Radeon HD 4870 1GB (Reference) – Catalyst 9.8
NVIDIA Graphics  GeForce GTX 295 1792MB (Reference) – GeForce 186.18
                 GeForce GTX 285 1GB (EVGA) – GeForce 186.18
                 GeForce GTX 275 896MB (Reference) – GeForce 186.18
                 GeForce GTX 260 896MB (XFX) – GeForce 186.18
                 GeForce GTS 250 1GB (EVGA) – GeForce 186.18
Audio            On-Board Audio
Storage          Seagate Barracuda 500GB 7200.11
Power Supply     Corsair HX1000W
Chassis          SilverStone TJ10 Full-Tower
Display          Gateway XHD3000 30″
Cooling          Thermalright TRUE Black 120
Et cetera        Windows Vista Ultimate 64-bit

When preparing our testbeds for any type of performance testing, we follow these guidelines:

To aid with the goal of keeping accurate and repeatable results, we prevent certain services in Windows Vista from starting at boot. These services have a tendency to start up in the background without notice, potentially skewing results. Disabling “Windows Search”, for example, turns off the OS’ indexing, which can at times utilize the hard drive and memory more than we’d like.

For more robust information on how we tweak Windows, please refer once again to this article.

Game Titles

We benchmark all of our games at three popular resolutions: 1680×1050, 1920×1080 and 2560×1600. 1680×1050 was chosen as it’s one of the most common resolutions for gamers sporting ~20″ displays. 1920×1080 might stand out, since we’ve always used 1920×1200 in the past, but we didn’t make this change without some serious thought. After taking a look at the current landscape for ~24″ desktop monitors, we noticed that 1920×1200 is definitely on the way out, as more and more models are shipping with native 1080p panels, and that’s why we made the switch. Finally, for high-end gamers, we also benchmark at 2560×1600, a resolution with just about twice the pixels of 1080p.
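For a quick sanity check on how these three resolutions compare, the pixel counts can be worked out directly (a trivial sketch using the resolutions listed above):

```python
# Pixel counts for our three test resolutions.
resolutions = [(1680, 1050), (1920, 1080), (2560, 1600)]

for w, h in resolutions:
    print(f"{w}x{h}: {w * h:,} pixels")

# 2560x1600 vs. 1080p: just about double the pixels.
ratio = (2560 * 1600) / (1920 * 1080)
print(f"Ratio vs. 1080p: {ratio:.2f}x")  # ~1.98x
```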

For graphics cards that include less than 1GB of GDDR, we omit Grand Theft Auto IV from our testing, as our chosen detail settings require at least 800MB of available graphics memory. Also, if the card we’re benchmarking doesn’t offer the performance to handle 2560×1600 across most of our titles reliably, only 1680×1050 and 1920×1080 will be utilized.

Because we value results generated by real-world testing, we don’t utilize timedemos. The one exception is Futuremark’s 3DMark Vantage. Though it’s not a game, it essentially acts as a robust timedemo. We use it because it’s a standard where GPU reviews are concerned, and we don’t want to deprive our readers of results they expect to see.

All of our results are captured with the help of Beepa’s FRAPS 2.98, while stress-testing and temperature-monitoring are handled by OCCT 3.1.0 and GPU-Z, respectively.
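To give an idea of how a run gets reduced to the minimum and average FPS figures quoted throughout this review, here’s a small sketch. This is not FRAPS’ actual log format, just an illustration on synthetic frame timestamps:

```python
from collections import Counter

# Derive (minimum frames rendered in any whole second, average FPS)
# from a list of frame timestamps in milliseconds -- similar in
# spirit to the numbers FRAPS reports. The data below is synthetic.

def fps_stats(timestamps_ms):
    """Return (lowest per-second frame count, rounded average FPS)."""
    duration_s = (timestamps_ms[-1] - timestamps_ms[0]) / 1000.0
    avg_fps = (len(timestamps_ms) - 1) / duration_s
    # Count how many frames fall into each whole second of the run.
    per_second = Counter(int(t // 1000) for t in timestamps_ms[:-1])
    return min(per_second.values()), round(avg_fps, 1)

# Ten seconds of a steady 60 FPS run.
frames = [i * (1000 / 60) for i in range(601)]
print(fps_stats(frames))  # (60, 60.0)
```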

Call of Duty: Modern Warfare 2

Call of Juarez: Bound in Blood

Crysis Warhead

F.E.A.R. 2: Project Origin

Grand Theft Auto IV

Race Driver: GRID

World in Conflict: Soviet Assault

Call of Duty: Modern Warfare 2

When the original Call of Duty game launched in 2003, Infinity Ward was an unknown. Naturally… it was the company’s first title. But since then, the series and company alike have become household names. Not only has the series delivered consistently incredible gameplay, it’s pushed the graphics envelope with each successive release, and where Modern Warfare is concerned, it’s also had a rich storyline.

The first two titles might have been built on the already-outdated Quake III engine, but the games since have shipped with ever-improving graphical features, capable of pushing the highest-end PCs out there. Modern Warfare 2 is the first exception to that trend, as it’s more of a console port than a true PC title. Therefore, the game doesn’t push PC hardware as much as we’d like to see, but despite that, it still looks great, and lacks little in the graphics department. You can read our review of the game here.

Manual Run-through: The level chosen is the 10th mission in the game, “The Gulag”. Our teams fly in helicopters up to an old prison with the intention of getting closer to finding the game’s villain, Vladimir Makarov. Our saved game file begins us at the point when the level name comes on the screen, right before we reach the prison, and it ends one minute after landing, following the normal progression of the level. The entire run takes around two-and-a-half minutes.

From a technical standpoint, the HD 5750 should be on par with NVIDIA’s GeForce GTS 250, but it fell behind in our test here. I admit I was a bit surprised by this, but repeated testing delivered identical results. That doesn’t mean the HD 5750 is a chump, because it still delivered over 60 FPS at 1920×1080 with 4xAA, which means it has what it takes to drive the settings most gamers out there actually play at.

Graphics Card                      Best Playable                  Min FPS   Avg. FPS
ATI HD 5870 1GB (Sapphire)         2560×1600 – Max Detail, 4xAA   46        79.838
ATI HD 5850 1GB (ASUS)             2560×1600 – Max Detail, 4xAA   37        68.563
NVIDIA GTX 285 1GB (EVGA)          2560×1600 – Max Detail, 4xAA   41        66.527
NVIDIA GTX 275 896MB (Reference)   2560×1600 – Max Detail, 4xAA   37        61.937
NVIDIA GTX 260 896MB (XFX)         2560×1600 – Max Detail, 4xAA   33        53.314
ATI HD 5770 1GB (Reference)        2560×1600 – Max Detail, 0xAA   36        60.337
NVIDIA GTS 250 1GB (EVGA)          2560×1600 – Max Detail, 0xAA   30        53.253
ATI HD 5750 1GB (Sapphire)         2560×1600 – Max Detail, 0xAA   28        50.727

With the previous Call of Duty we regularly tested with, World at War, I found that anything between 30 – 40 FPS was deemed playable. Things change a bit with Modern Warfare 2, however, as even 45 FPS can deliver sticky frames and less-than-ideal gameplay. But as soon as 50 FPS is hit, the game runs without a hitch. As a result, we had to disable anti-aliasing on our HD 5750, which boosted its performance to almost exactly that threshold.
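The “best playable” hunt boils down to a simple rule: start at the top settings and step down until the average frame rate clears the game’s playability threshold. A sketch of that logic follows – the settings list, thresholds and FPS numbers here are illustrative, not our actual tooling:

```python
# Sketch of the "best playable" search: walk an ordered list of
# setting combinations (highest first) and return the first one
# whose measured average FPS clears a per-game threshold.

SETTINGS = [
    "2560x1600 - Max Detail, 4xAA",
    "2560x1600 - Max Detail, 0xAA",
    "1920x1080 - Max Detail, 4xAA",
]

def best_playable(results, threshold):
    """results maps a settings string to its measured average FPS."""
    for setting in SETTINGS:
        if results.get(setting, 0) >= threshold:
            return setting
    return "unplayable at tested settings"

# Hypothetical HD 5750 numbers; MW2 needs ~50 FPS to feel smooth.
hd5750 = {
    "2560x1600 - Max Detail, 4xAA": 44.0,
    "2560x1600 - Max Detail, 0xAA": 50.7,
}
print(best_playable(hd5750, threshold=50))  # 2560x1600 - Max Detail, 0xAA
```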

Call of Juarez: Bound in Blood

When the original Call of Juarez was released, it brought forth something unique… a western-styled first-person shooter. That’s simply not something we see too often, so for fans of the genre, its release was a real treat. Although it didn’t really offer the best gameplay we’ve seen from a recent FPS title, its storyline and unique style made it well-worth testing.

After we retired the original title from our suite, we anxiously awaited the sequel, Bound in Blood, in hopes that the series could be re-introduced into our testing. Thankfully, it could, thanks in part to its fantastic graphics, built around the Chrome Engine 4, and gameplay improved over the original’s. It was also well-received by game reviewers, which is always a good sign.

Manual Run-through: The level chosen here is Chapter I, and our starting point is about 15 minutes into the mission, where we stand atop a hill that overlooks a large river. We make our way across the hill and ultimately through a large trench, and we stop our benchmarking run shortly after we blow up a gas-filled barrel.

The situation that we discovered on the previous page reversed itself here, with the HD 5750 surpassing the performance of the GTS 250 by a rather fair margin. Once again, at our 1080p resolution, the card pushed over 60 FPS. Considering that Bound in Blood isn’t an old game, and looks great, seeing this kind of performance on a $130 card is impressive.

Graphics Card                      Best Playable            Min FPS   Avg. FPS
ATI HD 5870 1GB (Reference)        2560×1600 – Max Detail   58        81.945
NVIDIA GTX 295 1792MB (Reference)  2560×1600 – Max Detail   37        80.339
ATI HD 5850 1GB (ASUS)             2560×1600 – Max Detail   51        69.165
NVIDIA GTX 285 1GB (EVGA)          2560×1600 – Max Detail   45        54.428
NVIDIA GTX 275 896MB (Reference)   2560×1600 – Max Detail   41        51.393
ATI HD 4890 1GB (Sapphire)         2560×1600 – Max Detail   36        51.334
ATI HD 4870 1GB (Reference)        2560×1600 – Max Detail   31        46.259
ATI HD 5770 1GB (Reference)        2560×1600 – Max Detail   28        45.028
NVIDIA GTX 260 896MB (XFX)         2560×1600 – Max Detail   35        44.023
ATI HD 5750 1GB (Sapphire)         2560×1600 – Max Detail   27        38.686
NVIDIA GTS 250 1GB (EVGA)          2560×1600 – Max Detail   25        33.751

Common sense will tell you that the more frames your computer can render per second, the smoother the gameplay. From that perspective, ~40 FPS doesn’t seem that impressive, and in our previously tested game, it wouldn’t be entirely acceptable. Bound in Blood is a little different, though, and anything above 30 FPS will prove playable for most people. The game isn’t at its most fluid at that frame rate, but it remains enjoyable. If worst comes to worst, all it takes is disabling the anti-aliasing to speed things up.

Crysis Warhead

Like Call of Duty, Crysis is another series that doesn’t need much of an introduction. Thanks to the fact that almost any comments section for a PC performance-related article asks, “Can it run Crysis?”, even those who don’t play computer games no doubt know what Crysis is. When Crytek first released Far Cry, it delivered an incredible game engine with huge capabilities, and Crysis simply took things to the next level.

Although the sequel, Warhead, has been available for just about a year, it still manages to push the highest-end systems to their breaking point. It wasn’t until this past January that we finally found a graphics solution able to handle the game at 2560×1600 at its Enthusiast level, and even that was without AA! Something tells me Crysis will remain the de facto standard for GPU benchmarking for a while yet.

Manual Run-through: Whenever we have a new game in-hand for benchmarking, we make every attempt to explore each level of the game to find out which is the most brutal towards our hardware. Ironically, after spending hours exploring this game’s levels, we found the first level in the game, “Ambush”, to be the hardest on the GPU, so we stuck with it for our testing. Our run starts from the beginning of the level and stops shortly after we reach the first bridge.

Here’s where our results get a bit strange. Because Crysis usually tends to favor ATI cards just a wee bit, I expected to see the HD 5750 surpass the GTS 250 here, but it didn’t. These results confused me so much that I looked around the web, and sure enough, in every other set of results I found, the HD 5750 beat out the GTS 250. I can’t explain the oddity, and given that our 3DMark Vantage scores lined up with what we expected to see, it’s hard to understand.

Graphics Card                      Best Playable                  Min FPS   Avg. FPS
NVIDIA GTX 295 1792MB (Reference)  2560×1600 – Gamer, 0xAA        19        40.381
ATI HD 5870 1GB (Reference)        2560×1600 – Gamer, 0xAA        20        32.955
ATI HD 5850 1GB (ASUS)             2560×1600 – Mainstream, 0xAA   28        52.105
NVIDIA GTX 285 1GB (EVGA)          2560×1600 – Mainstream, 0xAA   27        50.073
NVIDIA GTX 275 896MB (Reference)   2560×1600 – Mainstream, 0xAA   24        47.758
NVIDIA GTX 260 896MB (XFX)         2560×1600 – Mainstream, 0xAA   21        40.501
ATI HD 4890 1GB (Sapphire)         2560×1600 – Mainstream, 0xAA   19        39.096
ATI HD 4870 1GB (Reference)        2560×1600 – Mainstream, 0xAA   20        35.257
ATI HD 5770 1GB (Reference)        2560×1600 – Mainstream, 0xAA   20        35.256
NVIDIA GTS 250 1GB (EVGA)          2560×1600 – Mainstream, 0xAA   18        34.475
ATI HD 5750 1GB (Sapphire)         1920×1080 – Mainstream, 0xAA   21        47.545

If the HD 5750 lived up to my performance expectations, it would have been able to handle the game at 2560×1600 with no issue, but since it couldn’t, it fell behind the GTS 250 once again, and had to be pushed back to 1920×1080 with the Mainstream profile.

F.E.A.R. 2: Project Origin

Five out of the seven current games we use for testing are either sequels, or titles in an established series. F.E.A.R. 2 is one of the former, following up on the very popular First Encounter Assault Recon, released in fall of 2005. This horror-based first-person shooter brought to the table fantastic graphics, ultra-smooth gameplay, the ability to blow massive chunks out of anything, and also a very fun multi-player mode.

Three-and-a-half years later, we saw the introduction of the game’s sequel, Project Origin. As we had hoped, this title improved on the original where gameplay and graphics were concerned, and it was a no-brainer to begin including it in our testing. The game is gorgeous, and there’s much destruction to be had (who doesn’t love blowing expensive vases to pieces?). The game is also rather heavily scripted, which aids in producing repeatable results in our benchmarking.

Manual Run-through: The level used for our testing here is the first in the game, about ten minutes in. The scene begins with a travel up an elevator, with a robust city landscape behind us. Our run-through begins with a quick look at this cityscape, and then we proceed through the level until the point when we reach the far door as seen in the above screenshot.

Things were beginning to look a bit rough for the HD 5750 after our Crysis Warhead test, but F.E.A.R. 2 puts the card back on the road to recovery. Here, the card is consistently faster than the GTS 250. This again isn’t a major surprise, since like Crysis Warhead, F.E.A.R. 2 usually favors ATI cards.

Graphics Card                      Best Playable                         Min FPS   Avg. FPS
NVIDIA GTX 295 1792MB (Reference)  2560×1600 – Max Detail, 4xAA, 16xAF   45        95.767
ATI HD 5870 1GB (Reference)        2560×1600 – Max Detail, 4xAA, 16xAF   65        91.34
ATI HD 5850 1GB (ASUS)             2560×1600 – Max Detail, 4xAA, 16xAF   51        73.647
NVIDIA GTX 285 1GB (EVGA)          2560×1600 – Max Detail, 4xAA, 16xAF   39        62.014
NVIDIA GTX 275 896MB (Reference)   2560×1600 – Max Detail, 4xAA, 16xAF   37        57.266
ATI HD 4890 1GB (Sapphire)         2560×1600 – Max Detail, 4xAA, 16xAF   38        56.726
ATI HD 4870 1GB (Reference)        2560×1600 – Max Detail, 4xAA, 16xAF   34        50.555
NVIDIA GTX 260 896MB (XFX)         2560×1600 – Max Detail, 4xAA, 16xAF   29        48.110
ATI HD 5770 1GB (Reference)        2560×1600 – Max Detail, 4xAA, 16xAF   31        47.411
ATI HD 5750 1GB (Sapphire)         2560×1600 – Max Detail, 0xAA, 16xAF   27        39.563
NVIDIA GTS 250 1GB (EVGA)          2560×1600 – Max Detail, 4xAA, 16xAF   24        36.331

F.E.A.R. 2 might be a leading PC title where graphics are concerned, but it doesn’t require an ultra-powerful machine for a gamer to experience it at high-end settings. Thanks to this, all of our cards, ranging from $130 up to $500 are able to handle the game at the max detail settings. Better cards will make the game more fluid, of course, but none of these throttle gameplay at those settings.

Grand Theft Auto IV

If you look up the definition of “controversy”, Grand Theft Auto should be listed. If it’s not, then that should be a crime, because throughout GTA’s many titles, there’s been more of it than you can shake a stick at. At the series’ beginning, the games were rather simple, and didn’t stir up too much passion among critics. But once GTA III and its successors came along, the developers embraced all the controversy that came their way, and why not? It helped spur incredible sales numbers.

Grand Theft Auto IV is yet another continuation of the series, though it follows no storyline from the previous titles. Liberty City, loosely based on New York City, is absolutely huge, with much to explore. So much so, in fact, that you could literally spend hours just wandering around, ignoring the game’s missions, if you wanted to. It also happens to be incredibly stressful on today’s computer hardware, similar to Crysis.

Manual Run-through: After the first minor mission in the game, you reach an apartment. Our benchmarking run starts from within this room. From here, we run out the door, down the stairs and into an awaiting car. We then follow a specific path through the city, driving for about three minutes total.

It’s déjà vu here, with the HD 5750 performing worse than the GTS 250. Again, it’s somewhat surprising, since both cards are technically on par, or at least extremely close, with one another.

Graphics Card                      Best Playable                    Min FPS   Avg. FPS
NVIDIA GTX 295 1792MB (Reference)  2560×1600 – H/H/VH/H/VH Detail   27        52.590
ATI HD 5870 1GB (Reference)        2560×1600 – H/H/VH/H/VH Detail   29        45.767
NVIDIA GTX 260 896MB (GBT SOC)     2560×1600 – High Detail          30        46.122
NVIDIA GTX 285 1GB (EVGA)          2560×1600 – High Detail          32        45.573
NVIDIA GTX 275 896MB (Reference)   2560×1600 – High Detail          30        44.703
NVIDIA GTX 260 896MB (XFX)         2560×1600 – High Detail          24        38.492
ATI HD 5850 1GB (ASUS)             1920×1080 – High Detail          27        42.102
ATI HD 4890 1GB (Sapphire)         1920×1080 – High Detail          32        50.300
NVIDIA GTS 250 1GB (EVGA)          1920×1080 – High Detail          34        49.443
ATI HD 4870 1GB (Reference)        1920×1080 – High Detail          33        48.738
ATI HD 5770 1GB (Reference)        1920×1080 – High Detail          33        47.719
ATI HD 5750 1GB (Sapphire)         1920×1080 – High Detail          27        39.904

Similar to its bigger brother, the HD 5770, the resolution had to be dropped to 1920×1080 in order to enjoy the game to its fullest.

Race Driver: GRID

If you primarily play games on a console, your choices for quality racing games are plentiful. On the PC, that’s not so much the case. While there are a good number, there aren’t many of any given type, from sim to arcade. So when Race Driver: GRID first saw its release, many gamers were excited, and for good reason. It’s not a sim in the truest sense of the word, but it’s certainly not arcade, either. It’s somewhere in between.

The game happens to be great fun, and similar to console titles like Project Gotham Racing, you need a lot of skill to succeed at the game’s default difficulty level. And like most great racing games, GRID happens to look absolutely stellar, with each of the game’s locations looking very similar to its real-world counterpart. All in all, no racing fan should ignore this one.

Manual Run-through: For our testing here, we choose the city where both Snoop Dogg and Sublime hit their fame, the LBC, also known as Long Beach City. We choose this level because it’s not overly difficult, and also because it’s simply nice to look at. Our run consists of an entire 2-lap race, with the cars behind us for almost the entire race.

Like our Crysis Warhead results, I find the results above a bit strange. ATI excels in GRID, so to see the HD 5750 fall behind the GTS 250 is bizarre. Things changed at 2560×1600, which only makes the issue stranger.

Graphics Card                      Best Playable                  Min FPS   Avg. FPS
ATI HD 5870 1GB (Reference)        2560×1600 – Max Detail, 4xAA   87        106.43
NVIDIA GTX 295 1792MB (Reference)  2560×1600 – Max Detail, 4xAA   84        103.958
ATI HD 5850 1GB (ASUS)             2560×1600 – Max Detail, 4xAA   68        84.732
ATI HD 4890 1GB (Sapphire)         2560×1600 – Max Detail, 4xAA   57        70.797
NVIDIA GTX 285 1GB (EVGA)          2560×1600 – Max Detail, 4xAA   54        66.042
NVIDIA GTX 275 896MB (Reference)   2560×1600 – Max Detail, 4xAA   52        63.617
ATI HD 4870 1GB (Reference)        2560×1600 – Max Detail, 4xAA   51        63.412
ATI HD 5770 1GB (Reference)        2560×1600 – Max Detail, 4xAA   45        56.980
NVIDIA GTX 260 896MB (XFX)         2560×1600 – Max Detail, 4xAA   45        54.809
ATI HD 5750 1GB (Sapphire)         2560×1600 – Max Detail, 4xAA   39        47.05
NVIDIA GTS 250 1GB (EVGA)          2560×1600 – Max Detail, 4xAA   35        43.663

The GTS 250 might surpass the HD 5750 in GRID at lower resolutions, but both come out to about the same in the “best playable” arena, with each being just fine at the game’s top settings with 4xAA.

World in Conflict: Soviet Assault

I admit that I’m not a huge fan of RTS titles, but World in Conflict intrigued me from the get-go. After all, so many war-based games continue to follow the same storylines we already know, and WiC was different. It rewrites the political and economic collapse of the Soviet Union in the late ’80s, providing a storyline that plays out as if the USSR had gone to war in order to remain in power.

Many RTS games, with their advanced AI, tend to lean on the CPU in order to deliver smooth gameplay, but WiC stresses both the CPU and GPU, and the graphics prove it. Throughout the game’s missions, you’ll see gorgeous vistas and explore areas from deserts and snow-packed lands to fields and cities. Overall, it’s a real treat for the eyes – especially since you’re able to zoom down to the ground and see the action up close.

Manual Run-through: The level we use for testing is the 7th campaign of the game, called Insurgents. Our saved game plants us towards the beginning of the mission with two squads of five, and two snipers. The run consists of bringing our men to action, and hovering the camera around throughout the duration. The entire run lasts between three and four minutes.

Here we have a repeat of the results seen on the previous page, with the HD 5750 falling short in our lower-resolution tests, but coming out a smidgen ahead at 2560×1600.

Graphics Card                      Best Playable                         Min FPS   Avg. FPS
NVIDIA GTX 295 1792MB (Reference)  2560×1600 – Max Detail, 8xAA, 16xAF   40        55.819
ATI HD 5870 1GB (Reference)        2560×1600 – Max Detail, 4xAA, 16xAF   35        47.195
ATI HD 5850 1GB (ASUS)             2560×1600 – Max Detail, 4xAA, 16xAF   29        40.581
NVIDIA GTX 285 1GB (EVGA)          2560×1600 – Max Detail, 0xAA, 16xAF   34        49.514
NVIDIA GTX 275 896MB (Reference)   2560×1600 – Max Detail, 0xAA, 16xAF   36        46.186
ATI HD 4890 1GB (Sapphire)         2560×1600 – Max Detail, 0xAA, 16xAF   31        46.175
ATI HD 4870 1GB (Reference)        2560×1600 – Max Detail, 0xAA, 16xAF   28        40.660
NVIDIA GTX 260 896MB (XFX)         2560×1600 – Max Detail, 0xAA, 16xAF   23        39.365
ATI HD 5770 1GB (Reference)        2560×1600 – Max Detail, 0xAA, 16xAF   28        37.389
NVIDIA GTS 250 1GB (EVGA)          2560×1600 – Max Detail, 0xAA, 4xAF    24        32.453
ATI HD 5750 1GB (Sapphire)         2560×1600 – Max Detail, 0xAA, 4xAF    23        31.769

As with the majority of cards here, the 2560×1600 resolution could be retained, but the anti-aliasing could not. In the end, ~30 FPS isn’t an ideal frame rate, but it’s sufficient to run the game at such a high resolution.

Futuremark 3DMark Vantage

Although we generally shun automated gaming benchmarks, we do like to run at least one to see how our GPUs scale when used in a ‘timedemo’-type scenario. Futuremark’s 3DMark Vantage is without question the best such test on the market, and it’s a joy to use, and watch. The folks at Futuremark are experts in what they do, and they really know how to push that hardware of yours to its limit.

The company first started out as MadOnion and released a GPU-benchmarking tool called XLR8R, which was soon replaced with 3DMark 99. Since that time, we’ve seen seven different versions of the software, including two major updates (3DMark 99 Max, 3DMark 2001 SE). With each new release, the graphics get better, the capabilities get better and the sudden hit of ambition to get down and dirty with overclocking comes at you fast.

Similar to a real game, 3DMark Vantage offers many configuration options, although many (including us) prefer to stick to the profiles which include Performance, High and Extreme. Depending on which one you choose, the graphic options are tweaked accordingly, as well as the resolution. As you’d expect, the better the profile, the more intensive the test.

Performance is the stock mode that most use when benchmarking, but it only uses a resolution of 1280×1024, which isn’t representative of today’s gamers. Extreme is more appropriate, as it runs at 1920×1200 and does well to push any single or multi-GPU configuration currently on the market – and will do so for some time to come.

3DMark Vantage is a highly scalable benchmark, taking full advantage of all available shader cores, and even multiple GPUs, along with copious amounts of memory. Given that, our results fairly accurately ranked each card in line with its real-world performance.

Final Thoughts

At this point in most of our graphics card reviews, I post information pertaining to overclocking, power consumption, and temperatures. Unfortunately, before we could wrap up our testing, the card died, so I was unable to complete what testing I had left. I’m not sure what to make of the issue, or how it happened, but I don’t blame the card, the model, or Sapphire. It seems like a freak accident.

After the performance benchmarking was done, I shut the PC down in order to come back to it later to deal with the overclocking and power consumption tests, but on the next boot, I had no video at all. Our motherboard is equipped with an Award BIOS, and the error code given was 92. I searched all over the Internet to find information on what that could mean, but all I could find was “Reserved”. That’s not too helpful.

Regardless, I’m glad I was able to finish up the performance testing before the card went kaput. I expect to be able to replace the HD 5750 in the near future, and when I do, I’ll wrap up the power and temperature tests, along with the overclocking, and post that information in our news section rather than revise this article (I’ll update this review with a link to the news post). I need to stress that I don’t believe this issue to be a fault with the card or Sapphire, although I’m not sure where to shift the blame. We’ve not had a GPU just die on us in recent memory, so I don’t believe any of our equipment is at fault.

That all said, given the performance information we have here, I have to admit that I expected a bit more from the HD 5750. It’s not at all a bad card, but some of our results left me a bit confused, like Crysis Warhead. There, the HD 5750 should have come out ahead of the GTS 250, but didn’t. Of course, the card did end up dying on me, which would usually lead me to believe that there was an underlying issue the entire time, but repeated runs of 3DMark Vantage did deliver expected results, and there, the card always came out ahead of the GTS 250.

With an SRP of $130, the HD 5750 is a solid card. The GTS 250 hovers around the same price-point, and the two cards flip-flopped their strengths and weaknesses throughout our testing, with the overall performance nod going to NVIDIA. Choosing between these two cards comes down to whether you desire the utmost performance, or don’t mind sacrificing some of it in exchange for some great features.

Though I was unable to test the HD 5750 for power consumption and temperatures, based on what I’ve seen from other HD 5000 series cards, its power consumption would be far below the GTS 250. In fact, as we saw in our review of the HD 5770, even that card is much more power efficient… sucking down 68W less at max load. The difference between the HD 5750 and GTS 250 would be even greater. The same goes for the temperatures.

Other bonuses of the HD 5750 include DirectX 11, which should come in handy over the next few years (Dirt 2 and STALKER: Call of Pripyat are two soon-to-be-released titles), Eyefinity (superb multi-monitor support), and of course, the multi-monitor outputs. You might not have a desktop monitor equipped with HDMI or DisplayPort, but all of ATI’s HD 5000 cards come with the support, so it’s there for when you need it.

What hurts the HD 5750, and most other HD 5000 series cards right now, is the price-gouging by popular e-tailers. It’s not just one, but multiple, so it doesn’t look like any are going to budge soon. Because of this, the GTS 250 looks like an even better buy, as it can be had for $130 or even a bit less… the same price that the HD 5750 is supposed to be sold for. Unfortunately, AMD has no control over this.

So as it stands now, the smarter card would be the HD 5750, because it offers perks that make it a notable step up from the GTS 250, but for the best raw performance, the GTS 250 is the right card. One last thing I’ll mention, though, is that at most e-tailers, the HD 5770 isn’t too much more than the HD 5750 ($20 more, tops). If it came down to a choice between a price-gouged HD 5750 and the HD 5770, I’d recommend the latter.

Discuss this article in our forums!

Have a comment you wish to make on this article? Recommendations? Criticism? Feel free to head over to our related thread and put your words to our virtual paper! There is no requirement to register in order to respond to these threads, but it sure doesn’t hurt!

Copyright © 2005-2018 Techgage Networks Inc. - All Rights Reserved.