
ATI’s Radeon HD 5450 – The Perfect HTPC Card?

Date: February 4, 2010
Author(s): Rob Williams

This past fall, AMD launched its latest graphics generation with the high-end HD 5870, and today, it looks to the opposite end of the spectrum with its $50 HD 5450. Though inexpensive, the HD 5450 has a surprising amount of spunk. Coupled with its passive design and full media capabilities, it looks to be the ideal solution for your HTPC.



Introduction

If it feels like we took a look at a brand-new ATI graphics card mere weeks ago, don’t worry… we did. Between the middle of last month and the middle of this month, AMD is on schedule to round out its entire budget line-up, and while the Radeon HD 5670 sits comfortably at $100, both the card we’re looking at today and the HD 5570 due out next week retail for well under that price point.

The HD 5670 was a bit of a strange beast, because while it’s not what anyone would consider a mainstream gaming card, it delivers more than enough performance to handle today’s games at up to 1080p. For a $100 offering, the card has a great performance-per-watt ratio, and with after-market coolers, as we saw with Sapphire’s version of the card, temperatures are ideal for any sort of use, including in an HTPC.

We’ll be tackling the HD 5570 later, but both it and the HD 5450 have a different focus than the HD 5670, as both cards cost well under $100, with the HD 5450 priced at around $50 – $60. I suspect the higher end of that range is for 1GB cards, while the 512MB versions will stick close to an even $50. The main competition for the HD 5450 is NVIDIA’s GeForce 210, which currently retails for around $45 (mail-in rebates bring this price way down).

With its modest specs, the HD 5450 isn’t going to “wow” anyone with its gaming performance, but at its price, it should please those looking for a quality HTPC card that will allow them to do a bit of light gaming, or enjoy the popular casual games on the market, such as Spore, The Sims 3, World of Warcraft and Solitaire. The best part of the card might be its low-profile nature and its completely passive heatsink design.

ATI Radeon HD 5450 512MB

As you can see in the photograph above, we received three different versions of the Radeon HD 5450. The one in the middle, with the unique red heatsink, is AMD’s reference. The one on the left is Sapphire’s version, which is even thinner than AMD’s, and on the right is Gigabyte’s fanned model, which is also much thinner than the reference design.

To get this explanation out of the way: I won’t be benchmarking all three of these cards today, but will instead focus on just the reference card. The reason is simple: I find little value in benchmarking a handful of cards for such a low-end model, where performance isn’t going to vary all that much. This is where things become a little confusing, though, and I’m forced to explain the oddities.

For some reason, the reference card we received from AMD wasn’t completely reference. According to official press documents, the core clock is 650MHz and the memory clock is 800MHz. Our sample had a 900MHz memory clock, a 12.5% boost. I contacted AMD about the fact that what we received was essentially overclocked, and the person I talked to was surprised and couldn’t explain it. Somehow, all of the press received these slightly overclocked cards.

ATI Radeon HD 5450 - Official Specs

To add to the confusion just a bit more, our performance didn’t exactly match what the press deck showed, either… our card performed noticeably better. After talking to AMD, we were told that the 100MHz memory boost equates to about a 5.5% performance gain on average, so nothing major. AMD also mentioned that it suspects most launch cards from its partners won’t ship at 650/800, but will typically be pre-overclocked, with up to a 900MHz memory clock.
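For a rough sense of why a 12.5% memory overclock translates into only about a 5.5% average gain, it helps to look at the raw bandwidth involved. Here’s a minimal sketch of the arithmetic, assuming DDR signaling (two transfers per clock) on the card’s 64-bit bus; the figures are purely illustrative:

```python
# Back-of-the-envelope memory bandwidth for the HD 5450's 64-bit bus.
# Assumes DDR signaling (two transfers per clock); purely illustrative.

BUS_WIDTH_BITS = 64  # per the spec table below

def bandwidth_gb_s(mem_clock_mhz: float) -> float:
    """Peak bandwidth in GB/s: bytes per transfer x transfers per second."""
    return (BUS_WIDTH_BITS / 8) * (mem_clock_mhz * 2e6) / 1e9

ref = bandwidth_gb_s(800)    # the official 800MHz spec
press = bandwidth_gb_s(900)  # our 900MHz press sample

print(f"800MHz: {ref:.1f} GB/s, 900MHz: {press:.1f} GB/s")  # 12.8 vs. 14.4
print(f"Bandwidth uplift: {press / ref - 1:.1%}")           # 12.5%
# Games aren't purely bandwidth-bound, which is why the quoted real-world
# gain (~5.5% on average) lands well short of the theoretical 12.5%.
```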

Of course, I noticed the discrepancy only after all of our benchmarking was completed. Like the reference card, Gigabyte’s model was also overclocked (+50MHz on the core), and I thought Sapphire’s was as well, but it turned out later that I was wrong. GPU-Z showed that the card had a default clock of 650/900, but I misread it and took those to be the actual clocks.

Rather, Sapphire’s card was actually running the reference clocks. I’m uncertain why a 900MHz memory clock was ever referenced at all, but if I had to guess, the HD 5450 was originally meant to ship at 900MHz and AMD backed it down to 800MHz for some reason. Why, I can’t say, as the card was absolutely rock-stable at 900MHz memory throughout all of our tests.

Whew… well, there’s five paragraphs I wish I didn’t have to add to the front page of the review. Once again, we are benchmarking with the reference card, which carries a 100MHz boost to the memory. For the sake of ease, we’ll benchmark the other two cards, including Sapphire’s reference-clocked model, using 3DMark Vantage only.

Model | Core MHz | Mem MHz | Memory | Bus Width | Stream Processors
Radeon HD 5970 | 725 | 1000 | 2048MB | 256-bit | 1600 x 2
Radeon HD 5870 | 850 | 1200 | 1024MB | 256-bit | 1600
Radeon HD 5850 | 725 | 1000 | 1024MB | 256-bit | 1440
Radeon HD 5770 | 850 | 1200 | 1024MB | 128-bit | 800
Radeon HD 5750 | 700 | 1150 | 512MB – 1GB | 128-bit | 720
Radeon HD 5670 | 775 | 1000 | 512MB – 1GB | 128-bit | 400
Radeon HD 5570 | ? | ? | ? | ? | ?
Radeon HD 5450 | 650 | 800 | 512MB – 1GB | 64-bit | 80

The HD 5450 is in all regards a very low-end card. It has just 5% as many cores as the HD 5870 (80 vs. 1,600), a 64-bit memory bus, and the slowest clocks of the bunch. Unfortunately, I can’t reveal the HD 5570’s clocks at this time, but I can assure you that the card will be faster than the HD 5450 and slower than the HD 5670 (it’s reasons like these that I lack friends).

I admit that for a card set to sell for around $50, the reference version straight out of AMD has one of the coolest (no pun intended) passive coolers I’ve ever seen, if not the coolest. It’s unfortunate, though, because I’m not sure how widely it will be adopted. All of the HD 5450s I’ve seen from other vendors so far use their own cooler designs. Whether those are cool, or do a good job of cooling, we’ll soon see.

ATI Radeon HD 5450 512MB

The reference card includes HDMI, DVI and VGA ports, and Gigabyte’s card below follows suit. The Sapphire card strays from this just a wee bit by offering a DisplayPort connector in lieu of the HDMI.

Like AMD itself, Sapphire has opted to go with a completely passive design, except its card is much, much thinner, and doesn’t block other slots at all. The cooler seen below extends just a wee bit onto the back of the card, about an inch, but is still thin enough that it won’t interfere with a PCI card behind it, if there is one.

Sapphire Radeon HD 5450 512MB

Gigabyte follows goals similar to Sapphire’s, offering the thinnest card of the three. Here, the cooler doesn’t extend onto the back at all, but the trade-off is a minuscule fan. The potential downside of this implementation is obvious… noise. Yet during our tests, even when stressing the GPU with OCCT, we heard barely enough noise to know the fan was working at all… it was almost silent, with no whine whatsoever.

Gigabyte Radeon HD 5450 512MB

Let’s move right into a look at our test methodology, and then get right to the results.

Test System & Methodology

At Techgage, we strive to make sure our results are as accurate as possible. Our testing is rigorous and time-consuming, but we feel the effort is worth it. In an attempt to leave no question unanswered, this page contains not only our testbed specifications, but also a fully-detailed look at how we conduct our testing. For an exhaustive look at our methodologies, even down to the Windows Vista installation, please refer to this article.

Test Machine

The below table lists our testing machine’s hardware, which remains unchanged throughout all GPU testing, minus the graphics card. Each card used for comparison is also listed here, along with the driver version used. Each one of the URLs in this table can be clicked to view the respective review of that product, or if a review doesn’t exist, it will bring you to the product on the manufacturer’s website.

Component | Model
Processor | Intel Core i7-975 Extreme Edition – Quad-Core, 3.33GHz, 1.33v
Motherboard | Gigabyte GA-EX58-EXTREME – X58-based, F7 BIOS (05/11/09)
Memory | Corsair DOMINATOR – DDR3-1333 7-7-7-24-1T, 1.60V
ATI Graphics | Radeon HD 5870 1GB (Sapphire) – Catalyst 9.10
 | Radeon HD 5850 1GB (ASUS) – Catalyst 9.10
 | Radeon HD 5770 1GB (Reference) – Beta Catalyst (10/06/09)
 | Radeon HD 5750 1GB (Sapphire) – Catalyst 9.11
 | Radeon HD 5670 512MB (Reference) – Beta Catalyst (12/16/09)
 | Radeon HD 5450 512MB (Reference) – Beta Catalyst (12/11/09)
NVIDIA Graphics | GeForce GTX 295 1792MB (Reference) – GeForce 186.18
 | GeForce GTX 285 1GB (EVGA) – GeForce 186.18
 | GeForce GTX 275 896MB (Reference) – GeForce 186.18
 | GeForce GTX 260 896MB (XFX) – GeForce 186.18
 | GeForce GTS 250 1GB (EVGA) – GeForce 186.18
 | GeForce GT 240 512MB (ASUS) – GeForce 196.21
 | GeForce GT 220 1GB (ASUS) – GeForce 196.21
 | GeForce 210 512MB (ASUS) – GeForce 196.21
Audio | On-Board Audio
Storage | Seagate Barracuda 500GB 7200.11
Power Supply | Corsair HX1000W
Chassis | SilverStone TJ10 Full-Tower
Display | Gateway XHD3000 30″
Cooling | Thermalright TRUE Black 120
Et cetera | Windows Vista Ultimate 64-bit

When preparing our testbeds for any type of performance testing, we follow these guidelines:

To aid in keeping our results accurate and repeatable, we prevent certain services in Windows Vista from starting up at boot. These services have a tendency to start up in the background without notice, potentially causing slightly inaccurate results. Disabling “Windows Search”, for example, turns off the OS’ indexing, which can at times utilize the hard drive and memory more than we’d like.

For more robust information on how we tweak Windows, please refer once again to this article.

Game Titles

We currently benchmark all of our games at three popular resolutions: 1680×1050, 1920×1080 and 2560×1600. 1680×1050 was chosen as it’s one of the most popular resolutions for gamers sporting ~20″ displays. 1920×1080 might stand out, since we’ve always used 1920×1200 in the past, but we didn’t make this change without some serious thought. After taking a look at the current landscape for desktop monitors around ~24″, we noticed that 1920×1200 is definitely on the way out, with more and more models arriving as native 1080p; it’s for this reason that we made the switch. Finally, for high-end gamers, we also benchmark at 2560×1600, a resolution with just about twice the pixels of 1080p.
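As a quick check on that “twice the pixels” claim, the math works out like so (a throwaway calculation, not part of our methodology):

```python
# Pixel counts for our three benchmarking resolutions.
resolutions = {
    "1680x1050": 1680 * 1050,  # ~1.76 megapixels
    "1920x1080": 1920 * 1080,  # ~2.07 megapixels
    "2560x1600": 2560 * 1600,  # ~4.10 megapixels
}

for name, pixels in resolutions.items():
    print(f"{name}: {pixels / 1e6:.2f} MP")

# 2560x1600 carries ~1.98x the pixels of 1080p -- "just about 2x".
print(f"Ratio vs. 1080p: {resolutions['2560x1600'] / resolutions['1920x1080']:.2f}")
```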

For graphics cards that include less than 1GB of GDDR, we omit Grand Theft Auto IV from our testing, as our chosen detail settings require at least 800MB of available graphics memory. Also, if the card we’re benchmarking doesn’t offer the performance to handle 2560×1600 across most of our titles reliably, only 1680×1050 and 1920×1080 will be utilized.

Because we value results generated by real-world testing, we don’t utilize timedemos whatsoever. The one possible exception is Futuremark’s 3DMark Vantage. Though it’s not a game, it essentially acts as a robust timedemo. We use it because it’s a standard where GPU reviews are concerned, and we don’t want to deprive our readers of results they expect to see.

All of our results are captured with the help of Beepa’s FRAPS 2.98, while stress-testing and temperature monitoring are handled by OCCT 3.1.0 and GPU-Z, respectively.
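Since FRAPS can also dump per-frame timestamps to disk, figures like our minimum and average FPS can be derived from such a log. A minimal sketch follows, assuming a CSV of cumulative frame timestamps in milliseconds; the file name and column layout here are illustrative assumptions, not FRAPS’ documented format:

```python
# Minimal sketch: derive minimum and average FPS from a FRAPS-style
# frametimes log. Assumes cumulative per-frame timestamps (ms) in column 2.
import csv

def fps_stats(path: str):
    """Return (min FPS, average FPS) from cumulative frame timestamps."""
    with open(path, newline="") as f:
        reader = csv.reader(f)
        next(reader)  # skip the header row
        stamps_ms = [float(row[1]) for row in reader]

    duration_s = (stamps_ms[-1] - stamps_ms[0]) / 1000.0
    avg_fps = (len(stamps_ms) - 1) / duration_s

    # Count frames per whole second of the run for a minimum-FPS figure.
    counts = {}
    for ms in stamps_ms:
        second = int((ms - stamps_ms[0]) // 1000)
        counts[second] = counts.get(second, 0) + 1
    counts.pop(max(counts), None)  # drop the trailing partial second
    return min(counts.values()), avg_fps

min_fps, avg_fps = fps_stats("frametimes.csv")  # hypothetical log file
print(f"Min FPS: {min_fps}, Avg FPS: {avg_fps:.3f}")
```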

Special Note: Because we’re taking a look at very low-end cards here, we’ll be running the cards through special settings, not our usual set of three. We will include graphs for 1680×1050 results for the sake of comparison.

Call of Duty: Modern Warfare 2

Call of Juarez: Bound in Blood

Crysis Warhead

F.E.A.R. 2: Project Origin

World in Conflict: Soviet Assault

Call of Duty: Modern Warfare 2

When the original Call of Duty game launched in 2003, Infinity Ward was an unknown. Naturally… it was the company’s first title. But since then, the series and company alike have become household names. Not only has the series delivered consistently incredible gameplay, it’s pushed the graphics envelope with each successive release, and where Modern Warfare is concerned, it’s also had a rich storyline.

The first two titles might have been built on the already-outdated Quake III engine, but since then, the games have been built with improved graphical features, capable of pushing the highest-end PCs out there. Modern Warfare 2 is the first exception to that trend, as it’s more of a console port than a true PC title. Therefore, the game doesn’t push PC hardware as much as we’d like to see, but despite that, it still looks great, and lacks little in the graphics department. You can read our review of the game here.

Manual Run-through: The level chosen is the 10th mission in the game, “The Gulag”. Our teams fly in helicopters up to an old prison with the intention of getting closer to finding the game’s villain, Vladimir Makarov. Our saved game begins at the point when the level name comes on the screen, right before we reach the prison, and ends one minute after we land, following the normal progression of the level. The entire run takes around two-and-a-half minutes.

As mentioned in the intro, AMD is targeting this card straight at the GeForce 210, while the upcoming HD 5570 will target the GT 220. So far, the HD 5450 is off to a great start, blowing far past the 210 at both resolutions. Where 1280×1024 isn’t playable on the 210, it very much is on the HD 5450.

Please note, all 1680×1050 graphs in this article are included only for the sake of comparison.

Graphics Card | Best Playable | Min FPS | Avg. FPS
ATI HD 5770 1GB CrossFireX | 2560×1600 – Max Detail, 4xAA | 40 | 81.311
ATI HD 5870 1GB (Sapphire) | 2560×1600 – Max Detail, 4xAA | 46 | 79.838
ATI HD 5850 1GB (ASUS) | 2560×1600 – Max Detail, 4xAA | 37 | 68.563
NVIDIA GTX 285 1GB (EVGA) | 2560×1600 – Max Detail, 4xAA | 41 | 66.527
NVIDIA GTX 275 896MB (Reference) | 2560×1600 – Max Detail, 4xAA | 37 | 61.937
NVIDIA GTX 260 896MB (XFX) | 2560×1600 – Max Detail, 4xAA | 33 | 53.314
ATI HD 5770 1GB (Reference) | 2560×1600 – Max Detail, 0xAA | 36 | 60.337
NVIDIA GTS 250 1GB (EVGA) | 2560×1600 – Max Detail, 0xAA | 30 | 53.253
ATI HD 5750 1GB (Sapphire) | 2560×1600 – Max Detail, 0xAA | 28 | 50.727
ATI HD 5670 512MB (Reference) | 1920×1080 – Max Detail, 4xAA | 24 | 43.96
NVIDIA GT 240 512MB (ASUS) | 1920×1080 – Max Detail, 0xAA | 30 | 53.139
NVIDIA GT 220 1GB (ASUS) | 1280×1024 – Low Detail, 0xAA | 29 | 53.593
ATI HD 5450 512MB (Reference) | 1280×1024 – Low Detail, 0xAA | 26 | 36.032
NVIDIA 210 512MB (ASUS) | 1280×1024 – Low Detail, 0xAA | 18 | 29.885
Intel HD Graphics (Clarkdale) | 1280×1024 – Low Detail, 0xAA | 14 | 25.955

After removing some of the special effects (Shadows, Specular Map and Depth of Field), the performance of the HD 5450 shot skyward and gave us some much smoother gameplay.

Call of Juarez: Bound in Blood

When the original Call of Juarez was released, it brought forth something unique… a western-styled first-person shooter. That’s simply not something we see too often, so for fans of the genre, its release was a real treat. Although it didn’t offer the best gameplay we’ve seen from a recent FPS title, its storyline and unique style made it well worth testing.

After we retired the original title from our suite, we anxiously awaited the sequel, Bound in Blood, in hopes that the series could be re-introduced into our testing. Thankfully, it could be, thanks in part to its fantastic graphics, built around the Chrome Engine 4, and gameplay improved over the original. It was also well-received by game reviewers, which is always a good sign.

Manual Run-through: The level chosen here is Chapter I, and our starting point is about 15 minutes into the mission, where we stand atop a hill that overlooks a large river. We make our way across the hill and ultimately through a large trench, and we stop our benchmarking run shortly after we blow up a gas-filled barrel.

As mentioned in the past, ATI cards tend to excel at this particular title, so it’s not much of a surprise that the 210 is left in the dust. The HD 5450 doubled the minimum FPS and starkly increased the average.

Graphics Card | Best Playable | Min FPS | Avg. FPS
ATI HD 5870 1GB (Reference) | 2560×1600 – Max Detail | 58 | 81.945
NVIDIA GTX 295 1792MB (Reference) | 2560×1600 – Max Detail | 37 | 80.339
ATI HD 5850 1GB (ASUS) | 2560×1600 – Max Detail | 51 | 69.165
NVIDIA GTX 285 1GB (EVGA) | 2560×1600 – Max Detail | 45 | 54.428
NVIDIA GTX 275 896MB (Reference) | 2560×1600 – Max Detail | 41 | 51.393
ATI HD 5770 1GB (Reference) | 2560×1600 – Max Detail | 28 | 45.028
NVIDIA GTX 260 896MB (XFX) | 2560×1600 – Max Detail | 35 | 44.023
ATI HD 5750 1GB (Sapphire) | 2560×1600 – Max Detail | 27 | 38.686
NVIDIA GTS 250 1GB (EVGA) | 2560×1600 – Max Detail | 25 | 33.751
ATI HD 5670 512MB (Reference) | 1920×1080 – Max Detail | 38 | 47.23
NVIDIA GT 240 512MB (ASUS) | 1920×1080 – Max Detail, 0xAA | 29 | 39.446
NVIDIA GT 220 1GB (ASUS) | 1280×1024 – Medium Detail, 0xAA | 29 | 41.722
ATI HD 5450 512MB (Reference) | 1280×1024 – Medium Detail, 0xAA | 20 | 32.619
NVIDIA 210 512MB (ASUS) | 1280×1024 – Low Detail, 0xAA | 18 | 30.825

At the above-listed settings, the game played just fine. I have to stress that lowering the settings in this particular title makes for a starkly different (and mostly ugly) game, but at least we didn’t need to drop below medium detail.

Crysis Warhead

Like Call of Duty, Crysis is another series that doesn’t need much of an introduction. Thanks to the fact that almost any comments section for a PC performance-related article asks, “Can it run Crysis?”, even those who don’t play computer games no doubt know what Crysis is. When Crytek first released Far Cry, it delivered an incredible game engine with huge capabilities, and Crysis simply took things to the next level.

Although the sequel, Warhead, has been available for just about a year, it still manages to push the highest-end systems to their breaking point. It wasn’t until this past January that we finally found a graphics solution to handle the game at 2560×1600 at its Enthusiast level, and even that was without AA! Something tells me Crysis will remain the de facto standard for GPU benchmarking for a while yet.

Manual Run-through: Whenever we have a new game in-hand for benchmarking, we make every attempt to explore each of its levels to find out which is the most brutal towards our hardware. As it turns out, after spending hours exploring this game’s levels, we found the first level in the game, “Ambush”, to be the hardest on the GPU, so we stuck with it for our testing. Our run starts from the beginning of the level and stops shortly after we reach the first bridge.

There are few things on this earth as painful as running Crysis on a low-end graphics card, but as far as that goes, the HD 5450 proved to be about 43% faster than the 210.

Graphics Card | Best Playable | Min FPS | Avg. FPS
NVIDIA GTX 295 1792MB (Reference) | 2560×1600 – Gamer, 0xAA | 19 | 40.381
ATI HD 5870 1GB (Reference) | 2560×1600 – Gamer, 0xAA | 20 | 32.955
ATI HD 5850 1GB (ASUS) | 2560×1600 – Mainstream, 0xAA | 28 | 52.105
NVIDIA GTX 285 1GB (EVGA) | 2560×1600 – Mainstream, 0xAA | 27 | 50.073
NVIDIA GTX 275 896MB (Reference) | 2560×1600 – Mainstream, 0xAA | 24 | 47.758
NVIDIA GTX 260 896MB (XFX) | 2560×1600 – Mainstream, 0xAA | 21 | 40.501
ATI HD 5770 1GB (Reference) | 2560×1600 – Mainstream, 0xAA | 20 | 35.256
NVIDIA GTS 250 1GB (EVGA) | 2560×1600 – Mainstream, 0xAA | 18 | 34.475
ATI HD 5750 1GB (Sapphire) | 1920×1080 – Mainstream, 0xAA | 21 | 47.545
ATI HD 5670 512MB (Reference) | 1920×1080 – Mainstream, 0xAA | 20 | 35.103
NVIDIA GT 240 512MB (ASUS) | 1920×1080 – Mainstream, 0xAA | 19 | 33.623
NVIDIA GT 220 1GB (ASUS) | 1280×1024 – Minimum Detail, 0xAA | 26 | 44.286
ATI HD 5450 512MB (Reference) | 1280×1024 – Minimum Detail, 0xAA | 12 | 31.495
NVIDIA 210 512MB (ASUS) | 1024×768 – Minimum Detail, 0xAA | 15 | 29.501
Intel HD Graphics (Clarkdale) | 1024×768 – Minimum Detail, 0xAA | 10 | 24.289

As with the GT 220, we were able to retain our 1280×1024 resolution on the HD 5450, with the caveat of a decrease to the “Minimum” detail setting. Even at those settings, the game still looks remarkably good, and it played very, very well despite the recorded minimum FPS.

F.E.A.R. 2: Project Origin

Five out of the seven current games we use for testing are either sequels, or titles in an established series. F.E.A.R. 2 is one of the former, following up on the very popular First Encounter Assault Recon, released in fall of 2005. This horror-based first-person shooter brought to the table fantastic graphics, ultra-smooth gameplay, the ability to blow massive chunks out of anything, and also a very fun multi-player mode.

Three-and-a-half years later, we saw the introduction of the game’s sequel, Project Origin. As we had hoped, this title improved on the original where gameplay and graphics were concerned, and it was a no-brainer to begin including it in our testing. The game is gorgeous, and there’s much destruction to be had (who doesn’t love blowing expensive vases to pieces?). The game is also rather heavily scripted, which aids in producing repeatable results in our benchmarking.

Manual Run-through: The level used for our testing here is the first in the game, about ten minutes in. The scene begins with a travel up an elevator, with a robust city landscape behind us. Our run-through begins with a quick look at this cityscape, and then we proceed through the level until the point when we reach the far door as seen in the above screenshot.

The HD 5450 continues to shine in the test here, far surpassing the performance of the 210. There’s just no comparison.

Graphics Card | Best Playable | Min FPS | Avg. FPS
NVIDIA GTX 295 1792MB (Reference) | 2560×1600 – Max Detail, 4xAA, 16xAF | 45 | 95.767
ATI HD 5870 1GB (Reference) | 2560×1600 – Max Detail, 4xAA, 16xAF | 65 | 91.34
ATI HD 5850 1GB (ASUS) | 2560×1600 – Max Detail, 4xAA, 16xAF | 51 | 73.647
NVIDIA GTX 285 1GB (EVGA) | 2560×1600 – Max Detail, 4xAA, 16xAF | 39 | 62.014
NVIDIA GTX 275 896MB (Reference) | 2560×1600 – Max Detail, 4xAA, 16xAF | 37 | 57.266
NVIDIA GTX 260 896MB (XFX) | 2560×1600 – Max Detail, 4xAA, 16xAF | 29 | 48.110
ATI HD 5770 1GB (Reference) | 2560×1600 – Max Detail, 4xAA, 16xAF | 31 | 47.411
ATI HD 5750 1GB (Sapphire) | 2560×1600 – Max Detail, 0xAA, 16xAF | 27 | 39.563
NVIDIA GTS 250 1GB (EVGA) | 2560×1600 – Max Detail, 4xAA, 16xAF | 24 | 36.331
ATI HD 5670 512MB (Reference) | 1920×1080 – Max Detail, 4xAA, 16xAF | 31 | 46.87
NVIDIA GT 240 512MB (ASUS) | 1920×1080 – Max Detail, 0xAA | 30 | 45.039
NVIDIA GT 220 1GB (ASUS) | 1280×1024 – Medium Detail, 0xAA | 22 | 29.869
ATI HD 5450 512MB (Reference) | 1280×1024 – Medium Detail, 0xAA | 17 | 27.149
NVIDIA 210 512MB (ASUS) | 1280×1024 – Low Detail, 0xAA | 17 | 28.569
Intel HD Graphics (Clarkdale) | 1280×1024 – Low Detail, 0xAA | 20 | 34.388

With the performance as good as it was with medium detail, we left that as our best playable.

World in Conflict: Soviet Assault

I admit that I’m not a huge fan of RTS titles, but World in Conflict intrigued me from the get-go. After all, so many war-based games continue to follow the same storylines we already know, and WiC was different. It runs counter to the actual political and economic collapse of the Soviet Union in the late ’80s, instead providing a storyline in which the USSR proceeds to war in order to remain in power.

Many RTS games, with their advanced AI, tend to lean on the CPU in order to deliver smooth gameplay, but WiC taxes both the CPU and GPU, and the graphics prove it. Throughout the game’s missions, you’ll see gorgeous vistas and explore areas ranging from deserts and snow-packed lands to fields and cities. Overall, it’s a real visual treat – especially since you’re able to zoom down to the ground and see the action up close.

Manual Run-through: The level we use for testing is the 7th campaign of the game, called Insurgents. Our saved game plants us towards the beginning of the mission with two squads of five, and two snipers. The run consists of bringing our men to action, and hovering the camera around throughout the duration. The entire run lasts between three and four minutes.

Like Call of Juarez, World in Conflict to some degree favors ATI cards, but even with that, the HD 5450 looks to be nearly twice as fast as the 210.

Graphics Card | Best Playable | Min FPS | Avg. FPS
NVIDIA GTX 295 1792MB (Reference) | 2560×1600 – Max Detail, 8xAA, 16xAF | 40 | 55.819
ATI HD 5870 1GB (Reference) | 2560×1600 – Max Detail, 4xAA, 16xAF | 35 | 47.195
ATI HD 5850 1GB (ASUS) | 2560×1600 – Max Detail, 4xAA, 16xAF | 29 | 40.581
NVIDIA GTX 285 1GB (EVGA) | 2560×1600 – Max Detail, 0xAA, 16xAF | 34 | 49.514
NVIDIA GTX 275 896MB (Reference) | 2560×1600 – Max Detail, 0xAA, 16xAF | 36 | 46.186
NVIDIA GTX 260 896MB (XFX) | 2560×1600 – Max Detail, 0xAA, 16xAF | 23 | 39.365
ATI HD 5770 1GB (Reference) | 2560×1600 – Max Detail, 0xAA, 16xAF | 28 | 37.389
NVIDIA GTS 250 1GB (EVGA) | 2560×1600 – Max Detail, 0xAA, 4xAF | 24 | 32.453
ATI HD 5750 1GB (Sapphire) | 2560×1600 – Max Detail, 0xAA, 4xAF | 23 | 31.769
NVIDIA GT 240 512MB (ASUS) | 1920×1080 – Max Detail, 0xAA | 22 | 33.788
ATI HD 5670 512MB (Reference) | 1920×1080 – Max Detail, 0xAA, 16xAF | 21 | 31.872
NVIDIA GT 220 1GB (ASUS) | 1280×1024 – Medium Detail, 0xAA | 41 | 52.089
ATI HD 5450 512MB (Reference) | 1280×1024 – Medium Detail, 0xAA | 19 | 23.620
NVIDIA 210 512MB (ASUS) | 1280×1024 – Medium Detail, 0xAA | 30 | 40.354
Intel HD Graphics (Clarkdale) | 1280×1024 – Low Detail, 0xAA | 30 | 39.449

With performance as high as it was, there was no reason to lower the settings any further for our best playable, and going one step higher made the gameplay laggy, so there was no point in that, either. Given the sheer number of graphics options found in the game, though, you could have a field day if you really wanted to fine-tune things.

Futuremark 3DMark Vantage

Although we generally shun automated gaming benchmarks, we do like to run at least one to see how our GPUs scale in a ‘timedemo’-type scenario. Futuremark’s 3DMark Vantage is without question the best such test on the market, and it’s a joy to use and watch. The folks at Futuremark are experts in what they do, and they really know how to push that hardware of yours to its limit.

The company first started out as MadOnion and released a GPU-benchmarking tool called XLR8R, which was soon replaced with 3DMark 99. Since that time, we’ve seen seven different versions of the software, including two major updates (3DMark 99 Max, 3DMark 2001 SE). With each new release, the graphics get better, the capabilities get better and the sudden hit of ambition to get down and dirty with overclocking comes at you fast.

Similar to a real game, 3DMark Vantage offers many configuration options, although many (including us) prefer to stick to the profiles which include Performance, High and Extreme. Depending on which one you choose, the graphic options are tweaked accordingly, as well as the resolution. As you’d expect, the better the profile, the more intensive the test.

Performance is the stock mode that most use when benchmarking, but it only uses a resolution of 1280×1024, which isn’t representative of today’s gamers. Extreme is more appropriate, as it runs at 1920×1200 and does well to push any single or multi-GPU configuration currently on the market – and will do so for some time to come.

3DMark Vantage is a highly scalable benchmark, taking full advantage of all available shader processors and GPU cores, along with copious amounts of memory. Given that, our results above scale each card fairly accurately against its real-world performance.

Power & Temperatures

To test our graphics cards for both temperatures and power consumption, we utilize OCCT for stress-testing, GPU-Z for temperature monitoring, and a Kill-a-Watt for power monitoring. The Kill-a-Watt is plugged into its own socket, with only the PC connected to it.

As per our guidelines when benchmarking with Windows, when the room temperature is stable (and reasonable), the test machine is booted up and left to sit at the Windows desktop until things are completely idle. Once things are good to go, the idle wattage is noted, GPU-Z is started up to begin monitoring card temperatures, and OCCT is set up to begin stress-testing.

To push the cards we test to their absolute limit, we use OCCT in full-screen 2560×1600 mode, and allow it to run for 30 minutes, which includes a one minute lull at the start, and a three minute lull at the end. After about 10 minutes, we begin to monitor our Kill-a-Watt to record the max wattage.
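For the curious, distilling a 30-minute sensor log down to a single peak figure is trivial to script. GPU-Z can log its sensor readings to a file; here’s a minimal sketch, assuming that output parses as CSV, with the file name and column header below being illustrative assumptions rather than a documented format:

```python
# Rough sketch: find the peak GPU temperature in a GPU-Z-style sensor log.
# The file name and column header are illustrative assumptions.
import csv

def max_gpu_temp(path: str, column: str = "GPU Temperature [°C]") -> float:
    """Scan a GPU-Z-style sensor log and return the highest reading seen."""
    temps = []
    with open(path, newline="", encoding="utf-8", errors="ignore") as f:
        for row in csv.DictReader(f, skipinitialspace=True):
            value = (row.get(column) or "").strip()
            if value:
                temps.append(float(value))
    return max(temps)

print(f"Peak load temperature: {max_gpu_temp('gpu-z_log.txt'):.1f} °C")
```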

With its much larger heatsink, the reference card had the best temperatures overall, with Sapphire’s card coming in second at 5°C higher under load. Gigabyte’s card fell further behind, with 7°C tacked onto the load temperature compared to the reference card. Though the temps are higher on the non-reference cards, there are still definite benefits to be reaped.

Gigabyte’s card performed the “worst” (it’s hard to call it that when even the max temps aren’t at all bad), but it also had the smallest cooler of the three. The highest point of the cooler just about matches the height of the PCI bracket, so this card will in no possible way become an inconvenience. Plus, despite having a whiny-looking fan, it was barely audible even at max load. Sapphire’s card improved temperatures just a wee bit, but its cooler also takes up a bit more space.

Despite Gigabyte’s card having a fan, I actually prefer that design over the others, simply because it’s the smallest package and was still inaudible under intense stress.

For power consumption, there’s barely a difference between them, even with differing clock speeds. The HD 5450 has officially become the most power-efficient card we’ve tested, with the reference card shaving 12W off the load wattage of the GeForce 210.

Final Thoughts

With our look at NVIDIA’s three lowest-end cards just two weeks ago, our look at the HD 5450 today, and our upcoming look at the HD 5570, I can’t help but feel like I’ve stepped into some sort of low-end dimension. There’s nothing wrong with budget cards, per se, but one thing’s for certain… I can’t wait for Fermi to get here so I can get back to high-end resolutions!

AMD set out to defeat NVIDIA’s GeForce 210 with its HD 5450, and there’s no question at all that it has the superior card. The HD 5450 was more power efficient, ran cooler (even with its passive heatsink), delivered upwards of a 50% performance boost in most titles, and supports things we’ve come to like about the HD 5000 series, such as Eyefinity and DirectX 11. That’s right… if you wanted, you could power up to three separate displays with just one $50 HD 5450 (adapters would be required).

From a performance standpoint, the HD 5450 is easily the better card when compared to the GeForce 210, as all of our tests have confirmed. This is something we expected, since NVIDIA’s card is based on aging technology. The tables may very well turn once Fermi-based budget cards arrive, but those are a long way off. For now, AMD will dominate the market for $50 graphics cards.

Do note, though, that while NVIDIA’s 210 retails for around $45, mail-in rebates abound, bringing the price down by $20 in some cases. If gaming performance isn’t at all important to you, nor are temperatures or the HD 5000 series’ other perks, that deal may be more appealing. You won’t find as many passive designs on the 210, however.

ATI Radeon HD 5450 512MB

One thing we didn’t touch on was HD playback, but that’s something AMD touts as excelling across its entire range of HD 5000 cards, and especially on the HD 5450. The card features all of the goodies you’ve come to expect, such as support for dual-stream decode, Dolby TrueHD and the rest. Where HD content is concerned, the HD 5450 simply isn’t lacking. It’s a very feature-rich card in a very small package.

The three models we took a look at today aren’t all too different from one another, but I’m so pleased with the overall model that I’m awarding them each an Editor’s Choice award. Each version has its perks, with Gigabyte’s being my favorite simply because it has the smallest cooler and is still incredibly quiet. Sapphire’s card is also really good, and shaves a few extra degrees off the temperature thanks to its slightly larger cooler. The best for cooling would be AMD’s reference, but I have a gut feeling no vendor is going to adopt it. I’m hoping I’m wrong.

Is the HD 5450 the best HTPC card ever produced? For $50 and all it offers, you be the judge. It’s certainly not a gaming card by any stretch, but it does allow for light gaming, and it comes in an extremely small package. Please note that while not pictured, both Gigabyte’s and Sapphire’s cards include low-profile replacement brackets for those who don’t need VGA and are using a very, very small chassis.


ATI (AMD, Gigabyte & Sapphire) Radeon HD 5450

Discuss this article in our forums!

Have a comment you wish to make on this article? Recommendations? Criticism? Feel free to head over to our related thread and put your words to our virtual paper! There is no requirement to register in order to respond to these threads, but it sure doesn’t hurt!
