
ASUS GeForce 210, GT 220 & GT 240

Date: January 25, 2010
Author(s): Rob Williams

This past fall, NVIDIA filled out the remainder of its GT200 series of graphics cards with three models. For basic computing, there’s the $40 GeForce 210, while for those looking to get a bit of light gaming done, there’s the $60 GT 220. And to round things off, there’s the $90 GT 240, which handles all of today’s games rather well at 1080p.



Introduction

Since November, we’ve taken a look at almost all of the current “budget” graphics cards available on the market, including AMD’s $130 Radeon HD 5750, and more recently, the company’s $100 Radeon HD 5670. Earlier this month, we went even lower-end with the help of Intel’s Clarkdale, which features its “HD Graphics” integrated graphics chip. There’s still a lot of track left for our budget train, so to help catch us up, we’re taking a look at the three lowest-end cards NVIDIA has to offer, all at once.

Of all the models available from NVIDIA’s current-gen (GT200) line-up, only three are available for under $100. These include the ~$40 210, which is so low-end, it doesn’t even deserve a GT, GS, or any other moniker. Moving up brings us to the GT 220, at around $60, and finally to the GT 240, which retails for around $90. All three of these cards can be had for even less if you take advantage of mail-in rebates.

Based on the pricing alone, you can pretty much jump to conclusions about their performance, and also about which crowd they’re suited for. One obvious answer is the HTPC crowd, but these cards are also suitable for those who want better graphics than an IGP can offer, but don’t want to spend over $100 to get them. Currently, there are no models built on AMD’s current-gen architecture priced to compete with these NVIDIA cards, but they’re en route, and soon (by soon, I mean really soon).

Before we go too far into the article, it’s important to note that none of these three models from NVIDIA are what most people would consider “new”, despite the fact that all three were released just this past fall. Over the past year or so, NVIDIA has become accustomed to re-releasing GPUs built on its previous-gen architecture, and sometimes it goes back even further than just one generation. The 210 we’re looking at here? It’s based on G86/G98, also known as the 8400 GS. That card was released during the summer of 2007, to put things into perspective.

The architecture is for the most part the same, but the performance and capabilities have been improved. The original 8400 GS was built on an 80nm process, for example, while the 210 is built on NVIDIA’s most recent 40nm process. Because of that improvement, the 210 is faster (thanks to boosted clocks) and also more power-efficient, namely because of the die shrink. For interest’s sake, the 8400 GS packs 210 million transistors into a 127mm^2 die, whereas the 210 packs 260 million transistors into a much smaller die, at 57mm^2.
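Those figures make the benefit of the die shrink easy to quantify. Here’s a quick back-of-the-envelope sketch, using only the numbers quoted above (purely illustrative arithmetic, not official NVIDIA data):

```python
# Transistor density of the 80nm 8400 GS (G86) vs. the 40nm GeForce 210,
# using the figures quoted in the text above.
g86_transistors, g86_area_mm2 = 210e6, 127.0     # 8400 GS
gf210_transistors, gf210_area_mm2 = 260e6, 57.0  # GeForce 210

g86_density = g86_transistors / g86_area_mm2        # transistors per mm^2
gf210_density = gf210_transistors / gf210_area_mm2

print(f"8400 GS:     {g86_density / 1e6:.2f}M transistors/mm^2")
print(f"GeForce 210: {gf210_density / 1e6:.2f}M transistors/mm^2")
print(f"Density improvement: {gf210_density / g86_density:.1f}x")
```

In other words, the newer die packs close to three times the transistors per square millimeter, which is where much of the power-efficiency gain comes from.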

ASUS GeForce 210, GT 220 and GT 240

When each of the above cards was released, NVIDIA didn’t make it a point to sample them to many sites, for probably obvious reasons. The cards are all built on older architectures, and all are extremely low-end, so it’s hard to get too excited. But I still wanted to get some in to take a look at, to help us and our readers get a better understanding of all the options out there. Huge thanks to ASUS for supplying us with all three models; it was a nice relief to not have to hunt down one card each from three different companies.

If you didn’t believe me up to this point that these three cards are all low-end, the photograph above surely helps me prove it. They’re simple, in both PCB design and cooling solution. The 210 and GT 220 in particular stand apart, as they’re both low-profile, not at all lengthy (~6.60″) and perfect for HTPC use. The GT 240 is a bit beefier, but not by much; it’s still much smaller than NVIDIA’s next step up, the GTS 250.

| Model | Core MHz | Shader MHz | Mem MHz | Memory | Bus Width | Processors |
|---|---|---|---|---|---|---|
| GeForce GTX 295 | 576 | 1242 | 1000 | 1792MB | 448-bit | 480 |
| GeForce GTX 285 | 648 | 1476 | 1242 | 1GB | 512-bit | 240 |
| GeForce GTX 275 | 633 | 1404 | 1134 | 896MB | 448-bit | 240 |
| GeForce GTX 260 | 576 | 1242 | 999 | 896MB | 448-bit | 216 |
| GeForce GTS 250 | 738 | 1836 | 1100 | 1GB | 256-bit | 128 |
| GeForce GT 240 | 550 | 1340 | 1700 | 512MB – 1GB | 128-bit | 96 |
| GeForce GT 220 | 625 | 1360 | 790 | 1GB | 128-bit | 48 |
| GeForce 210 | 589 | 1402 | 790 | 512MB | 64-bit | 16 |

After benchmarking all three of these cards, I can say right now that the 210 is going to appeal to few people. It’s a step up from integrated graphics, but not by much. The GT 220 is a huge improvement; it might still be a $60 card, but moving up to it from a 210 is like stepping out of a Pinto and into a BMW. The GT 240 is the highest-performing of the bunch, as would be expected. It can’t quite match the performance of the GTS 250, but it’s still a solid offering for under $100.

With these three models, NVIDIA’s entire line-up has been evened out, and it’s unlikely that we’ll see any more models added to the GT200 series in the future. Fermi is right around the corner (~March), so hopefully our next NVIDIA graphics card review will be of a model based around that architecture.

It’s also important to note that each of the cards we’re taking a look at here has options for DDR2, GDDR3 or GDDR5 memory. Our GT 240 uses the latter, but the 210 and GT 220 models ASUS shipped stick with DDR2. So, please bear in mind that if you score one of these cards with GDDR3, you’ll likely see slightly better performance than what’s seen in our results pages. For what it’s worth, I reckon DDR2 will be the most popular configuration for both of those GPUs, as it keeps prices at their lowest.

ASUS GeForce GT 220 1GB

The card above, the GT 220, is just a quarter of an inch longer than the 210, and ASUS styled the two to look almost identical. Both feature a very simple aluminum cooler with a mini fan that, surprisingly, doesn’t emit an audible whine during use. That’s no doubt because the temperatures never get high enough for the fan to need to spin faster.

ASUS GeForce GT 240 512MB

Performance-wise, the GT 240 is far beyond the 210, so it’s no surprise to see a much larger PCB, and also a more robust cooler. It might be just a step below the GTS 250, but a huge perk is that no PCI-E power connector is required. Just pop it in, and you’re good to go. You have got to love simplicity like that!

On the next page, we’ll tackle our testing methodology. I highly recommend you take a quick look through, because we’re dealing with such low-end cards here that we’re deviating from our regular game configurations just a bit. After that, we’ll dive right into our results!

Test System & Methodology

At Techgage, we strive to make sure our results are as accurate as possible. Our testing is rigorous and time-consuming, but we feel the effort is worth it. In an attempt to leave no question unanswered, this page contains not only our testbed specifications, but also a fully-detailed look at how we conduct our testing. For an exhaustive look at our methodologies, even down to the Windows Vista installation, please refer to this article.

Test Machine

The below table lists our testing machine’s hardware, which remains unchanged throughout all GPU testing, minus the graphics card. Each card used for comparison is also listed here, along with the driver version used. Each one of the URLs in this table can be clicked to view the respective review of that product, or if a review doesn’t exist, it will bring you to the product on the manufacturer’s website.

| Component | Model |
|---|---|
| Processor | Intel Core i7-975 Extreme Edition – Quad-Core, 3.33GHz, 1.33v |
| Motherboard | Gigabyte GA-EX58-EXTREME – X58-based, F7 BIOS (05/11/09) |
| Memory | Corsair DOMINATOR – DDR3-1333 7-7-7-24-1T, 1.60 |
| ATI Graphics | Radeon HD 5870 1GB (Sapphire) – Catalyst 9.10 |
| | Radeon HD 5850 1GB (ASUS) – Catalyst 9.10 |
| | Radeon HD 5770 1GB (Reference) – Beta Catalyst (10/06/09) |
| | Radeon HD 5750 1GB (Sapphire) – Catalyst 9.11 |
| | Radeon HD 5670 512MB (Reference) – Beta Catalyst (12/16/09) |
| NVIDIA Graphics | GeForce GTX 295 1792MB (Reference) – GeForce 186.18 |
| | GeForce GTX 285 1GB (EVGA) – GeForce 186.18 |
| | GeForce GTX 275 896MB (Reference) – GeForce 186.18 |
| | GeForce GTX 260 896MB (XFX) – GeForce 186.18 |
| | GeForce GTS 250 1GB (EVGA) – GeForce 186.18 |
| | GeForce GT 240 512MB (ASUS) – GeForce 196.21 |
| | GeForce GT 220 1GB (ASUS) – GeForce 196.21 |
| | GeForce 210 512MB (ASUS) – GeForce 196.21 |
| Audio | On-Board Audio |
| Storage | Seagate Barracuda 500GB 7200.11 |
| Power Supply | Corsair HX1000W |
| Chassis | SilverStone TJ10 Full-Tower |
| Display | Gateway XHD3000 30″ |
| Cooling | Thermalright TRUE Black 120 |
| Et cetera | Windows Vista Ultimate 64-bit |

When preparing our testbeds for any type of performance testing, we follow these guidelines:

To aid in keeping results accurate and repeatable, we prevent certain services in Windows Vista from starting up at boot. These services have a tendency to start up in the background without notice, potentially skewing results. Disabling “Windows Search”, for example, turns off the OS’ indexing, which can at times utilize the hard drive and memory more than we’d like.

For more robust information on how we tweak Windows, please refer once again to this article.

Game Titles

At this time, we benchmark all of our games using three popular resolutions: 1680×1050, 1920×1080 and 2560×1600. 1680×1050 was chosen as it’s one of the most popular resolutions for gamers sporting ~20″ displays. 1920×1080 might stand out, since we’ve always used 1920×1200 in the past, but we didn’t make this change without some serious thought. After taking a look at the current landscape for desktop monitors around ~24″, we noticed that 1920×1200 is definitely on the way out, as more and more models are coming out as native 1080p, and that’s why we made the switch. Finally, for high-end gamers, we also benchmark at 2560×1600, a resolution with just about 2x the pixels of 1080p.
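For the curious, the pixel math behind those resolution choices is simple. Here’s a small illustrative sketch using the three resolutions named above:

```python
# Pixel counts for our three benchmarking resolutions, and how
# 2560x1600 compares to 1080p. Simple arithmetic, for illustration.
resolutions = {
    "1680x1050": (1680, 1050),
    "1920x1080": (1920, 1080),
    "2560x1600": (2560, 1600),
}
pixels = {name: w * h for name, (w, h) in resolutions.items()}

for name, count in pixels.items():
    print(f"{name}: {count:,} pixels")

ratio = pixels["2560x1600"] / pixels["1920x1080"]
print(f"2560x1600 has {ratio:.2f}x the pixels of 1080p")
```

So 2560×1600 pushes roughly double the pixels of 1080p, which is why it remains the domain of high-end cards.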

For graphics cards that include less than 1GB of GDDR, we omit Grand Theft Auto IV from our testing, as our chosen detail settings require at least 800MB of available graphics memory. Also, if the card we’re benchmarking doesn’t offer the performance to handle 2560×1600 across most of our titles reliably, only 1680×1050 and 1920×1080 will be utilized.

Because we value results generated by real-world testing, we don’t utilize timedemos whatsoever. The one exception is Futuremark’s 3DMark Vantage; though it’s not a game, it essentially acts as a robust timedemo. We use it because it’s a standard where GPU reviews are concerned, and we don’t want to deprive our readers of results they expect to see.

All of our results are captured with the help of Beepa’s FRAPS 2.98, while stress-testing and temperature-monitoring are handled by OCCT 3.1.0 and GPU-Z, respectively.
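For readers unfamiliar with how a frame capture turns into the minimum and average FPS figures in our tables, here’s a rough sketch of the idea. The frame times below are made-up example data, and FRAPS itself reports minimums over one-second intervals rather than per frame, so treat this as an approximation of the concept, not of the tool:

```python
# Deriving average and minimum FPS from a list of per-frame render
# times (in milliseconds). The data below is invented for illustration.
frame_times_ms = [16.7, 17.1, 33.5, 16.9, 18.0, 25.2, 16.6]

# Average FPS: total frames rendered divided by total elapsed seconds.
total_seconds = sum(frame_times_ms) / 1000.0
avg_fps = len(frame_times_ms) / total_seconds

# A simple "minimum FPS": the single slowest frame, expressed as a rate.
min_fps = 1000.0 / max(frame_times_ms)

print(f"Average FPS: {avg_fps:.1f}")
print(f"Minimum FPS: {min_fps:.1f}")
```

This is also why we report both numbers: a healthy average can hide the occasional slow frame that you actually feel while playing.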

Special Note: Because we’re taking a look at very low-end cards here, we’ll be running them through special settings, not our usual set of three. The GT 240 is capable enough to be run through our regular 1680×1050 and 1080p settings, but the 210 and GT 220 will instead be run through five of our six titles at 1024×768 and 1280×1024. Those two models will not be run through GRID, because the game simply isn’t enjoyable on them.

Call of Duty: Modern Warfare 2

Call of Juarez: Bound in Blood

Crysis Warhead

F.E.A.R. 2: Project Origin

Race Driver: GRID

World in Conflict: Soviet Assault

Call of Duty: Modern Warfare 2

When the original Call of Duty game launched in 2003, Infinity Ward was an unknown. Naturally… it was the company’s first title. But since then, the series and company alike have become household names. Not only has the series delivered consistently incredible gameplay, it’s pushed the graphics envelope with each successive release, and where Modern Warfare is concerned, it’s also had a rich storyline.

The first two titles might have been built on the already-outdated Quake III engine, but since then, the games have been built with improved graphical features, capable of pushing the highest-end PCs out there. Modern Warfare 2 is the exception, as it’s more of a console port than a true PC title. As a result, the game doesn’t push PC hardware as much as we’d like to see, but it still looks great, and lacks little in the graphics department. You can read our review of the game here.

Manual Run-through: The level chosen is the 10th mission in the game, “The Gulag”. Our teams fly in helicopters up to an old prison with the intention of getting closer to finding the game’s villain, Vladimir Makarov. Our saved game begins at the point when the level name appears on the screen, right before we reach the prison, and ends about a minute after we land, following the normal progression of the level. The entire run takes around two-and-a-half minutes.

To kick things off, the 210 proves that it is indeed faster than an IGP, at least Intel’s, but the GT 220 is the dominant model. It’s far faster than the 210, as the graph above proves.

Compared to AMD’s Radeon HD 5670, NVIDIA falls just a wee bit behind. These two cards can be directly compared performance-wise, as both retail for around the same price and target the same crowd. For what it’s worth, though, the HD 5670 isn’t being sold for under $100 by any e-tailer that I can see at the current time, while NVIDIA’s card can be had for $85, or even less with MIRs. But AMD does have the advantages of improved power efficiency, Eyefinity and DirectX 11… plus it’s seemingly faster, so the two cards might come out even, depending on how heavily you weigh the various factors.

| Graphics Card | Best Playable | Min FPS | Avg. FPS |
|---|---|---|---|
| ATI HD 5770 1GB CrossFireX | 2560×1600 – Max Detail, 4xAA | 40 | 81.311 |
| ATI HD 5870 1GB (Sapphire) | 2560×1600 – Max Detail, 4xAA | 46 | 79.838 |
| ATI HD 5850 1GB (ASUS) | 2560×1600 – Max Detail, 4xAA | 37 | 68.563 |
| NVIDIA GTX 285 1GB (EVGA) | 2560×1600 – Max Detail, 4xAA | 41 | 66.527 |
| NVIDIA GTX 275 896MB (Reference) | 2560×1600 – Max Detail, 4xAA | 37 | 61.937 |
| NVIDIA GTX 260 896MB (XFX) | 2560×1600 – Max Detail, 4xAA | 33 | 53.314 |
| ATI HD 5770 1GB (Reference) | 2560×1600 – Max Detail, 0xAA | 36 | 60.337 |
| NVIDIA GTS 250 1GB (EVGA) | 2560×1600 – Max Detail, 0xAA | 30 | 53.253 |
| ATI HD 5750 1GB (Sapphire) | 2560×1600 – Max Detail, 0xAA | 28 | 50.727 |
| ATI HD 5670 512MB (Reference) | 1920×1080 – Max Detail, 4xAA | 24 | 43.96 |
| NVIDIA GT 240 512MB (ASUS) | 1920×1080 – Max Detail, 0xAA | 30 | 53.139 |
| NVIDIA GT 220 1GB (ASUS) | 1280×1024 – Low Detail, 0xAA | 29 | 53.593 |
| NVIDIA 210 512MB (ASUS) | 1280×1024 – Low Detail, 0xAA | 18 | 29.885 |
| Intel HD Graphics (Clarkdale) | 1280×1024 – Low Detail, 0xAA | 14 | 25.955 |

The GT 240 is quite the heavyweight compared to the others we’re looking at. It handled the game just fine at 1080p with anti-aliasing removed, delivering a full 53 FPS. The others, including Intel’s IGP, managed to handle the game decently with low details at 1280×1024. Of those, the GT 220 clearly offers the best experience, but the others will suffice if you’re not set on silky-smooth frame-rates.
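Since the 210 and GT 220 ran at identical settings here (1280×1024, Low detail, 0xAA), this is a fair spot for a quick value check. The sketch below simply divides the average FPS from the table above by the street prices quoted earlier in the article; actual prices will vary with rebates, so take it as a rough illustration:

```python
# Rough FPS-per-dollar comparison at identical settings, using the
# average FPS from the table above and the article's street prices.
cards = {
    "GeForce 210":    {"price_usd": 40, "avg_fps": 29.885},
    "GeForce GT 220": {"price_usd": 60, "avg_fps": 53.593},
}

value = {name: d["avg_fps"] / d["price_usd"] for name, d in cards.items()}
for name, fps_per_dollar in value.items():
    print(f"{name}: {fps_per_dollar:.2f} FPS per dollar")
```

Even ignoring the playability gap, the GT 220 comes out ahead on raw value here.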

Call of Juarez: Bound in Blood

When the original Call of Juarez was released, it brought forth something unique… a western-styled first-person shooter. That’s simply not something we see too often, so for fans of the genre, its release was a real treat. Although it didn’t offer the best gameplay we’ve seen from a recent FPS title, its storyline and unique style made it well worth testing.

After we retired the original title from our suite, we anxiously awaited the sequel, Bound in Blood, in hopes that the series could be re-introduced into our testing. Thankfully, it could, thanks in part to its fantastic graphics, built around the Chrome Engine 4, and gameplay improved over the original’s. It was also well-received by game reviewers, which is always a good sign.

Manual Run-through: The level chosen here is Chapter I, and our starting point is about 15 minutes into the mission, where we stand atop a hill that overlooks a large river. We make our way across the hill and ultimately through a large trench, and we stop our benchmarking run shortly after we blow up a gas-filled barrel.

Once again, the GT 220 shows that it’s about 2x as fast as the 210, which is quite the feat given that it costs just 50% more. Intel’s IGP, as mentioned in our launch article, can’t render Bound in Blood properly, so it offers no competition here.

The FPS difference between the HD 5670 and GT 240 in Modern Warfare 2 was obvious, but the GT 240 falls even further behind here. This isn’t much of a surprise, however, as Bound in Blood has always seemed to be tuned more towards ATI cards.

| Graphics Card | Best Playable | Min FPS | Avg. FPS |
|---|---|---|---|
| ATI HD 5870 1GB (Reference) | 2560×1600 – Max Detail | 58 | 81.945 |
| NVIDIA GTX 295 1792MB (Reference) | 2560×1600 – Max Detail | 37 | 80.339 |
| ATI HD 5850 1GB (ASUS) | 2560×1600 – Max Detail | 51 | 69.165 |
| NVIDIA GTX 285 1GB (EVGA) | 2560×1600 – Max Detail | 45 | 54.428 |
| NVIDIA GTX 275 896MB (Reference) | 2560×1600 – Max Detail | 41 | 51.393 |
| ATI HD 5770 1GB (Reference) | 2560×1600 – Max Detail | 28 | 45.028 |
| NVIDIA GTX 260 896MB (XFX) | 2560×1600 – Max Detail | 35 | 44.023 |
| ATI HD 5750 1GB (Sapphire) | 2560×1600 – Max Detail | 27 | 38.686 |
| NVIDIA GTS 250 1GB (EVGA) | 2560×1600 – Max Detail | 25 | 33.751 |
| ATI HD 5670 512MB (Reference) | 1920×1080 – Max Detail | 38 | 47.23 |
| NVIDIA GT 240 512MB (ASUS) | 1920×1080 – Max Detail, 0xAA | 29 | 39.446 |
| NVIDIA GT 220 1GB (ASUS) | 1280×1024 – Medium Detail, 0xAA | 29 | 41.722 |
| NVIDIA 210 512MB (ASUS) | 1280×1024 – Low Detail, 0xAA | 18 | 30.825 |

Both the GT 220 and GT 240 managed to keep their respective top settings, while the 210 had to see the game’s detail lowered before we could call it playable. I should note that with low detail settings, this game is really ugly, so the GT 220 is the preferred minimum card here.

Crysis Warhead

Like Call of Duty, Crysis is another series that doesn’t need much of an introduction. Thanks to the fact that almost any comments section for a PC performance-related article asks, “Can it run Crysis?”, even those who don’t play computer games no doubt know what Crysis is. When Crytek first released Far Cry, it delivered an incredible game engine with huge capabilities, and Crysis simply took things to the next level.

Although the sequel, Warhead, has been available for just over a year, it still manages to push the highest-end systems to their breaking-point. It wasn’t until this past January that we finally found a graphics solution to handle the game at 2560×1600 at its Enthusiast level, but even that was without AA! Something tells me Crysis will remain the de facto standard for GPU benchmarking for a while yet.

Manual Run-through: Whenever we have a new game in-hand for benchmarking, we make every attempt to explore each level of the game to find out which is the most brutal towards our hardware. Ironically, after spending hours exploring this game’s levels, we found the first level in the game, “Ambush”, to be the hardest on the GPU, so we stuck with it for our testing. Our run starts from the beginning of the level and stops shortly after we reach the first bridge.

I’m sure it will come as no surprise to anyone, but Crysis on a low-end graphics card is… brutal. Is that enough emphasis? On the GT 240, it was bearable, but on the others… well, during our run, people get blown up. I was wishing I was one of those people.

For the third time in a row, the GT 240 falls just behind the HD 5670. To be fair, AMD’s card is newer (released less than two weeks ago), but the GT 240 is NVIDIA’s only competition right now, and it’s likely to remain that way for quite some time.

| Graphics Card | Best Playable | Min FPS | Avg. FPS |
|---|---|---|---|
| NVIDIA GTX 295 1792MB (Reference) | 2560×1600 – Gamer, 0xAA | 19 | 40.381 |
| ATI HD 5870 1GB (Reference) | 2560×1600 – Gamer, 0xAA | 20 | 32.955 |
| ATI HD 5850 1GB (ASUS) | 2560×1600 – Mainstream, 0xAA | 28 | 52.105 |
| NVIDIA GTX 285 1GB (EVGA) | 2560×1600 – Mainstream, 0xAA | 27 | 50.073 |
| NVIDIA GTX 275 896MB (Reference) | 2560×1600 – Mainstream, 0xAA | 24 | 47.758 |
| NVIDIA GTX 260 896MB (XFX) | 2560×1600 – Mainstream, 0xAA | 21 | 40.501 |
| ATI HD 5770 1GB (Reference) | 2560×1600 – Mainstream, 0xAA | 20 | 35.256 |
| NVIDIA GTS 250 1GB (EVGA) | 2560×1600 – Mainstream, 0xAA | 18 | 34.475 |
| ATI HD 5750 1GB (Sapphire) | 1920×1080 – Mainstream, 0xAA | 21 | 47.545 |
| ATI HD 5670 512MB (Reference) | 1920×1080 – Mainstream, 0xAA | 20 | 35.103 |
| NVIDIA GT 240 512MB (ASUS) | 1920×1080 – Mainstream, 0xAA | 19 | 33.623 |
| NVIDIA GT 220 1GB (ASUS) | 1280×1024 – Minimum Detail, 0xAA | 26 | 44.286 |
| NVIDIA 210 512MB (ASUS) | 1280×1024 – Minimum Detail, 0xAA | 15 | 29.501 |
| Intel HD Graphics (Clarkdale) | 1280×1024 – Minimum Detail, 0xAA | 10 | 24.289 |

Surprisingly, even though performance on NVIDIA’s two lowest-end cards was horrible at Mainstream detail, dropping the settings to Minimum made a world of difference. The game sure isn’t much to look at with such low detail, but it ran quite well, and was very playable.

F.E.A.R. 2: Project Origin

Five out of the seven current games we use for testing are either sequels, or titles in an established series. F.E.A.R. 2 is one of the former, following up on the very popular First Encounter Assault Recon, released in fall of 2005. This horror-based first-person shooter brought to the table fantastic graphics, ultra-smooth gameplay, the ability to blow massive chunks out of anything, and also a very fun multi-player mode.

Three-and-a-half years later, we saw the introduction of the game’s sequel, Project Origin. As we had hoped, this title improved on the original where gameplay and graphics were concerned, and it was a no-brainer to want to begin including it in our testing. The game is gorgeous, and there’s much destruction to be had (who doesn’t love blowing expensive vases to pieces?). The game is also rather heavily scripted, which aids in producing repeatable results in our benchmarking.

Manual Run-through: The level used for our testing here is the first in the game, about ten minutes in. The scene begins with a travel up an elevator, with a robust city landscape behind us. Our run-through begins with a quick look at this cityscape, and then we proceed through the level until the point when we reach the far door as seen in the above screenshot.

At 1024×768, F.E.A.R. 2 was fairly playable on all of our above configurations, with 1280×1024 running well on the GT 220. The performance overall sure isn’t anything worth bragging about, but for such a great-looking game, the performance we saw wasn’t too bad.

The trend continues, with our GT 240 falling well behind the HD 5670. F.E.A.R. 2, like Call of Juarez, seems to favor AMD, but even accounting for that, the difference here is rather stark.

| Graphics Card | Best Playable | Min FPS | Avg. FPS |
|---|---|---|---|
| NVIDIA GTX 295 1792MB (Reference) | 2560×1600 – Max Detail, 4xAA, 16xAF | 45 | 95.767 |
| ATI HD 5870 1GB (Reference) | 2560×1600 – Max Detail, 4xAA, 16xAF | 65 | 91.34 |
| ATI HD 5850 1GB (ASUS) | 2560×1600 – Max Detail, 4xAA, 16xAF | 51 | 73.647 |
| NVIDIA GTX 285 1GB (EVGA) | 2560×1600 – Max Detail, 4xAA, 16xAF | 39 | 62.014 |
| NVIDIA GTX 275 896MB (Reference) | 2560×1600 – Max Detail, 4xAA, 16xAF | 37 | 57.266 |
| NVIDIA GTX 260 896MB (XFX) | 2560×1600 – Max Detail, 4xAA, 16xAF | 29 | 48.110 |
| ATI HD 5770 1GB (Reference) | 2560×1600 – Max Detail, 4xAA, 16xAF | 31 | 47.411 |
| ATI HD 5750 1GB (Sapphire) | 2560×1600 – Max Detail, 0xAA, 16xAF | 27 | 39.563 |
| NVIDIA GTS 250 1GB (EVGA) | 2560×1600 – Max Detail, 4xAA, 16xAF | 24 | 36.331 |
| ATI HD 5670 512MB (Reference) | 1920×1080 – Max Detail, 4xAA, 16xAF | 31 | 46.87 |
| NVIDIA GT 240 512MB (ASUS) | 1920×1080 – Max Detail, 0xAA | 30 | 45.039 |
| NVIDIA GT 220 1GB (ASUS) | 1280×1024 – Medium Detail, 0xAA | 22 | 29.869 |
| NVIDIA 210 512MB (ASUS) | 1280×1024 – Low Detail, 0xAA | 17 | 28.569 |
| Intel HD Graphics (Clarkdale) | 1280×1024 – Low Detail, 0xAA | 20 | 34.388 |

Like some of the previous titles, the GT 240 performed well enough to hold onto the same settings we used for our testing above, while the rest of the cards had their settings notched down before we could consider the game playable.

Race Driver: GRID

If you primarily play games on a console, your choices for quality racing games are plentiful. On the PC, that’s not so much the case. While there are a good number, there aren’t many of each given type, from sim to arcade. So when Race Driver: GRID first saw its release, many gamers were excited, and for good reason. It’s not a sim in the truest sense of the word, but it’s certainly not arcade, either; it’s somewhere in between.

The game is great fun, and similar to console titles like Project Gotham Racing, you need a lot of skill to succeed at its default difficulty level. Like most great racing games, GRID also looks absolutely stellar, and each of its locations looks very similar to its real-world counterpart. All in all, no racing fan should ignore this one.

Manual Run-through: For our testing here, we chose the city where both Snoop Dogg and Sublime hit their fame: the LBC, also known as Long Beach City. We picked this track because it’s not overly difficult, and also because it’s simply nice to look at. Our run consists of an entire 2-lap race, with the cars behind us for almost the entire race.

The performance difference between the HD 5670 and GT 240 isn’t as stark here as it has been in a couple of the tests, but it’s still rather evident. Given we’re dealing with a low number of frames to begin with, a difference of just 5 FPS is somewhat noticeable in the real world.

| Graphics Card | Best Playable | Min FPS | Avg. FPS |
|---|---|---|---|
| ATI HD 5870 1GB (Reference) | 2560×1600 – Max Detail, 4xAA | 87 | 106.43 |
| NVIDIA GTX 295 1792MB (Reference) | 2560×1600 – Max Detail, 4xAA | 84 | 103.958 |
| ATI HD 5850 1GB (ASUS) | 2560×1600 – Max Detail, 4xAA | 68 | 84.732 |
| NVIDIA GTX 285 1GB (EVGA) | 2560×1600 – Max Detail, 4xAA | 54 | 66.042 |
| NVIDIA GTX 275 896MB (Reference) | 2560×1600 – Max Detail, 4xAA | 52 | 63.617 |
| ATI HD 5770 1GB (Reference) | 2560×1600 – Max Detail, 4xAA | 45 | 56.980 |
| NVIDIA GTX 260 896MB (XFX) | 2560×1600 – Max Detail, 4xAA | 45 | 54.809 |
| ATI HD 5750 1GB (Sapphire) | 2560×1600 – Max Detail, 4xAA | 39 | 47.05 |
| NVIDIA GTS 250 1GB (EVGA) | 2560×1600 – Max Detail, 4xAA | 35 | 43.663 |
| ATI HD 5670 512MB (Reference) | 1920×1080 – Max Detail, 4xAA | 36 | 47.36 |
| NVIDIA GT 240 512MB (ASUS) | 1920×1080 – Max Detail, 0xAA | 33 | 51.071 |

The difference between the performance of the HD 5670 and GT 240 in this title isn’t huge, but I felt the GT 240 played much better without anti-aliasing, so that became its best playable.

World in Conflict: Soviet Assault

I admit that I’m not a huge fan of RTS titles, but World in Conflict intrigued me from the get-go. After all, so many war-based games continue to follow the same storylines we already know, and WiC was different. It sidesteps the real political and economic collapse of the Soviet Union in the late 80’s, and instead provides a storyline in which the USSR, in order to remain in power, goes to war.

Many RTS games, with their advanced AI, tend to favor the CPU in order to deliver smooth gameplay, but WiC favors both the CPU and GPU, and the graphics prove it. Throughout the game’s missions, you’ll see gorgeous vistas and explore areas from deserts and snow-packed lands, to fields and cities. Overall, it’s a real visual treat for the eyes – especially since you’re able to zoom to the ground and see the action up-close.

Manual Run-through: The level we use for testing is the 7th campaign of the game, called Insurgents. Our saved game plants us towards the beginning of the mission with two squads of five, and two snipers. The run consists of bringing our men to action, and hovering the camera around throughout the duration. The entire run lasts between three and four minutes.

The GT 220 continues to flaunt its superiority, while the 210 and Intel’s IGP duke it out nicely, with NVIDIA’s card consistently coming out on top.

The results above are the closest the GT 240 has come to the HD 5670, so much so that a gamer would never notice the difference between the two.

| Graphics Card | Best Playable | Min FPS | Avg. FPS |
|---|---|---|---|
| NVIDIA GTX 295 1792MB (Reference) | 2560×1600 – Max Detail, 8xAA, 16xAF | 40 | 55.819 |
| ATI HD 5870 1GB (Reference) | 2560×1600 – Max Detail, 4xAA, 16xAF | 35 | 47.195 |
| ATI HD 5850 1GB (ASUS) | 2560×1600 – Max Detail, 4xAA, 16xAF | 29 | 40.581 |
| NVIDIA GTX 285 1GB (EVGA) | 2560×1600 – Max Detail, 0xAA, 16xAF | 34 | 49.514 |
| NVIDIA GTX 275 896MB (Reference) | 2560×1600 – Max Detail, 0xAA, 16xAF | 36 | 46.186 |
| NVIDIA GTX 260 896MB (XFX) | 2560×1600 – Max Detail, 0xAA, 16xAF | 23 | 39.365 |
| ATI HD 5770 1GB (Reference) | 2560×1600 – Max Detail, 0xAA, 16xAF | 28 | 37.389 |
| NVIDIA GTS 250 1GB (EVGA) | 2560×1600 – Max Detail, 0xAA, 4xAF | 24 | 32.453 |
| ATI HD 5750 1GB (Sapphire) | 2560×1600 – Max Detail, 0xAA, 4xAF | 23 | 31.769 |
| NVIDIA GT 240 512MB (ASUS) | 1920×1080 – Max Detail, 0xAA | 22 | 33.788 |
| ATI HD 5670 512MB (Reference) | 1920×1080 – Max Detail, 0xAA, 16xAF | 21 | 31.872 |
| NVIDIA GT 220 1GB (ASUS) | 1280×1024 – Medium Detail, 0xAA | 41 | 52.089 |
| NVIDIA 210 512MB (ASUS) | 1280×1024 – Medium Detail, 0xAA | 19 | 23.620 |
| Intel HD Graphics (Clarkdale) | 1280×1024 – Low Detail, 0xAA | 30 | 39.449 |

This game might look simple on the surface, but it’s hardcore on graphics. The 210 and GT 220 handled the game fine at 1280×1024, although the 210 was borderline. No one would disagree that 23 FPS is low, but in this particular title, I still deemed it playable enough to warrant retaining the settings.

Futuremark 3DMark Vantage

Although we generally shun automated gaming benchmarks, we do like to run at least one to see how our GPUs scale in a ‘timedemo’-type scenario. Futuremark’s 3DMark Vantage is without question the best such test on the market, and it’s a joy to use and watch. The folks at Futuremark are experts in what they do, and they really know how to push that hardware of yours to its limit.

The company first started out as MadOnion and released a GPU-benchmarking tool called XLR8R, which was soon replaced with 3DMark 99. Since that time, we’ve seen seven different versions of the software, including two major updates (3DMark 99 Max, 3DMark 2001 SE). With each new release, the graphics get better, the capabilities get better and the sudden hit of ambition to get down and dirty with overclocking comes at you fast.

Similar to a real game, 3DMark Vantage offers many configuration options, although many (including us) prefer to stick to the profiles which include Performance, High and Extreme. Depending on which one you choose, the graphic options are tweaked accordingly, as well as the resolution. As you’d expect, the better the profile, the more intensive the test.

Performance is the stock mode that most use when benchmarking, but it only uses a resolution of 1280×1024, which isn’t representative of today’s gamers. Extreme is more appropriate, as it runs at 1920×1200 and does well to push any single or multi-GPU configuration currently on the market – and will do so for some time to come.

3DMark Vantage is a highly scalable benchmark, taking full advantage of all available shaders and GPU cores, along with copious amounts of memory. Given that, our results above scale fairly accurately with each card’s real-world performance.

Power & Temperatures

To test our graphics cards for both temperatures and power consumption, we utilize OCCT for stress-testing, GPU-Z for temperature monitoring, and a Kill-a-Watt for power monitoring. The Kill-a-Watt is plugged into its own socket, with only the PC connected to it.

As per our guidelines when benchmarking with Windows, once the room temperature is stable (and reasonable), the test machine is booted up and left to sit at the Windows desktop until things are completely idle. Once things are good to go, the idle wattage is noted, GPU-Z is started up to begin monitoring card temperatures, and OCCT is set up to begin stress-testing.

To push the cards we test to their absolute limit, we use OCCT in full-screen 2560×1600 mode, and allow it to run for 30 minutes, which includes a one minute lull at the start, and a three minute lull at the end. After about 10 minutes, we begin to monitor our Kill-a-Watt to record the max wattage.

NVIDIA’s cards might be low-end, but because they’re based on an older architecture, the results above aren’t as impressive as they could be. Still, it’s hard to ignore that the reference HD 5670 runs far hotter than the GT 240. AMD’s card does have a ridiculously simple cooler, however, while ASUS decked out its GT 240 with something a little more sensible. As you can see above, our HD 5670 card at the top, which used a non-reference Sapphire cooler, improved the situation dramatically.

Power-wise, NVIDIA’s cards all rank best, even the GT 240. It might be a bit slower than the HD 5670, but its lower power draw seems to even things out, as you can see above.

Final Thoughts

It’s not too often that the opportunity arises to clear out three different GPU models at once, but I’m glad I was able to do it with these, as it helps us sum up NVIDIA’s <$100 line-up very easily. Currently, the 210 retails for around $40, but if you look around, you can find some for sale with mail-in rebates that bring the total down to $30, or even less. Even at that price, though, I’m not sure I’d recommend it – unless, of course, you just need a GPU.

The 210 is about as low-end as you can possibly go, and if I were writing this review two months ago, I would have recommended it to anyone who needed any sort of GPU to get their rig up and running. If you have a broken PC, or need to upgrade an old one, it’s still a fine choice. But for a new PC build, I’d sooner recommend just picking up an Intel Clarkdale CPU, because its integrated graphics are going to suffice for most people.

That might seem an odd thing to say, but the truth is, the 210 just wasn’t that impressive, and overall, it wasn’t even that much faster than Intel’s IGP. I have to assume that people looking at a 210 have no intention of gaming, so in that case, Intel’s Clarkdale CPUs are fine. Not to mention, they would also deliver far superior power consumption and temperatures. But as mentioned, if you simply need a GPU to get your borked machine back up and running, the 210 will be fine for the money.


The GT 220 is still a very low-end card, but compared to the 210, the performance is like night and day. Most often, the GT 220 delivered 2x the performance of the 210, and sometimes even more. Given that the average price is $60 ($50 with a MIR), you’d essentially be paying 50% more money for 200% of the performance. That, to me, is well worth it. The GT 220 doesn’t blow the doors open for gaming at higher resolutions, but for those gamers running 1280×1024, it should get you by just fine. Just don’t expect to go higher than medium detail settings in most of today’s games.
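To put that price/performance math in plainer terms, here’s a quick back-of-the-envelope sketch. The performance figures are normalized and illustrative (210 = 1.0, GT 220 = 2.0, per the rough 2x observation above), not exact benchmark scores:

```python
# Rough value comparison using street prices and normalized performance.
# perf figures are illustrative: 210 = 1.0 baseline, GT 220 ~2x per our testing.
cards = {
    "GeForce 210":    {"price": 40, "perf": 1.0},
    "GeForce GT 220": {"price": 60, "perf": 2.0},
}

for name, c in cards.items():
    # Normalized performance delivered per dollar spent
    value = c["perf"] / c["price"]
    print(f"{name}: {value:.4f} perf/$")
```

Run the numbers and the GT 220 works out to roughly a third more performance per dollar than the 210 (2.0/60 vs. 1.0/40), which is why the extra $20 is an easy sell.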

Finally, there’s the GT 240, which retails for $90 and can be had for as low as $70 after a MIR. Had I been given this card a month ago, I would have been more impressed than I am now. The reason is AMD’s Radeon HD 5670. Throughout all of our tests, that card proved to be faster, and in some cases, much faster. On top of that, there are features such as DirectX 11 and Eyefinity. AMD’s card costs more, but it’s a well-deserved premium.

That’s not to say the GT 240 isn’t worth looking at. If you don’t care about DirectX 11 or Eyefinity – and to be honest, both are rather unimportant on <$100 cards as far as I’m concerned – then the GT 240 might still be worth a look. It might be slower than the HD 5670, but if you can score one for around $70 as mentioned above, you might prefer the savings over the extra performance. But if you don’t want to deal with MIRs, then the HD 5670 is hands-down the better choice. Even if you don’t take advantage of the extra features, the beefed-up performance is well worth the small premium.

Discuss this article in our forums!

Have a comment you wish to make on this article? Recommendations? Criticism? Feel free to head over to our related thread and put your words to our virtual paper! There is no requirement to register in order to respond to these threads, but it sure doesn’t hurt!

Copyright © 2005-2019 Techgage Networks Inc. - All Rights Reserved.