by Rob Williams on January 25, 2010 in Graphics & Displays
This past fall, NVIDIA filled out the remainder of its GT200 series of graphics cards with three models. For basic computing, there’s the $40 GeForce 210, while for those looking to get a bit of light gaming done, there’s the $60 GT 220. And to round things off, there’s the $90 GT 240, which handles all of today’s games rather well at 1080p.
To test our graphics cards for both temperatures and power consumption, we use OCCT for stress-testing, GPU-Z for temperature monitoring, and a Kill-a-Watt for power monitoring. The Kill-a-Watt is plugged into its own socket, with only the PC connected to it.
As per our guidelines when benchmarking with Windows, once the room temperature is stable (and reasonable), the test machine is booted up and left to sit at the Windows desktop until things are completely idle. Once things are good to go, the idle wattage is noted, GPU-Z is started up to begin monitoring card temperatures, and OCCT is set up to begin stress-testing.
To push the cards we test to their absolute limit, we run OCCT in full-screen 2560×1600 mode for 30 minutes, which includes a one-minute lull at the start and a three-minute lull at the end. After about 10 minutes, we begin monitoring our Kill-a-Watt to record the maximum wattage.
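For readers who want to reproduce something like this run, the temperature half of the logging can be automated rather than watched by hand. The sketch below is purely illustrative and is not part of our methodology: it assumes an NVIDIA card with the `nvidia-smi` command-line tool available (GPU-Z itself has no command-line interface), polls the GPU temperature at a fixed interval for the length of the stress run, and keeps the peak reading.

```python
import re
import subprocess
import time

def parse_temperature(output: str) -> int:
    """Pull the first integer reading (degrees Celsius) out of raw tool output."""
    match = re.search(r"\d+", output)
    if match is None:
        raise ValueError(f"no temperature found in: {output!r}")
    return int(match.group(0))

def log_peak_temperature(duration_s: int = 1800, interval_s: int = 5) -> int:
    """Poll for duration_s seconds (1800 matches our 30-minute OCCT run)
    and return the hottest temperature seen."""
    peak = 0
    deadline = time.time() + duration_s
    while time.time() < deadline:
        # nvidia-smi prints a bare number with this query; the exact flags
        # supported depend on your driver version.
        out = subprocess.run(
            ["nvidia-smi", "--query-gpu=temperature.gpu", "--format=csv,noheader"],
            capture_output=True, text=True, check=True,
        ).stdout
        peak = max(peak, parse_temperature(out))
        time.sleep(interval_s)
    return peak
```

The same loop could be pointed at any tool that prints a reading to stdout; only `parse_temperature` would need adjusting for a different output format.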
NVIDIA’s cards might be low-end, but because they’re based on an older architecture, the results above aren’t as impressive as they could be. Still, it’s hard to ignore that the HD 5670 runs far hotter than the GT 240. To be fair, AMD’s reference card has a ridiculously simple cooler, while ASUS decked out its GT 240 with something a little more substantial. As you can see above, the HD 5670 at the top of the chart, which used a non-reference Sapphire cooler, improved the situation dramatically.
Power-wise, NVIDIA’s cards all rank best, the GT 240 included. It might be a bit slower than the HD 5670, but as you can see above, its lower power draw helps even things out.