by Rob Williams on February 9, 2010 in Graphics & Displays
AMD’s clear goal at the moment is to finish rounding out its HD 5000-series line-up ahead of NVIDIA’s Fermi launch, and so far, it’s doing a good job. It continues that push with the release of the $80 Radeon HD 5570, a card designed to offer stellar media capabilities alongside reasonable gaming performance.
To test our graphics cards for both temperatures and power consumption, we use OCCT for stress-testing, GPU-Z for temperature monitoring, and a Kill-a-Watt for power monitoring. The Kill-a-Watt is plugged into its own socket, with only the PC connected to it.
As per our guidelines when benchmarking with Windows, once the room temperature is stable (and reasonable), the test machine is booted up and left to sit at the Windows desktop until things are completely idle. Once things are good to go, the idle wattage is noted, GPU-Z is started up to begin monitoring card temperatures, and OCCT is set up to begin stress-testing.
To push the cards we test to their absolute limit, we use OCCT in full-screen 2560×1600 mode and allow it to run for 30 minutes, which includes a one-minute lull at the start and a three-minute lull at the end. After about 10 minutes, we begin to monitor our Kill-a-Watt to record the max wattage.
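For anyone curious how a peak temperature gets pulled out of a half-hour run like this, GPU-Z can log its sensor readings to a text file, and the maximum core temperature is simply the largest value in that column. The short Python sketch below shows the general idea; the log filename and column header are assumptions for illustration, so adjust them to match whatever your own GPU-Z "Log to file" output actually uses.

import csv

# Hypothetical GPU-Z sensor log parser. LOG_FILE and TEMP_COLUMN are
# assumptions -- change them to match your own log's name and header.
LOG_FILE = "GPU-Z Sensor Log.txt"
TEMP_COLUMN = "GPU Temperature [C]"

def max_core_temp(path, column):
    """Return the highest temperature recorded in the given log column."""
    temps = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            value = (row.get(column) or "").strip()
            if not value:
                continue
            try:
                temps.append(float(value))
            except ValueError:
                pass  # skip malformed or truncated rows
    return max(temps) if temps else None

if __name__ == "__main__":
    peak = max_core_temp(LOG_FILE, TEMP_COLUMN)
    if peak is None:
        print("No temperature data found in log")
    else:
        print(f"Peak core temperature: {peak}°C")

The same approach works for any other column GPU-Z logs, so a single pass over the file can report peak memory clock, fan speed, and so on alongside the core temperature.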
Between the temperatures and power consumption, there are really no surprises. Given the rather simple cooler on the HD 5570, its temperature results aren’t too impressive, although bear in mind that we stress the card to the best of our ability, and during normal use, you should never see an 81°C core temperature. On the power side, the HD 5570 draws a bit more power than the GT 220, but that seems fair given the major performance difference.