ATI’s Radeon HD 5670 – DirectX 11 for $100
by Rob Williams on January 15, 2010 in AMD-Based GPU

AMD has delivered a couple of firsts over the past few months, and it’s keeping the tradition going with its release of the market’s first $100 DirectX 11-capable graphics card. Despite its budget status, the HD 5670 retains the HD 5000-series’ impressive power consumption and low idle temperatures, along with AMD’s Eyefinity support.

Power & Temperatures

To test our graphics cards for both temperatures and power consumption, we utilize OCCT for the stress-testing, GPU-Z for temperature monitoring, and a Kill-a-Watt for power monitoring. The Kill-a-Watt is plugged into its own socket, with only the PC connected to it.

As per our guidelines when benchmarking with Windows, once the room temperature is stable (and reasonable), the test machine is booted up and left to sit at the Windows desktop until things are completely idle. Once things are good to go, the idle wattage is noted, GPU-Z is started up to begin monitoring card temperatures, and OCCT is set up to begin stress-testing.

To push the cards we test to their absolute limit, we use OCCT in full-screen 2560×1600 mode, and allow it to run for 30 minutes, which includes a one minute lull at the start, and a three minute lull at the end. After about 10 minutes, we begin to monitor our Kill-a-Watt to record the max wattage.
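The monitoring routine described above can be sketched in code. This is a minimal illustration only: GPU-Z and the Kill-a-Watt are read manually in our actual procedure, so the `read_gpu_temp_c` and `read_wall_watts` functions below are hypothetical stand-ins, and the timings mirror the 30-minute run with wattage noted only after the ~10-minute mark.

```python
import time

# Hypothetical reader functions standing in for GPU-Z and the Kill-a-Watt;
# in practice these values are observed manually during the run.
def read_gpu_temp_c():
    return 68.0  # placeholder temperature reading, degrees Celsius

def read_wall_watts():
    return 215.0  # placeholder wall-power reading, watts

def monitor(duration_s, interval_s=5, warmup_s=600):
    """Track peak temperature and peak wattage over a stress run.

    Wattage is only recorded after `warmup_s` seconds have elapsed,
    mirroring the practice of waiting about 10 minutes before noting
    the Kill-a-Watt's maximum reading.
    """
    start = time.time()
    max_temp = float("-inf")
    max_watts = float("-inf")
    while time.time() - start < duration_s:
        max_temp = max(max_temp, read_gpu_temp_c())
        if time.time() - start >= warmup_s:
            max_watts = max(max_watts, read_wall_watts())
        time.sleep(interval_s)
    return max_temp, max_watts
```

For a real 30-minute run, this would be called as `monitor(duration_s=30 * 60)`; the defaults poll every five seconds and begin recording wattage at the ten-minute mark.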

The power consumption difference between the two cards is almost non-existent, but as I mentioned in the intro, the GPU cooler that Sapphire uses on its card is a big improvement over the reference design, and the top graph shows just how much. There's a truly staggering 25°C drop at full load, and an 11°C drop at idle. That's probably the greatest decrease we've ever seen from an upgraded pre-installed GPU cooler.

