AMD Radeon HD 6870 & HD 6850
by Rob Williams on October 22, 2010 in AMD-Based GPU

It’s been a long time coming, but gamers can finally relax: AMD’s Radeon HD 6800 series graphics cards are here. They may still be built on a 40nm process, but AMD has brought a lot to the table. We set out to see how the HD 6850 and HD 6870 compare to their closest competition, NVIDIA’s GeForce GTX 460 and GTX 470.

Power & Temperatures

To test our graphics cards for both temperatures and power consumption, we utilize OCCT for the stress-testing, GPU-Z for the temperature monitoring, and a Kill-a-Watt for power monitoring. The Kill-a-Watt is plugged into its own socket, with only the PC connected to it.

As per our guidelines when benchmarking with Windows, once the room temperature is stable (and reasonable), the test machine is booted up and left to sit at the desktop until things are completely idle. Because we are running such a highly optimized PC, this normally takes one or two minutes. Once things are good to go, the idle wattage is noted, GPU-Z is started up to begin monitoring card temperatures, and OCCT is set up to begin stress-testing.

To push the cards we test to their absolute limit, we run OCCT in full-screen 2560×1600 mode for 15 minutes, which includes a one-minute lull at the start and a four-minute lull at the end. After about five minutes, we begin monitoring our Kill-a-Watt to record the maximum wattage.
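For readers who log their own runs, the peak readings described above can be pulled out of a log file programmatically rather than watched by eye. The sketch below assumes a simple hypothetical CSV layout with `temp_c` and `watts` columns; GPU-Z's actual sensor-log format differs, so treat the column names as placeholders.

```python
import csv
import io

def peak_readings(log_csv):
    """Return (max_temp_c, max_watts) from a CSV sensor log.

    Assumes hypothetical 'temp_c' and 'watts' columns; adapt the
    field names to whatever your logging tool actually emits.
    """
    max_temp = float("-inf")
    max_watts = float("-inf")
    for row in csv.DictReader(io.StringIO(log_csv)):
        max_temp = max(max_temp, float(row["temp_c"]))
        max_watts = max(max_watts, float(row["watts"]))
    return max_temp, max_watts

# Tiny sample log: idle reading, then two load readings.
sample = "temp_c,watts\n42,145\n88,322\n87,318\n"
print(peak_readings(sample))  # → (88.0, 322.0)
```

This mirrors the manual process: skip the idle lull, then keep whichever temperature and wattage samples are highest over the stress run.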

Temperature-wise, AMD’s latest cards don’t do too much to impress. I’m sure that with after-market coolers, we’d see some nice gains, but as it is, we’re nearing the 90°C mark, similar to NVIDIA’s current highest-end offerings. This isn’t a dangerous level per se, but I’d love to see the temperatures at least 10°C lower than this.

What they may lack in temperatures, they make up for in power consumption. We couldn’t see any real differences at idle, but at load, AMD’s latest offerings fare quite well, especially compared to NVIDIA.