AMD HD 6950 1GB vs. NVIDIA GTX 560 Ti Overclocking

by Rob Williams on February 2, 2011 in Graphics & Displays

AMD and NVIDIA released $250 GPUs last week, and both proved to deliver a major punch for modest cash. After testing, we found AMD to have a slight edge in overall performance, so to see if things change when overclocking is brought into the picture, we pushed both cards hard and then pitted the results against our usual suite.

Page 10 – Power & Temperatures

To test our graphics cards for both temperatures and power consumption, we utilize OCCT for the stress-testing, GPU-Z for the temperature monitoring, and a Kill-a-Watt for power monitoring. The Kill-a-Watt is plugged into its own socket, with only the PC connected to it.

As per our guidelines when benchmarking with Windows, when the room temperature is stable (and reasonable), the test machine is booted up and left to sit at the desktop until things are completely idle. Because we are running such a highly optimized PC, this normally takes one or two minutes. Once things are good to go, the idle wattage is noted, GPU-Z is started up to begin monitoring card temperatures, and OCCT is set up to begin stress-testing.

To push the cards we test to their absolute limit, we use OCCT in full-screen 2560×1600 mode and allow it to run for 15 minutes, which includes a one-minute lull at the start and a four-minute lull at the end. After about five minutes, we begin monitoring our Kill-a-Watt to record the maximum wattage.
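For those who prefer to log peak temperatures automatically rather than watch GPU-Z by hand, a minimal sketch of that idea is below. It assumes an NVIDIA card with the nvidia-smi utility available on the PATH (not the GPU-Z tool we actually use), and a five-second polling interval chosen purely for illustration; the Kill-a-Watt reading still has to be noted manually, since it's a standalone meter.

```python
# Sketch: poll the GPU core temperature during a 15-minute stress run
# and report the peak. Assumes an NVIDIA GPU and nvidia-smi on the PATH.
import subprocess
import time

DURATION_S = 15 * 60   # match the 15-minute OCCT run
INTERVAL_S = 5         # polling interval (an assumption, not from the article)

def read_gpu_temp() -> int:
    """Query the current GPU core temperature in degrees Celsius."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=temperature.gpu",
         "--format=csv,noheader,nounits"]
    )
    return int(out.decode().strip().splitlines()[0])

max_temp = 0
start = time.time()
while time.time() - start < DURATION_S:
    max_temp = max(max_temp, read_gpu_temp())
    time.sleep(INTERVAL_S)

print(f"Peak GPU temperature over the run: {max_temp}°C")
```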

In the case of dual-GPU configurations, we measure the temperature of the top graphics card, as in our tests, it’s usually the one to get the hottest. This could depend on GPU cooler design, however.

Note: Due to power-related changes NVIDIA made with the GTX 580 & GTX 570, we couldn’t run OCCT on that GPU. Rather, we had to use a run of the less-strenuous Heaven benchmark.

For whatever reason, our overclocked GTX 560 Ti proved a bit more forgiving in our temperature testing than the stock-clocked version, reaching the top of the chart – a good thing in this case. AMD's overclocked card saw only a minor gain in heat.

Power-wise, NVIDIA’s card drew 21W more at load, and AMD’s drew 26W more. Given the performance boosts seen, these bumps in power draw might not matter too much to you.


