After lying dormant for the past eight years, NVIDIA has brought back the "Ti" name with its GeForce GTX 560 Ti. The GeForce4 Ti 4200, introduced in 2002, was a card that offered both great pricing and superb performance. NVIDIA looks to recreate that same excitement with its GTX 560 Ti, but has it succeeded?
To test our graphics cards for both temperatures and power consumption, we utilize OCCT for the stress-testing, GPU-Z for the temperature monitoring, and a Kill-a-Watt for power monitoring. The Kill-a-Watt is plugged into its own socket, with only the PC connected to it.
As per our guidelines when benchmarking with Windows, once the room temperature is stable (and reasonable), the test machine is booted up and left to sit at the desktop until things are completely idle. Because we are running such a highly optimized PC, this normally takes one or two minutes. Once things are good to go, the idle wattage is noted, GPU-Z is started up to begin monitoring card temperatures, and OCCT is set up to begin stress-testing.
To push the cards we test to their absolute limit, we use OCCT in full-screen 2560×1600 mode and allow it to run for 15 minutes, which includes a one-minute lull at the start and a four-minute lull at the end. After about five minutes, we begin to monitor our Kill-a-Watt to record the max wattage.
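For readers who want to pull idle and load temperatures out of a sensor log after a run like this, here is a minimal Python sketch. It assumes a GPU-Z-style CSV log with a temperature column; the column name used below is an assumption, so check the header of your own log file before relying on it.

```python
import csv
import io

def summarize_temps(log_text, column="GPU Temperature [C]"):
    """Return (idle, load) temperatures from a CSV sensor log.

    The column name is an assumption based on GPU-Z-style logs --
    verify it against the header of your own file.
    """
    temps = []
    for row in csv.DictReader(io.StringIO(log_text)):
        value = (row.get(column) or "").strip()
        if value:
            temps.append(float(value))
    if not temps:
        raise ValueError("no temperature samples found")
    # Approximate idle as the minimum sample and load as the maximum,
    # mirroring how a reviewer reads the low and high points of a run.
    return min(temps), max(temps)

# Hypothetical log excerpt for illustration only.
sample = """Date,GPU Temperature [C]
2011-01-25 10:00:00,31.0
2011-01-25 10:05:00,74.0
2011-01-25 10:10:00,76.0
"""
idle, load = summarize_temps(sample)
print(idle, load)  # 31.0 76.0
```

This simply takes the extremes of the series, so a stray sample recorded before the card settled at idle would skew the result; trimming the first minute or two of samples is an easy refinement.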
In the case of dual-GPU configurations, we measure the temperature of the top graphics card, as in our tests it's usually the one that gets the hottest. This could vary with GPU cooler design, however.
Note: Due to power-related changes NVIDIA made with the GTX 580 & GTX 570, we couldn't run OCCT on those GPUs. Instead, we used a run of the less-strenuous Heaven benchmark.
Somewhat interestingly, NVIDIA's GeForce GTX 560 Ti runs just a bit hotter than the GTX 570, but that could be thanks in part to its higher core clocks and smaller frame. It does offer one of the best idle temperatures we've seen, though, and at the top end, it didn't exceed 76°C in our test. Overall, it looks to be a great option for the HTPC gamer.
Comparing the GTX 560 Ti to the Radeon HD 6950 1GB, NVIDIA's card has the better idle temperature, but AMD's has the better load temperature – despite being a bit faster.