by Rob Williams on June 28, 2010 in Graphics & Displays
To retain modest pricing, it’s common to see lower-end graphics cards equipped with either DDR2 or DDR3. That design choice, though, can have a major effect on performance, something that’s proven twice over with AMD’s Radeon HD 5550 and HD 5570, both of which have just been upgraded with a move to GDDR5 memory.
To test our graphics cards for both temperatures and power consumption, we utilize OCCT for the stress-testing, GPU-Z for the temperature monitoring, and a Kill-a-Watt for power monitoring. The Kill-a-Watt is plugged into its own socket, with only the PC connected to it.
As per our guidelines when benchmarking with Windows, once the room temperature is stable (and reasonable), the test machine is booted up and left to sit at the Windows desktop until things are completely idle. Once things are good to go, the idle wattage is noted, GPU-Z is started up to begin monitoring card temperatures, and OCCT is set up to begin stress-testing.
To push the cards we test to their absolute limit, we use OCCT in full-screen 2560×1600 mode and allow it to run for 30 minutes, which includes a one-minute lull at the start and a three-minute lull at the end. After about 10 minutes, we begin to monitor our Kill-a-Watt to record the maximum wattage.
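For anyone curious how the peak temperature is pulled out of a run afterwards, GPU-Z's "Log to file" option writes its sensor readings to a plain CSV-style text file, so the hottest reading can be picked out with a few lines of scripting rather than by watching the window. Below is a minimal Python sketch of that idea; the log file name and the exact temperature column header are assumptions (GPU-Z's column names vary by card and version), and the Kill-a-Watt reading is still taken by eye.

# Minimal sketch: pull the peak GPU temperature out of a GPU-Z sensor log.
# Assumptions: the log is CSV-like with a header row, and it contains a
# column whose header includes "GPU Temperature" (headers vary by card/version).
import csv

LOG_PATH = "gpuz_sensor_log.txt"  # hypothetical path to GPU-Z's logged output

def peak_gpu_temperature(path: str) -> float:
    peak = float("-inf")
    with open(path, newline="") as f:
        reader = csv.reader(f)
        header = [col.strip() for col in next(reader)]
        # Find the temperature column by a loose header match.
        temp_idx = next(i for i, col in enumerate(header) if "GPU Temperature" in col)
        for row in reader:
            try:
                peak = max(peak, float(row[temp_idx].strip()))
            except (IndexError, ValueError):
                continue  # skip malformed or blank lines
    return peak

if __name__ == "__main__":
    print(f"Peak GPU temperature during the run: {peak_gpu_temperature(LOG_PATH):.1f} C")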
Given that we’re using reference cards, the temperature results here are not all that important, since the card you purchase will likely feature a different cooler, but either way, the performance of AMD’s home-brewed cooler is good. Interestingly, Sapphire’s passive cooler on its own version of the HD 5550 is a marked improvement over AMD’s fan-equipped model.
Performance-wise, GDDR5 made some healthy gains on both of our samples, but as we can see above, it also brought a rather interesting increase in power consumption. The difference isn’t huge, but it’s clear that the GDDR5 ICs draw more power than the DDR2 and DDR3 used on the original card models.