To test our graphics cards for both temperature and power consumption, we use OCCT for stress-testing, GPU-Z for temperature monitoring, and a Kill-a-Watt for power monitoring. The Kill-a-Watt is plugged into its own socket, with only the PC connected to it.
As per our guidelines when benchmarking with Windows, once the room temperature is stable (and reasonable), the test machine is booted up and left to sit at the Windows desktop until things are completely idle. Once everything is good to go, the idle wattage is noted, GPU-Z is started to begin monitoring card temperatures, and OCCT is set up to begin stress-testing.
To push the cards we test to their absolute limit, we run OCCT in full-screen 2560×1600 mode and allow it to run for 30 minutes, which includes a one-minute lull at the start and a three-minute lull at the end. After about 10 minutes, we begin to monitor our Kill-a-Watt to record the maximum wattage.
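In practice we read GPU-Z and the Kill-a-Watt by eye, but the peak-tracking step amounts to simple logic: sample the reading repeatedly during the run and keep the highest value seen. A minimal sketch, with a placeholder sensor function standing in for a real temperature or wattage query (the function name and values are illustrative, not part of any actual tool's API):

```python
import random

def read_sensor():
    # Placeholder for a real reading (e.g. GPU temperature in °C);
    # here we just simulate values in a plausible load range.
    return 40.0 + random.random() * 6.0

# Sample the sensor throughout the stress run and track the peak.
peak = float("-inf")
for _ in range(300):  # e.g. one sample every few seconds for ~10 minutes
    peak = max(peak, read_sensor())

print(f"Peak reading: {peak:.1f}")
```

The same pattern applies whether the source is a temperature sensor or the wattage display, which is why a single pass through the stress period is enough to capture the maximum.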
As I mentioned in the intro, the cooler on this card works well, and that’s proven here. No matter how long I ran OCCT, the temperature would never go above 46°C. Even after our overclock, we only managed to get the card up to 48°C. This is definitely HTPC material!
I find the idle wattage of the HD 5550 to be a little higher than I would expect, given that it exceeds the HD 5570's, but the card redeems itself at load, shaving off 14W.