by Rob Williams on May 21, 2010 in Graphics & Displays
As our games continue to grow more demanding, it would seem likely that giving the GPU more memory would prove useful, but are we soon to see 2GB cards become commonplace? After completing many tests with Sapphire’s Radeon HD 5870 Vapor-X 2GB, we’re having a hard time being convinced of that.
To test our graphics cards for both temperatures and power consumption, we utilize OCCT for the stress-testing, GPU-Z for the temperature monitoring, and a Kill-a-Watt for power monitoring. The Kill-a-Watt is plugged into its own socket, with only the PC connected to it.
As per our guidelines when benchmarking with Windows, once the room temperature is stable (and reasonable), the test machine is booted up and left to sit at the Windows desktop until things are completely idle. Once things are good to go, the idle wattage is noted, GPU-Z is started up to begin monitoring card temperatures, and OCCT is set up to begin stress-testing.
To push the cards we test to their absolute limit, we use OCCT in full-screen 2560×1600 mode and allow it to run for 30 minutes, which includes a one-minute lull at the start and a three-minute lull at the end. After about 10 minutes, we begin to monitor our Kill-a-Watt to record the max wattage.
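In our setup the Kill-a-Watt is read by eye, but the idea of skipping the warm-up lull and then tracking the peak draw can be sketched in a few lines of Python. The `samples` list and `read` cadence here are hypothetical stand-ins, not real measurements:

```python
# Hypothetical wattage samples (one per interval); the real Kill-a-Watt
# is read manually, so this only illustrates the peak-tracking logic.
samples = [142, 151, 298, 305, 311, 309, 302, 150]

WARMUP_SAMPLES = 3  # readings to skip while the stress test ramps up

def peak_wattage(readings, warmup):
    # Ignore readings taken during the warm-up lull, then take the max
    # of everything after it as the full-load figure.
    return max(readings[warmup:])

print(peak_wattage(samples, WARMUP_SAMPLES))
```

The warm-up cutoff mirrors waiting roughly 10 minutes into the 30-minute OCCT run before noting the maximum draw.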
For temperatures, the 2GB card proved to be just a wee bit hotter, but that could have been compounded by the fact that the room temperature was (unfortunately) 4°C warmer as well. For power, the card happened to use less power at idle, but +24W at full load, a difference attributable to both the boosted clocks and the extra memory.