Before we get into our final thoughts, let’s first take a look at both power consumption and temperatures. Because we haven’t put our entire collection of graphics cards through these particular tests, the results below are only for the two cards we’ve taken a look at today. Our next graphics card review will contain a more robust graph, and I’ll likely update this one as well when it’s completed.
To test each card for power draw, we first boot the machine and let it sit idle in Windows for five minutes, at which point we note the wattage reported by our Kill-a-Watt meter. We then load GPU-Z to capture temperatures, along with OCCT 3.0, and run OCCT's GPU stress test for 15 minutes. During the stress test, the highest wattage reached is recorded as the load figure; once OCCT's test is done, we record both the average idle temperature and the peak load temperature.
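The bookkeeping behind that procedure is simple enough to sketch. Since the Kill-a-Watt is read manually off its display, the readings below are purely hypothetical sample values, and the function is an illustrative sketch of how the reported figures are derived, not a tool we actually use:

```python
# Illustrative sketch of the measurement logic described above.
# All numeric readings are hypothetical; a Kill-a-Watt is read by eye,
# not through any software API.

def summarize_run(idle_watts, stress_watts, idle_temps, stress_temps):
    """Reduce per-interval readings to the four figures reported."""
    return {
        "idle_power_w": idle_watts[-1],            # wattage noted after the 5-minute idle settle
        "load_power_w": max(stress_watts),         # highest wattage seen during the 15-minute stress
        "avg_idle_temp_c": sum(idle_temps) / len(idle_temps),
        "peak_load_temp_c": max(stress_temps),
    }

# Hypothetical readings for a single card:
result = summarize_run(
    idle_watts=[182, 179, 178, 177, 177],
    stress_watts=[310, 324, 331, 329, 333, 330],
    idle_temps=[44.0, 43.5, 43.0, 43.0, 43.0],
    stress_temps=[71.0, 76.0, 78.0, 79.0, 79.0, 78.5],
)
print(result)  # e.g. load_power_w is the peak stress reading, 333
```

The key point is that idle power is a settled single reading, while load power is a running maximum over the whole stress period.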
ATI may have come in second in most of our performance-related tests, but it wins big where load power and temperatures are concerned. At idle, ATI's card did run warmer, but under load it shaved about 12°C off the GTX 275's temperature. The power-draw results follow the same pattern: the ATI card draws noticeably more at idle, but considerably less under load. Given the performance gains of the GTX 275, though, it seems reasonable that NVIDIA's card is the power glutton of the two.
Well before I received either of these two cards, I had expected the match-up to be quite even, similar to the one we saw this past winter. As our test results have shown, though, that's not the case. NVIDIA's card came out on top in almost every single one of the tests throughout the six games in our suite. Some of these games carry NVIDIA's badge on the box, but not all. ATI's name isn't on any, as it's become increasingly difficult to find a game carrying its logo. NVIDIA has been far more proactive in this sort of marketing, and given that so many games seem to favor its cards lately, it's no doubt offering developers more than just a logo for some box art.
From almost all perspectives, NVIDIA's card is the clear winner. It proved faster in the vast majority of our tests, and also managed to deliver a more impressive top overclock. Normally, I'd blame ATI's Overdrive utility for capping our clocks, but that's not the case here: while the utility allowed us to go as high as 1000MHz, our maximum stable core clock proved to be 975MHz. If only both manufacturers offered voltage tweaking, we might see some entirely different results.
So with NVIDIA dominating here, where does that leave ATI? Well, it's hard to say, but despite the fact that the GTX 275 won most rounds, the HD 4890 is far from being a loser. Compared to the HD 4870, it offers better performance, lower temperatures, improved power draw at idle, and unbelievable overclocking. Though the card we received came with a 900MHz core (which we down-clocked to the reference 850MHz for the sake of benchmarking), our overclock settled in at 975MHz, a full 125MHz above stock. Not too shabby... not at all.
As it stands today, it's hard to outright recommend ATI's card based on price, as both cards are supposed to retail for $249.99. AMD did say that most e-tailers would be offering a $20 mail-in rebate, so if you are into that sort of thing (I'm not), that may weigh into your decision. Likewise, NVIDIA claims $249.99 pricing, but it's hard to verify that without real product on the shelves. Newegg is currently listing the cards (out of stock) at $259.99 USD, so perhaps that's what we can expect once the middle of the month hits and stock becomes a non-issue.
We took a deep look at what both companies are currently offering on pages two and three of this article, but in the end, it's up to you to decide what's important to you. ATI offers DirectX 10.1 support, but few games give us good reason to care, and most of those aren't too popular (or good, according to ratings). On the NVIDIA side, we have CUDA and PhysX, but again, few people care about either right now, primarily due to the overall lack of support. At least on the NVIDIA side, new announcements are being made all the time; not much is being unveiled regarding DX 10.1 or Stream that we can actually use, or would want to, though again, many feel the same way about NVIDIA's extras.
Have a comment you wish to make on this article? Recommendations? Criticism? Feel free to head over to our related thread and put your words to our virtual paper! There is no requirement to register in order to respond to these threads, but it sure doesn’t hurt!