During a press conference today at PAX East, NVIDIA officially launched its first Fermi/GF100-based GeForce GTX 470 and GTX 480 graphics cards. We’re still chugging away on our review, but are a bit behind due to some complications that arose during testing (which ironically have little to do with the GTX 480). For those who want to dive in and see what’s new now, there are many options available to you.
For the impatient, the review I’d recommend checking out is from our friend Nate at Legit Reviews. His opinions mirror mine almost perfectly, in that the GTX 480 doesn’t quite live up to the expectations held by many enthusiasts. The card beats out the HD 5870 in the majority of tests, but not to the degree that we’d like to see. In some cases, the HD 5870 can still manage to surpass the GTX 480’s performance.
The real kicker is a couple of the caveats that many of us were worried about. GF100 is seriously power-hungry… +100W above the Radeon HD 5870. Then there’s a temperature issue, where the GTX 480 can ramp up well past 90°C like it’s nothing. The most noticeable downside, though, is the card’s fan, which is loud. The room I’m in tends to have a lot of ambient noise, so I don’t usually care much about GPU noise, but I couldn’t help but compare the GTX 480’s fan to a vacuum. It’s just that loud (~65–70 dBA).
One area I haven’t given a good test at this point is tessellation performance between the HD 5870 and GTX 480, but even if NVIDIA proves superior there, it’s going to be hard to sell the card based on that alone. After all, the GTX 480 has a couple of notable downsides compared to the HD 5870 (noise, temps and power), but costs $100 more. Sure, it’s a bit more powerful in most games, but that’s a tough trade-off to make.
It’s also worth pointing out a flaw that Nate discovered when using the GTX 480 in a dual-LCD configuration. Believe it or not, going this route will almost pin the card’s temperature to 90°C and boost the power consumption by 80W… and that’s not even taking gaming into consideration. This is an issue that single-LCD users won’t have to worry about, but it does highlight a flaw in GF100 that doesn’t exist with NVIDIA’s earlier generations or ATI’s offerings.
I hate to feel so down on GF100, given we’ve been waiting so long for it, but I feel even worse for NVIDIA, since it appears that many of the issues here are the result of things gone awry over the past couple of years. To cope with temperatures (rumor), NVIDIA had to decrease the core count from 512 to 480. Had those original cores remained enabled, the purchase decision would be much easier, so it’s truly unfortunate that they were disabled… for whatever reason.
You can read our full review on Monday, but by this point you pretty much know what I’m going to say. I’ll make it up to you with a look at GF100’s other goodies in more detail, such as NVIDIA-supplied tech demos which happen to be fun as heck to use.
NVIDIA had numerous issues bringing GF100 to market, and we know for a fact that the core has 512 CUDA cores, but 32 of them were disabled with a BIOS patch. We were told this was done to improve yields and also keep the card’s thermal profile in check. If those cores were enabled, that would give the GeForce GTX 480 nearly 7% more compute power, and it might have really pulled away in the performance charts. If only NVIDIA could have pulled that off.
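For those curious where the "nearly 7%" figure comes from, it’s just the ratio of disabled cores to enabled ones, assuming compute scales roughly linearly with core count (clocks held equal). A quick back-of-the-envelope check:

```python
# GF100 physically has 512 CUDA cores; the GTX 480 ships with 480 enabled.
full_cores = 512
enabled_cores = 480

# Extra compute from re-enabling the disabled cores, assuming linear
# scaling with core count at the same clock speed.
extra = (full_cores - enabled_cores) / enabled_cores

print(f"{extra:.1%}")  # → 6.7%, i.e. "nearly 7%" more compute
```

Of course, real-world performance rarely scales perfectly with core count, so the actual gain in games would likely be somewhat less.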