Date: September 14, 2012
Author(s): Rob Williams
With the release of its GeForce GTX 660, NVIDIA has delivered what we feel to be one of the most attractive Kepler offerings to date. It may be a step down from the Ti edition released last month, but the GTX 660 delivers great performance across the board, and priced at $229, it won’t break the bank.
NVIDIA is good at many different things, but where the company truly excels is in its ability to fill every conceivable void in the GPU market. Have $100? There’s a card for you. Have $150? Ditto. Have $1,000? Can I have it?
Just a month ago, NVIDIA released its GeForce GTX 660 Ti, which was met with considerable critical acclaim. Specs-wise, the card isn’t too far off from the GTX 670, which costs $100 more. But at $300, the card wasn’t exactly “affordable” by all standards. NVIDIA knew it had a duty to finally deliver a mainstream Kepler part as close to $200 as possible, and that’s resulted in the $229 non-Ti GTX 660.
The GTX 660 is equipped with 960 cores, versus 1,344 on the Ti. That comparison alone would give us a good idea of what to expect here, were it not for the fact that NVIDIA, in its usual way, gave the core clock a nice boost on the non-Ti edition. Fewer cores, but +65MHz on the clock – an interesting move, and not one that anyone will complain about.
Memory density and general architecture layout remain similar between the two cards, although while the 660 Ti is based on the GK104 chip, this non-Ti version uses GK106. Whereas typical GPCs, or Graphics Processing Clusters, have two SMX units apiece, GK106 splits one right down the middle, as the following diagram shows:
This is an odd design, but it shouldn’t result in a bottleneck, as each SMX module interfaces with its respective raster engine rather than with the others.
Though this non-Ti edition of the GTX 660 drops the core count quite significantly, its increase in core frequency negates some of the power savings we would otherwise have seen. For that reason, the non-Ti is rated at 140W, versus the Ti’s 150W.
| Model | Cores | Core MHz | Memory | Mem MHz | Mem Bus | TDP |
|---|---|---|---|---|---|---|
| GeForce GTX 690 | 3072 | 915 | 2x 2048MB | 6008 | 256-bit | 300W |
| GeForce GTX 680 | 1536 | 1006 | 2048MB | 6008 | 256-bit | 195W |
| GeForce GTX 670 | 1344 | 915 | 2048MB | 6008 | 256-bit | 170W |
| GeForce GTX 660 Ti | 1344 | 915 | 2048MB | 6008 | 192-bit | 150W |
| GeForce GTX 660 | 960 | 980 | 2048MB | 6000 | 192-bit | 140W |
| GeForce GTX 650 | 384 | 1058 | 1024MB | 5000 | 128-bit | 64W |
| GeForce GT 640 | 384 | 900 | 2048MB | 5000 | 128-bit | 65W |
| GeForce GT 630 | 96 | 810 | 1024MB | 3200 | 128-bit | 65W |
| GeForce GT 620 | 96 | 700 | 1024MB | 1800 | 64-bit | 49W |
| GeForce GT 610 | 48 | 810 | 1024MB | 1800 | 64-bit | 29W |
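As a rough back-of-envelope check on the core-count-versus-clock trade-off discussed above, raw shader throughput scales with cores × clock. The sketch below assumes those two factors alone, ignoring GPU Boost behavior, memory bandwidth and the GK104/GK106 architectural differences, so treat it as a ballpark rather than a performance prediction:

```python
# Relative raw shader throughput from the table above: cores * core clock (MHz).
# This deliberately ignores boost clocks, memory bandwidth and architecture,
# so it's only a rough ceiling, not a benchmark prediction.
cards = {
    "GTX 660 Ti": (1344, 915),
    "GTX 660":    (960, 980),
}

throughput = {name: cores * mhz for name, (cores, mhz) in cards.items()}
ratio = throughput["GTX 660"] / throughput["GTX 660 Ti"]
print(f"GTX 660 raw throughput is ~{ratio:.0%} of the Ti's")  # ~77%
```

That ~77% figure lines up loosely with the benchmark deltas later in this review, though real-world gaps vary by title.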
Alongside the GTX 660, NVIDIA has also launched the GTX 650, although availability at this point is nil. Despite being briefed on both of these cards at the same time, I haven’t found a single review online of the GTX 650, so it’s to be assumed that NVIDIA isn’t rushing that product out too fast, and if I had to guess, it’s a card that exists only to finish off the sequential numbering system. Laugh all you want, but imagine the table above without the GTX 650. That’d look rather odd, wouldn’t it?
That aside, for NVIDIA to call it the GTX 650 is a bit of an insult to the GTX name. In no possible way does this GPU deserve it – GTX has traditionally represented cards that could more than handle games being run with lots of detail and at current resolutions. “GTX” is certainly suitable for the 660, but how did NVIDIA deem the 650 worthy when it slots just barely ahead of the $100 GT 640? Maybe next we’ll see Porsche release a 4-cylinder 911 Turbo S.
Rant aside, the vendor providing our GTX 660 sample is GIGABYTE. Unfortunately, it’s an “OC Version”, which means we are unable to deliver baseline GTX 660 results (I am not keen on forcing turbo adjustments). Making matters a bit worse, the OC isn’t that minor: memory remains the same, but the core gets +73MHz tacked on. An OC like this is great for consumers, but tough on reviewers who’d like to compare GPUs fairly.
We don’t normally include photos of a product box in our reviews, but something about the one with this GTX 660 struck me. It’s super-clean, simple, and nice to look at. I’m a sucker for blue and black.
With its Windforce cooler, GIGABYTE aims to greatly reduce the GTX 660’s temperatures vs. a reference cooler. At the same time, thanks to the larger fans, many heatpipes and finned design, the card should run a lot quieter as well.
For the sake of time, I didn’t tear the card apart for photos. But you can see both sides of the card below to gain an understanding of how the cooler is constructed.
We’ve been pleased with the Windforce cards we’ve taken a look at in the past, and this model looks to carry on its legacy. And with that, it’s time to tackle our test system, and then move right on into our testing.
At Techgage, we strive to make sure our results are as accurate as possible. Our testing is rigorous and time-consuming, but we feel the effort is worth it. In an attempt to leave no question unanswered, this page contains not only our testbed specifications, but also a detailed look at how we conduct our testing.
The below table lists our testing machine’s hardware, which remains unchanged throughout all GPU testing, minus the graphics card. Each card used for comparison is also listed here, along with the driver version used.
**Graphics Card Test System**

| Component | Details |
|---|---|
| Processor | Intel Core i7-3960X – Six-Core, 4.20GHz, 1.35v |
| Motherboard | GIGABYTE G1.Assassin 2 (X79) |
| Memory | Corsair Dominator GT 4x4GB – DDR3-2133 |
| Graphics | AMD Radeon HD 7850 2GB (Catalyst 12.8)<br>AMD Radeon HD 7970 3GB (Catalyst 12.7)<br>NVIDIA GeForce GTX 660 2GB (GeForce 306.23)<br>NVIDIA GeForce GTX 680 2GB (GeForce 306.02) |
| Audio | On-Board Creative X-Fi Audio |
| Storage | Kingston HyperX 240GB Solid-State Drive |
| Power Supply | Corsair AX1200 |
| Chassis | Corsair Obsidian 700D Full-Tower |
| Cooling | Corsair H70 Liquid Cooler |
| Et cetera | Windows 7 Professional 64-bit |
When preparing our testbeds for any type of performance testing, we follow these guidelines:
To aid in keeping our results accurate and repeatable, we prevent certain Windows 7 services from starting at boot. These services have a tendency to start up in the background without notice, potentially skewing test results. For example, disabling “Windows Search” turns off the OS’ indexing, which can at times utilize the hard drive and memory more than we’d like.
The services we disable are:
Game settings will be tackled on the individual game pages.
Battlefield 3 is a rare treat when it comes to PC gaming. Rather than develop the game for the consoles first and then port it over to the PC, DICE built the game with the PC in mind from the get-go. It’s graphically one of the most impressive games ever created, so it’s of little surprise that it finds itself in our testing.
Manual Run-through: Operation Guillotine (chapter 5) is the basis for our benchmarking here, as it features a lot of smoke, water, debris and is reliable to benchmark repeatedly. Our run starts us at the top of a hill, where we immediately rise up and run down it. We make our way down to the riverbed below, and end our run once we hit the first set of stairs.
For its price-point of $229, the GTX 660 delivers quite impressive performance in Battlefield 3. While 2560×1600 at max detail is a stretch, 1080p is just fine. Should you need a higher resolution, you’ll need to back down to medium or high detail, depending, but there’s still quite a bit of brawn in this seemingly modest card.
For so many reasons, the DiRT series is one of the best out there for racing fans. Each game offers outstanding graphics and audio, excellent control and environments that are way too much fun to tear up. Showdown is an interesting beast, as it features destructive racing, but as we discovered in our review, it too is a ton of fun.
Manual Run-through: In our search for the perfect Showdown track to test with, we found that any of the snow levels offered the greatest stress on a GPU. The specific track we chose is the second race in the second tier, taking place in Colorado. We begin our FPS recording as soon as the countdown to the race begins, and end it as soon as we hit the finish line at the end of the three-lap race.
Whereas Battlefield 3 wasn’t too great at 2560 on the GTX 660, DiRT: Showdown is another story. All the way up to max detail, the game handled like a dream, dipping to a minimum of 41 FPS on one of the most graphically-intensive tracks in the entire game. Though not a perfect comparison, three 1440×900 monitors require the rendering of a similar number of pixels as a single 2560×1600 display, so this game should be more than doable on a multi-monitor setup with this GPU. At worst, you might have to drop anti-aliasing, but that’s a small price to pay.
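The pixel-count comparison above is easy to verify with some quick arithmetic (a triple 1440×900 setup versus a single 2560×1600 display):

```python
# Total pixels each configuration has to render per frame.
triple_1440x900 = 3 * 1440 * 900    # 3,888,000 pixels across three displays
single_2560x1600 = 2560 * 1600      # 4,096,000 pixels on one display

# The triple-monitor setup renders about 5% fewer pixels.
print(triple_1440x900, single_2560x1600)
print(f"{triple_1440x900 / single_2560x1600:.1%}")  # ~94.9%
```

So a GPU that copes at 2560×1600 should, pixel-load-wise, cope at 3x 1440×900 as well, though multi-monitor adds its own overhead.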
One of the more popular Internet memes for the past couple of years has been, “Can it run Crysis?”, but as soon as Metro 2033 launched, that’s a meme that should have died. Metro 2033 is without question one of the beefiest games on the market, and only just recently have GPUs been released that can allow the game to run in its DX11 mode at modest resolutions.
Manual Run-through: The level we use for testing is part of chapter 4, called “Child”, where we must follow a linear path through multiple corridors until we reach our end point, which takes a total of about 90 seconds. Please note that due to the reason mentioned above, we test this game in DX10 mode, as DX11 simply isn’t that realistic from a performance standpoint.
When a game can make the mighty GTX 680 whimper, what hope does a GTX 660 have? At 1680×1050, using DX11 at High detail, the game is perfect. But increasing the resolution quickly renders a problem (no pun intended). While 1080p might be doable for some, many will prefer to drop to DX10 instead; performance improves significantly without sacrificing the visuals too much.
Of all the games we test with in our current suite, none is more likely than Skyrim to suck hundreds of hours out of your life. An expansive world, in-depth game mechanics, and the feeling that there’s always something to do… it’s no wonder the game has hit the right mark with so many people. While not the most graphically-intensive game, we like to test with it due to its popularity and the fact that it scales well in performance.
Manual Run-through: From the entry point in Markarth, our path leads us around the entire city, ultimately bringing us back to where we started.
Skyrim is the weakest game graphically in our line-up, so it’s of little surprise that the GTX 660 handles it without issue, straight up to 2560×1600 with max detail. As mentioned with DiRT: Showdown, a multi-monitor setup of 1440×900 displays should also handle this game just fine on a GTX 660. That’s a big bang for the buck, right there.
Strategy games are well-known for pushing the limits of any system, and few others do this as well as Total War: SHOGUN 2. It fully supports DX11, has huge battlefields to overlook with hundreds or thousands of units, and a ton of graphics options to adjust. It’s quite simply a beast of a game.
Manual Run-through: While we normally dislike timedemos, strategy games such as this are very difficult to benchmark reliably, so we’ve opted to use the built-in benchmark instead.
Like DiRT: Showdown, AMD has optimized its drivers for SHOGUN 2, and it shows. But regardless of the GPU, it’s going to get a serious workout at 1080p if you plan on maxing out the detail settings. At 1680×1050, the GTX 660 holds a “just playable” FPS, while anything past that is out of the question. Still, when a game can prove to be an amazing test for our top-end GPUs here, there is only so much you can expect of a $229 offering. The performance seen here is good, but for 1080p and higher, you’ll want to moderate your settings a little bit.
Although we generally shun automated gaming benchmarks, we do like to run at least one to see how our GPUs scale when used in a ‘timedemo’-type scenario. Futuremark’s 3DMark 11 is without question the best such test on the market, and it’s a joy to use, and watch. The folks at Futuremark are experts in what they do, and they really know how to push that hardware of yours to its limit.
Similar to a real game, 3DMark 11 offers many configuration options, although many (including us) prefer to stick to the preset profiles, Performance and Extreme. Depending on which one you choose, the graphics options are tweaked accordingly, as is the resolution. As you’d expect, the higher the profile, the more intensive the test. The benchmark doesn’t natively support 2560×1600, so to benchmark at that resolution, we choose the Extreme profile and simply change the resolution.
Through our normal benchmarks, both the GTX 660 and HD 7850 have scaled pretty much evenly, and 3DMark 11 backs all that up. In the best case, the GTX 660 will perform about 35% faster than the HD 7850 – not bad given the slim price difference.
While Futuremark is a well-established name where PC benchmarking is concerned, Unigine is only beginning to gain similar exposure. The company’s main focus isn’t benchmarks, but rather its cross-platform game engine, which it licenses out to other developers, as well as its own games, such as a gorgeous post-apocalyptic oil strategy game. The company’s benchmarks are simply a by-product of its game engine.
The biggest reason that the company’s “Heaven” benchmark grew in popularity rather quickly is that both AMD and NVIDIA promoted it for its heavy use of tessellation, a key DirectX 11 feature. Like 3DMark, the benchmark here is overkill by design, so results here aren’t going to directly correlate with real gameplay. Rather, they showcase which card models can better handle both DX11 and its GPU-bogging features.
Much like every other test we’ve conducted here, Heaven 3.0 shows that the GTX 660 is quite a bit faster than the HD 7850 at any given resolution. Given all we’ve seen so far, this is little surprise.
With all of the launches NVIDIA has done in the past couple of months for Kepler, it was difficult to approach the GTX 660 with anything more than minimal enthusiasm. However, from what we’ve seen throughout all of our testing, the GTX 660 is actually quite an impressive card, and possibly one of the most important in NVIDIA’s entire 600 series line-up.
The reason for that boils down to the affordable price-point, and the performance it delivers. A $229 card that can handle a graphically gorgeous game like Battlefield 3 at Ultra detail at 1080p? Do I really need to explain why that’s awesome?
Unfortunately, we didn’t have a GPU that was directly comparable to this one, so there’s no true apples-to-apples comparison. AMD’s Radeon HD 7850 comes closest, at about $200. In that match-up, we saw the GTX 660 consistently perform better than the HD 7850, with the lowest gains seen in the heavily AMD-favored DiRT: Showdown. Performance increases of 10-20% were not uncommon. In the also AMD-favored SHOGUN 2, the GTX 660 averaged 50% faster.
Of course, I’d be remiss not to mention that our GTX 660 sample was overclocked, but even without the 7% core clock boost, the card is still substantially faster than the HD 7850 in most tests (if not for the clock boost, AMD would have likely matched NVIDIA in DiRT: Showdown).
That said, this launch won’t really affect AMD too much, although we may see another price-drop for the HD 7870 to align better against the GTX 660. With the holiday season right around the corner, we may see more price drops in the months ahead from both vendors.
Before deciding on a GPU, however, there’s something else to bear in mind: free games. At the moment, AMD is holding a huge promotion giving away the fantastic Sleeping Dogs (which we reviewed here) for free with a purchase of mainstream and higher cards, while NVIDIA is bundling Borderlands 2 with the GTX 660 Ti.
If you went the AMD route, you could pick up an HD 7850 for about $200, and the bundle would then be worth $250. On the NVIDIA side, if you were going to purchase Borderlands 2 anyway, it’d make more sense at this point to splurge on the GTX 660 Ti, which retails for $300. It’s a $70 premium over the non-Ti version, but with your $60 game taken care of, you’ll really only be paying about $10 for that performance bump.
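To put numbers to that reasoning (using the street prices mentioned above, which will of course vary by retailer):

```python
# Effective cost of stepping up to the GTX 660 Ti if you planned
# to buy Borderlands 2 ($60) anyway.
gtx_660 = 229
gtx_660_ti = 300
bundled_game = 60

premium = gtx_660_ti - gtx_660        # $71 sticker premium over the non-Ti
effective_premium = premium - bundled_game  # ~$11 once the game is counted

# The article rounds these to $70 and $10.
print(premium, effective_premium)
```

In other words, the Ti's sticker premium nearly evaporates once the bundled game's value is factored in, which is why the step-up can make sense for Borderlands 2 buyers.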
GIGABYTE GeForce GTX 660 2GB
Have a comment you wish to make on this article? Recommendations? Criticism? Feel free to head over to our related thread and put your words to our virtual paper! There is no requirement to register in order to respond to these threads, but it sure doesn’t hurt!
Copyright © 2005-2020 Techgage Networks Inc. - All Rights Reserved.