
The Almost Titan: NVIDIA GeForce GTX 780 Review

Date: May 23, 2013
Author(s): Rob Williams

When Titan released in February, it seemed likely that NVIDIA would hold off on its GeForce 700 series for some time. Well, with today’s launch of the company’s mini-Titan – ahem, the GTX 780 – we’ve been proven wrong. Compared to the GTX 680, it offers a lot of +50%s – but is it worth its $649 price tag?



Introduction

Around this time every year, a ton of action-packed movies hit the cinema, helping to make summer taste just a little bit sweeter. This year, it’s Fast & Furious 6 that’s on my radar. The speed… the fun, the hot women. Actually, that reminds me of something: a new top-end graphics card, much like NVIDIA’s GeForce GTX 780, priced at $649.

Is it fast? You bet. Is it fun? Of course. Does it get you hot women? In my limited experience, I can assuredly say that no it does not. But for those lonely nights, there’s gaming, and gaming is better with a top-end graphics card.

You’ll have to excuse me – I’m a bit wired. I’ve been testing the GTX 780 and other select cards for the past week, and as usual, time hasn’t been kind. Mere days prior to receiving NVIDIA’s latest, our 30″ testing monitor perished, meaning our usual 2560×1600 testing was out. That’s a major issue when we’re dealing with ~$500 GPUs.

As a direct result, this article is going to be structured a little differently than usual. Instead of benchmarking with our three usual resolutions of 1680×1050, 1920×1080 and 2560×1600, we’re using 1920×1080 and two multi-monitor resolutions: 4098×768 and 5760×1080. Multi-monitor testing is something we’ve wanted to implement for a while, but I was hoping to hold off on it for just a couple more weeks. In the future, we’ll benchmark 4800×900 instead of 4098×768, as during testing, I realized it’s a more suitable resolution.

NVIDIA GeForce GTX 780

*takes another sip of coffee*

Back in February, NVIDIA released its GeForce GTX Titan, the monster $1,000 GPU based on the hardware that powers the massive Titan supercomputer. Because of its unique design compared to NVIDIA’s 600 series, it seemed like Titan was in fact part of the 700 series, despite its non-numerical nomenclature. While that’s not true as far as the proper product-line goes, the GTX 780 NVIDIA’s introducing today is based on the same GK110 architecture as Titan. Therefore, in effect, the GTX 780 is Titan, but with fewer cores and half the GDDR5.

NVIDIA GeForce Series       Cores   Core MHz   Memory      Mem MHz   Mem Bus   TDP
GeForce GTX Titan           2688    837        6144MB      6008      384-bit   250W
GeForce GTX 780             2304    863        3072MB      6008      384-bit   250W
GeForce GTX 690             3072    915        2x 2048MB   6008      256-bit   300W
GeForce GTX 680             1536    1006       2048MB      6008      256-bit   195W
GeForce GTX 670             1344    915        2048MB      6008      256-bit   170W
GeForce GTX 660 Ti          1344    915        2048MB      6008      192-bit   150W
GeForce GTX 660             960     980        2048MB      6000      192-bit   140W
GeForce GTX 650 Ti BOOST    768     980        2048MB      6008      192-bit   134W
GeForce GTX 650 Ti          768     925        1024MB      5400      128-bit   110W
GeForce GTX 650             384     1058       1024MB      5000      128-bit   64W
GeForce GT 640              384     900        2048MB      5000      128-bit   65W
GeForce GT 630              96      810        1024MB      3200      128-bit   65W
GeForce GT 620              96      700        1024MB      1800      64-bit    49W
GeForce GT 610              48      810        1024MB      1800      64-bit    29W

Compared to the GTX 680, the GTX 780 has 50% more cores, 50% more memory, a 50% more capable memory bus and a 1,000% better-looking GPU cooler. Another increase is one we all hate to see, but it had to happen: the TDP has been boosted from 195W to 250W.
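
Those +50% figures fall straight out of the spec table above. For the curious, here’s a quick sanity check in Python, using only the cores, memory and memory bus values listed there:

    # Quick check on the "+50%" claims, using figures from the spec table above.
    gtx_680 = {"cores": 1536, "memory_mb": 2048, "bus_width_bits": 256}
    gtx_780 = {"cores": 2304, "memory_mb": 3072, "bus_width_bits": 384}

    for spec in gtx_680:
        gain = (gtx_780[spec] / gtx_680[spec] - 1) * 100
        print(f"{spec}: +{gain:.0f}%")

    # cores: +50%
    # memory_mb: +50%
    # bus_width_bits: +50%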

Because of the tight embargo timing I’m facing at the moment (which isn’t NVIDIA’s fault, mind you), I’m going to speed through some of these GTX 700 features more than I’d like. In the near-future, I’ll dedicate a quick article to GeForce Experience, as I believe it deserves one (and there are a couple of things coming that are cool.)

NVIDIA GeForce GTX 780 Front View
Fast, and heavy. No, that’s not plastic.

As seen in the above image, the GTX 780 shares the exact same GPU cooler found on Titan. While I’m indifferent to most coolers or GPU chassis in general, this one is about as badass as it gets. It’s designed to be as efficient as possible, keeping the GPU performing at its best without exceeding 80°C. Like the Titan, the GTX 780 includes GPU Boost 2.0, which unlike the original iteration focuses on adjusting the GPU clock based on temperature, not voltage. This is a design NVIDIA would agree should have existed from the start, as heat is a killer of electronics. Out-of-the-box, you should never see your GTX 780 exceed 80°C – as we’ll see in our temperature results later.
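
NVIDIA doesn’t publish the inner workings of GPU Boost 2.0, so treat the following as nothing more than a mental model: a minimal Python sketch of a temperature-target boost loop. Only the 80°C target and the 863MHz base clock come from the card itself; the step size, boost ceiling and sensor readings are made-up values for illustration.

    # A rough mental model of a temperature-target boost loop. This is NOT
    # NVIDIA's actual GPU Boost 2.0 algorithm (which isn't public).
    TEMP_TARGET_C = 80       # GTX 780's default temperature target
    BASE_CLOCK_MHZ = 863     # base clock from the spec table
    MAX_BOOST_MHZ = 1000     # hypothetical ceiling, for illustration only
    STEP_MHZ = 13            # hypothetical adjustment step

    def next_clock(current_mhz, gpu_temp_c):
        """Nudge the clock up while under the target; back off once it's exceeded."""
        if gpu_temp_c < TEMP_TARGET_C:
            return min(current_mhz + STEP_MHZ, MAX_BOOST_MHZ)
        return max(current_mhz - STEP_MHZ, BASE_CLOCK_MHZ)

    clock = BASE_CLOCK_MHZ
    for temp in (65, 72, 78, 81, 83, 79):   # hypothetical sensor readings
        clock = next_clock(clock, temp)
        print(f"temp={temp}C -> clock={clock}MHz")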

More cores are nice, but what helps set the GTX 780 apart from the GTX 680 is its wider 384-bit memory bus (6 x 64-bit controllers), which gives games a bit more room to breathe – further helped by the increase in memory from 2GB to 3GB. While a memory bump might seem minor, it’s actually rather important, and I think it should have happened earlier – AMD has been a spearhead in this regard. While I haven’t found immediate proof of a game with a hunger for graphical RAM that will exceed 2GB (not even Metro 2033), there’s little doubt that we’ll reach that threshold soon.
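
For context, the wider bus translates directly into peak memory bandwidth. Using the standard formula (effective memory clock multiplied by bus width in bytes), and noting that the 6008MHz figure in the table is already the effective GDDR5 data rate, the math works out like so:

    # Peak memory bandwidth = effective memory clock (MT/s) x bus width (bytes).
    def peak_bandwidth_gbs(effective_mhz, bus_width_bits):
        return effective_mhz * 1e6 * (bus_width_bits / 8) / 1e9

    print(f"GTX 680 (256-bit): {peak_bandwidth_gbs(6008, 256):.1f} GB/s")  # ~192.3 GB/s
    print(f"GTX 780 (384-bit): {peak_bandwidth_gbs(6008, 384):.1f} GB/s")  # ~288.4 GB/s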

NVIDIA GeForce GTX 780 Performance Upgrade

In the table above, NVIDIA showcases the performance improvements that its top-end GPUs have experienced since the GTX 580. Simply put, NVIDIA says that the GTX 780 is 70% more capable than the GTX 580, and 33% more capable than last year’s GTX 680.

Compared to the HD 7970 GHz, considerable gains can also be seen:

NVIDIA GeForce GTX 780 Performance Upgrade Versus 7970 GHz Edition

Taking into consideration the games we use for our testing in this review, I’d say that these boosts seem fairly realistic, though as we’ll see later, there will be some occasions where a game will favor AMD’s GPU more, allowing it to catch up to or in one or two cases even surpass the GTX 780.

Given the GTX 780 is similar in design to Titan, there’s not too much else to talk about, although it’s worth noting that NVIDIA continues to tout its ability to offer multi-GPU users excellent scaling. In its testing, ~+55% is the minimum improvement you could expect (based on Batman: Arkham City), while +75% looks to be the norm (at least in the games NVIDIA tests… which are of course going to favor its cards).

For those who like to see a card’s naughty bits, don’t let me be the one to deprive you:

NVIDIA GeForce GTX 780 PCB
(Full image is 2500×1143.)

As mentioned earlier, this article is all over the place due to our testing monitor dying and the subsequent extra time that was required as a result. I couldn’t test all of the games I wanted to (Far Cry 3, Crysis 3 and Hitman: Absolution, namely), but in addition to our usual six, I was able to test using Borderlands 2, Metro: Last Light and also BioShock Infinite.

Each of the games tested for this article was run at 1080p for single-monitor, and at 4098×768 and 5760×1080 for multi-monitor. Megapixel counts aren’t a perfect way to compare multi-monitor resolutions against a single-monitor one, but for reference: 1080p clocks in at about 2 megapixels, 4098×768 at 3 megapixels, and 5760×1080 at 6 megapixels. Because AMD’s Catalyst Control Center doesn’t allow custom resolutions, I couldn’t include the HD 7970 GHz in our 4098×768 testing.
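
The quick math behind those megapixel figures, for anyone keeping score:

    # Megapixel counts for the resolutions used in this article.
    for width, height in ((1920, 1080), (4098, 768), (5760, 1080)):
        print(f"{width}x{height}: {width * height / 1e6:.1f} megapixels")

    # 1920x1080: 2.1 megapixels
    # 4098x768: 3.1 megapixels
    # 5760x1080: 6.2 megapixels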

Since it’s been months since I last tested the GTX 680, I re-benchmarked it for this article using the current 320.14 beta GeForce drivers. But there’s a twist: I left the old results using the 306.38 driver intact. This is meant to show the scaling that can occur from driver improvements alone, which I’ll talk a bit about in our conclusion.

With that all tackled, let’s move onto our testing methodology and then get right into our results, starting with Battlefield 3.

Test System & Methodology

At Techgage, we strive to make sure our results are as accurate as possible. Our testing is rigorous and time-consuming, but we feel the effort is worth it. In an attempt to leave no question unanswered, this page contains not only our testbed specifications, but also a detailed look at how we conduct our testing.

Test Machine

The below table lists our testing machine’s hardware, which remains unchanged throughout all GPU testing, minus the graphics card. Each card used for comparison is also listed here, along with the driver version used.

Graphics Card Test System
Processor       Intel Core i7-3960X – Six-Core, 4.20GHz, 1.35v
Motherboard     GIGABYTE G1.Assassin 2 (X79)
Memory          Corsair Dominator GT 4x4GB – DDR3-2133
Graphics        AMD Radeon HD 7750 1GB (Catalyst 12.9)
                AMD Radeon HD 7770 1GHz Edition (Catalyst 12.9)
                AMD Radeon HD 7790 1GB (Catalyst March 13, 2013)
                AMD Radeon HD 7970 GHz Edition 3GB (Catalyst 13.4)
                NVIDIA GeForce GT 640 1GB (GeForce 306.23)
                NVIDIA GeForce GTX 650 Ti 1GB (GeForce 306.38)
                NVIDIA GeForce GTX 650 Ti BOOST 2GB (GeForce 314.22)
                NVIDIA GeForce GTX 660 2GB (GeForce 306.23)
                NVIDIA GeForce GTX 680 2GB (GeForce 320.14)
                NVIDIA GeForce GTX 780 3GB (GeForce 320.18)
Audio           On-Board Creative X-Fi Audio
Storage         Kingston HyperX 240GB Solid-State Drive
Power Supply    Corsair AX1200
Chassis         Corsair Obsidian 700D Full-Tower
Cooling         Corsair H70 Liquid Cooler
Et cetera       Windows 7 Professional 64-bit

When preparing our testbeds for any type of performance testing, we follow these guidelines:

General Guidelines

To aid in keeping results accurate and repeatable, we prevent certain services in Windows 7 from starting up at boot. These services have a tendency to start up in the background without notice, potentially skewing test results. For example, disabling “Windows Search” turns off the OS’ indexing, which can at times utilize the hard drive and memory more than we’d like.

The services we disable are:

Game Tests: Battlefield 3, DiRT: Showdown

Battlefield 3 is a rare treat when it comes to PC gaming. Rather than develop the game for the consoles first and then port over to the PC, DICE built the game with the PC in mind from the get-go. It’s graphically one of the most impressive games ever created, so it’s of little surprise that it finds itself in our testing.

Battlefield 3

Manual Run-through: Operation Guillotine (chapter 5) is the basis for our benchmarking here, as it features a lot of smoke, water, debris and is reliable to benchmark repeatedly. Our run starts us at the top of a hill, where we immediately rise up and run down it. We make our way down to the riverbed below, and end our run once we hit the first set of stairs.

NVIDIA GeForce GTX 780 - Battlefield 3 (1920x1080)

NVIDIA GeForce GTX 780 - Battlefield 3 (4098x768)

NVIDIA GeForce GTX 780 - Battlefield 3 (5760x1080)

What’s not too surprising here is seeing NVIDIA’s GTX 780 in front. What is surprising is seeing AMD’s HD 7970 GHz right behind it. When that card first launched, it didn’t perform quite this well, so it just goes to show how much improvement a GPU can see over time from drivers alone. It can be as effective as overclocking, if not more so.

In both of our multi-monitor tests, the GTX 780 outperforms the GTX 680 by 30%; at 1080p, it shows a 27% improvement.

DiRT: Showdown

For so many reasons, the DiRT series is one of the best out there for racing fans. Each game offers outstanding graphics and audio, excellent control and environments that are way too much fun to tear up. Showdown is an interesting beast, as it features destructive racing, but as we discovered in our review, it too is a ton of fun.

DiRT: Showdown

Manual Run-through: In our search for the perfect Showdown track to test with, we found that any of the snow levels offered the greatest stress on a GPU. The specific track we chose is the second race in the second tier, taking place in Colorado. We begin our FPS recording as soon as the countdown to the race begins, and end it as soon as we hit the finish line at the end of the three-lap race.

NVIDIA GeForce GTX 780 - DiRT: Showdown (1920x1080)

NVIDIA GeForce GTX 780 - DiRT: Showdown (4098x768)

NVIDIA GeForce GTX 780 - DiRT: Showdown (5760x1080)

Aha! The first title to rain on NVIDIA’s parade is DiRT: Showdown. Across both of the resolutions that include the HD 7970, it either keeps close to the GTX 780 or surpasses it – both times, ever-so-slightly. Given this is a game with AMD’s logo all over it, it’s of little surprise that the HD 7970 is as good as it is here. We’ll see the favor returned soon with games that have NVIDIA badges on them. Can’t we just get some neutral gaming around here?

Game Tests: Metro 2033, Sleeping Dogs

One of the more popular Internet memes for the past couple of years has been, “Can it run Crysis?”, but as soon as Metro 2033 launched, that’s a meme that should have died. Metro 2033 is without question one of the beefiest games on the market, and only just recently have GPUs been released that can allow the game to run in its DX11 mode at modest resolutions.

Metro 2033

Manual Run-through: The level we use for testing is part of chapter 4, called “Child”, where we must follow a linear path through multiple corridors until we reach our end point, which takes a total of about 90 seconds. Please note that due to the reason mentioned above, we test this game in DX10 mode, as DX11 simply isn’t that realistic from a performance standpoint.

NVIDIA GeForce GTX 780 - Metro 2033 (1920x1080)

NVIDIA GeForce GTX 780 - Metro 2033 (5760x1080)

We’ve been benchmarking with Metro 2033 for nearly three years, and for most games, that’d be ridiculous. But as you can see, the game is still hardcore on today’s best GPUs. It allows the GTX 780 to churn away at a mere 86 FPS at 1080p. That’s a good value, sure, but again, this is a three-year-old game, and this is High detail – not Very High. Soon, we’ll likely replace this game with Metro: Last Light, as it also manages to punish today’s hardware quite well, and will likely do so for a while.

Here, NVIDIA’s GTX 780 keeps clear ahead of everything else, exhibiting a staggering 55% performance increase at 1080p versus the GTX 680. It also shows a 23% increase over AMD’s HD 7970 GHz. 

At 5760×1080 (the game would not run at 4098×768 for some reason), the GTX 680 and HD 7970 go head-to-head. Seriously, just look at those results. If I have one thing to say about this, it’s “dat 384-bit”. It’s important to note that while the 1080p result was captured with our usual manual benchmark, the 5760×1080 used the built-in benchmark for the sake of ease (not only would the game refuse to run at 4098×768, it also gave me hassle when trying to change the in-game settings at all while using multi-monitor).

Sleeping Dogs

Many have called Sleeping Dogs the “Asian Grand Theft Auto”, but the game does a lot differently to help it stand out from the crowd. In lieu of supplying the player with a gazillion guns, Sleeping Dogs focuses heavily on hand-to-hand combat. There are also many collectibles that can be found to help upgrade your character and unlock special fighting abilities – and if you happen to enjoy an Asian atmosphere, this is one tree you’ll want to bark up.

Sleeping Dogs

Manual Run-through: Our run here takes place during the chapter “Amanda”, on a dark, dank night. Our saved game begins us at the first apartment in the game (in North Point), though that’s not where we begin capturing our framerate. Instead, we walk outside and request our motorcycle from the garage. Once set, we begin recording framerates and drive along a specific path all the way to Aberdeen, which takes about two minutes.

NVIDIA GeForce GTX 780 - Sleeping Dogs (1920x1080)

NVIDIA GeForce GTX 780 - Sleeping Dogs (4098x768)

NVIDIA GeForce GTX 780 - Sleeping Dogs (5760x1080)

At 1080p, the HD 7970 GHz once again managed to outperform NVIDIA’s latest GPU, but things swung back in NVIDIA’s favor in multi-monitor. Continuing the theme, the GTX 780 storms past the GTX 680, with a 25.5% boost.

Game Tests: The Elder Scrolls V: Skyrim, Total War: SHOGUN 2

Of all the games we test with in our current suite, there’s none more likely to suck hundreds of hours out of your life than Skyrim. An expansive world, in-depth game mechanics, and the feeling that there’s always something to do… it’s no wonder the game has hit the right mark with so many people. While not the most graphically-intensive game, we like to test with it due to its popularity and the fact that it scales well in performance.

The Elder Scrolls V: Skyrim

Manual Run-through: From the entry point in Markarth, our path leads us around the entire city, ultimately bringing us back to where we started.

NVIDIA GeForce GTX 780 - The Elder Scrolls V: Skyrim (1920x1080)

NVIDIA GeForce GTX 780 - The Elder Scrolls V: Skyrim (4098x768)

NVIDIA GeForce GTX 780 - The Elder Scrolls V: Skyrim (5760x1080)

Much like how DiRT: Showdown tends to run quite well on AMD hardware, Skyrim is much the same for NVIDIA, as is proven here. Despite the HD 7970 GHz keeping up with the GTX 780 well in most of our other tests, it falls behind even the GTX 680 here – except at 5760×1080 (and likely 4098×768, if I were able to run that resolution on the HD 7970). It’s certainly not the best game to bench with given how light it is on the GPU, but Skyrim still scales quite well.

Total War: SHOGUN 2

Strategy games are well-known for pushing the limits of any system, and few others do this as well as Total War: SHOGUN 2. It fully supports DX11, has huge battlefields to oversee with hundreds or thousands of units, and a ton of graphics options to adjust. It’s quite simply a beast of a game.

Total War: SHOGUN 2

Manual Run-through: While we normally dislike timedemos, strategy games such as this are very difficult to benchmark reliably, so we’ve opted to use the built-in benchmark instead.

NVIDIA GeForce GTX 780 - Total War: SHOGUN 2 (1920x1080)

NVIDIA GeForce GTX 780 - Total War: SHOGUN 2 (4098x768)

NVIDIA GeForce GTX 780 - Total War: SHOGUN 2 (5760x1080)

Finally, the game NVIDIA’s been long waiting for. Of all the games we bench with, I believe this to be the only one that doesn’t have an affiliation with either AMD or NVIDIA. You’d almost think otherwise given just how well it runs on NVIDIA hardware.

Game Tests: Borderlands 2, Metro: Last Light & BioShock Infinite

For this article, I picked three special games to benchmark thanks to their popularity: Borderlands 2, Metro: Last Light and BioShock Infinite. I had hoped to get a couple of others in, such as Hitman: Absolution and Far Cry 3, but due to various hassles, I ended up scrapping them. However, the games that are here are quite good from both a graphics and gameplay standpoint.

Borderlands 2

For this game, I wanted to test with a place that featured a lot of PhysX, to see how the GTX 780 compared specifically to the GTX 680. So, I loaded up Frostburn Canyon, took a left and went up and over the bridge to do battle with a number of enemies there. For some reason, this is an area that utilizes PhysX a lot more than others, so it’s a great place to take in all of the eye candy. Details are completely maxed out for this game.

NVIDIA GeForce GTX 780 - Borderlands 2 (1920x1080)

NVIDIA GeForce GTX 780 - Borderlands 2 (4098x768)

NVIDIA GeForce GTX 780 - Borderlands 2 (5760x1080)

Well, these results are a bit interesting. With the game’s heavy use of PhysX, I had assumed the HD 7970 GHz would fall far behind, but not so. At our max resolution, it even manages to surpass the performance of the GTX 780. Not bad! Great performance all-around for all of the cards, though; not surprisingly, the GTX 780 reigns supreme overall.

Metro: Last Light

While I normally shy away from built-in benchmarks or timedemos, I’ve opted to use them for this game, and the next, on account of me never having played either (thus, no saved game). Fortunately, the built-in benchmark with Metro: Last Light is quite good, and thorough. Graphics options are mostly maxed, with the overall detail level sitting at High (not Very High), along with the Tessellation at Normal.

NVIDIA GeForce GTX 780 - Metro: Last Light (1920x1080)

NVIDIA GeForce GTX 780 - Metro: Last Light (4098x768)

NVIDIA GeForce GTX 780 - Metro: Last Light (5760x1080)

Domination by the GTX 780 continues here, with AMD’s HD 7970 GHz actually falling a fair bit behind. Given the GTX 780’s $649 price tag, it’s safe to say that at 40 FPS and 1080p, Metro: Last Light is going to remain in our test suite for a little while.

BioShock Infinite

Jamie raved over this game and for good reason. It’s an epic adventure, and it has some stellar graphics, chock-full of eye-candy. Of course, it was worth benchmarking, especially since the game perfectly supports multi-monitor out-of-the-box (cheers, Irrational Games!) Here, we simply use the built-in benchmark utility, with the graphics options all cranked in-game.

NVIDIA GeForce GTX 780 - BioShock Infinite (1920x1080)

NVIDIA GeForce GTX 780 - BioShock Infinite (4098x768)

NVIDIA GeForce GTX 780 - BioShock Infinite (5760x1080)

Given the battle we’ve seen between these GPUs up to this point, I’m a little surprised AMD’s HD 7970 GHz didn’t come closer to NVIDIA’s cards here, given that BioShock Infinite is a title the company is heavily promoting. Where it did crawl ahead was at 5760×1080. Once again, that 384-bit memory bus has allowed the card to breathe.

Synthetic Tests: Futuremark 3DMark, 3DMark 11, Unigine Heaven 3.0

Although we generally shun automated gaming benchmarks, we do like to run at least one to see how our GPUs scale in a ‘timedemo’-type scenario. Futuremark’s 3DMark 11 is without question the best such test on the market, and it’s a joy to use and watch. The folks at Futuremark are experts at what they do, and they really know how to push that hardware of yours to its limit.

Futuremark 3DMark 11

Similar to a real game, 3DMark 11 offers many configuration options, although many (including us) prefer to stick to the preset profiles, which include Performance and Extreme. Depending on which one you choose, the graphics options are tweaked accordingly, as is the resolution. As you’d expect, the better the profile, the more intensive the test. The benchmark doesn’t natively support 2560×1600, so to benchmark with that, we choose the Extreme profile and simply change the resolution.

NVIDIA GeForce GTX 780 - 3DMark 11 Performance

NVIDIA GeForce GTX 780 - 3DMark 11 Extreme

NVIDIA GeForce GTX 780 - 3DMark (2013)

Taking only the GPU scores into consideration, 3DMark 11 puts the GTX 780 at 35% faster than the GTX 680 and HD 7970 GHz. 3DMark (2013) scales that back a bit, with the GTX 780 still dominating, but only with a 15.5% advantage.

Unigine Heaven 3.0

While Futuremark is a well-established name where PC benchmarking is concerned, Unigine is only just gaining that kind of exposure. The company’s main focus isn’t benchmarks, but rather its cross-platform game engine, which it licenses out to other developers, and also its own games, such as a gorgeous post-apocalyptic oil strategy game. The company’s benchmarks are simply a by-product of its game engine.

Unigine Heaven 3.0

The biggest reason that the company’s “Heaven” benchmark grew in popularity rather quickly is that both AMD and NVIDIA promoted it for its heavy use of tessellation, a key DirectX 11 feature. Like 3DMark Vantage, the benchmark here is overkill by design, so results here aren’t going to directly correlate with real gameplay. Rather, they showcase which card models can better handle both DX11 and its GPU-bogging features.

NVIDIA GeForce GTX 780 - Unigine Heaven 3.0 (1920x1080)

Wrapping up our performance-testing, NVIDIA’s GTX 780 outperforms its GTX 680 by 17.6% and the HD 7970 GHz by 19.8%.

Temperatures & Power

To test graphics cards for both their power consumption and temperature at load, we utilize a couple of different tools. On the hardware side, we use a trusty Kill-a-Watt power monitor which our GPU testing machine plugs directly into. For software, we use Futuremark’s 3DMark 11 to stress-test the card, and techPowerUp’s GPU-Z to monitor and record the temperatures.

To test, the general area around the chassis is checked with a temperature gun, with the average temperature recorded (and thus noted in brackets next to the card name in the first graph below). Once that’s established, the PC is turned on and left to sit idle for five minutes. At this point, GPU-Z is opened along with 3DMark 11. We then kick off an Extreme run of 3DMark and immediately begin monitoring the Kill-a-Watt for the peak wattage reached. We only monitor the Kill-a-Watt during the first two tests, as we’ve found that’s where the peak is always attained.
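
For those who’d like to replicate the temperature side of this at home, here’s a rough Python sketch that pulls the peak GPU temperature out of a GPU-Z sensor log. It assumes the log is comma-separated text with a column whose header contains “GPU Temperature”; column names can differ between GPU-Z versions, and the file name below is hypothetical, so check your own log before trusting it.

    # Rough sketch: find the peak GPU temperature in a GPU-Z sensor log.
    # Assumes a comma-separated log with a "GPU Temperature" column (verify
    # against your own log; headers can vary between GPU-Z versions).
    import csv

    def peak_gpu_temp(log_path):
        with open(log_path, newline="", encoding="utf-8", errors="ignore") as f:
            reader = csv.reader(f)
            header = [col.strip() for col in next(reader)]
            temp_col = next(i for i, col in enumerate(header) if "GPU Temperature" in col)
            temps = []
            for row in reader:
                try:
                    temps.append(float(row[temp_col].strip()))
                except (ValueError, IndexError):
                    continue  # skip blank or malformed lines
        return max(temps)

    print(peak_gpu_temp("GPU-Z Sensor Log.txt"))  # hypothetical file name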

Note: (xx.x°C) refers to ambient temperature in our charts.

NVIDIA GeForce GTX 780 - Temperatures

NVIDIA GeForce GTX 780 - Power Consumption

NVIDIA’s GTX 780, as mentioned in the intro, has a higher TDP than the GTX 680 (it’s 250W vs. 195W) – an increase of 55W. In our testing, that value was almost perfectly reflected; we saw +58W. To be that spot-on with the TDP is rare; there’s definitely some truth to NVIDIA’s marketing here.

Beating both the GTX 680 and GTX 780 on both power and temperatures is AMD’s HD 7970 GHz. AMD’s cooler isn’t nearly as cool-looking (in my opinion) as the one found on the GTX 780, but it does seem to be efficient.

With that, it’s onward to the final thoughts.

Final Thoughts

When NVIDIA released its GeForce Titan back in February, I admit that I figured it was a sign that we wouldn’t see the GeForce 700 series for quite a while. Well, I was wrong, and I’m glad I was. The GTX 780 is based on the same GK110 architecture as Titan, so by buying one of these GPUs, you’re effectively buying a $649 Titan (versus a $1,000 one).

This is where things become a little confusing. Titan and the GTX 780 are both based on the exact same GK110 architecture, with Titan bundling in 17% more cores and twice the memory. Are those benefits worth $350? It’s hard to say, and it’s also hard to say whether or not Titan will remain superior for non-gaming scenarios. It was born of supercomputer intentions, after all, whereas the GTX 780 targets gamers, plain and simple. Unfortunately, we don’t have a Titan here, so that sort of comparison testing will have to wait.

At the current time, 3GB graphics cards should be the norm, at least where high-end setups are concerned. I haven’t personally found games that would exceed 2GB (though I haven’t specifically tested for this in depth), but I have noticed some come close. I do see these memory requirements increasing in the future, especially for those with multi-monitor setups. For single-monitor (>1080p), I have a feeling that 3GB will be suitable for quite a while. Also, on the GPGPU front, if you can specifically exploit Titan’s larger framebuffer, that’s something to consider.

NVIDIA GeForce GTX 780

Let’s talk about another comparison, this time against the GTX 680. In the intro, I mentioned that NVIDIA pegs the 780 as 33% faster than the 680; in our 3DMark Fire Strike test, we hit 29%. For games, the average increase we saw hovered around 15~25%, with Metro 2033 being an insane exception, showcasing a 55% boost.
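
As a rough sanity check (and nothing more rigorous than that), multiplying cores by base clock from the spec table earlier lands in the same neighborhood as our measured Fire Strike gap. It ignores boost clocks, memory bandwidth and architectural differences entirely, so treat it as a ballpark only:

    # Ballpark shader throughput: cores x base clock, from the spec table.
    gtx_680_throughput = 1536 * 1006
    gtx_780_throughput = 2304 * 863
    uplift = (gtx_780_throughput / gtx_680_throughput - 1) * 100
    print(f"Spec-based uplift: +{uplift:.0f}%")   # ~+29%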

Had we adopted more of the games NVIDIA itself tested with, we could have seen the average get closer to its reported 33%. Here are a few examples of games the company tested and the performance increases it reported over the HD 7970 GHz (we can’t validate these, of course): Crysis 3 (+20%), Far Cry 3 (+33%), Batman: Arkham City (+29%), Assassin’s Creed 3 (+32%) and Max Payne 3.

Once our GPU testbench is updated, I plan to revisit this card and compare it to the HD 7970 GHz (and other cards) once again, but with some additional games – the requirement being that they are still relevant. AMD also has a couple of games that it’d like us to test with, but we haven’t been able to up to this point (Tomb Raider, Hitman: Absolution, namely).

NVIDIA GeForce GTX 780 PCB

With its GeForce GTX 780, NVIDIA can again claim that it has the world’s fastest single-GPU solution for under $1,000. At its price-point of $649, the GTX 780 doesn’t compete with anything directly. AMD’s Radeon HD 7970 GHz Edition can be had fairly easily for $450, while the GTX 680 sits closer to $500. For its $200 premium, is NVIDIA’s GTX 780 worth it?

That depends on your value perception. The GTX 780 didn’t often blow away the HD 7970 GHz as I’m sure NVIDIA had hoped, although again, we didn’t test with a couple of notable games I would have liked (where NVIDIA’s testing did see bigger gains). NVIDIA also has a couple of perks that some might like, such as PhysX and CUDA.

Here’s something else to consider. The main reason AMD’s HD 7970 GHz was able to keep up with NVIDIA’s GTX 780 so well is the progression of the Catalyst driver. When I benchmarked AMD’s card last September, it scored 2,575 in the GPU test of 3DMark 11 (Extreme). With this week’s testing, that number became 2,999 – a 16.5% increase. That’s stark. I use 3DMark as an example, but comparing games, increases of 7~15% were also seen.
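
That driver-progression math, for anyone who wants to check it:

    # 3DMark 11 (Extreme) GPU scores from the paragraph above.
    old_score, new_score = 2575, 2999
    print(f"+{(new_score / old_score - 1) * 100:.1f}%")   # +16.5%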

Experience GeForce

For all of the 1080p results in this review, aside from those from the three special games, I included GTX 680 results using both the 306 and 320 drivers. At first, I did this just for fun, to show how well drivers can improve performance over time. But after seeing the staggering increases of the HD 7970 GHz, I realized it was important to include. If you go back and look at some of those results, you’ll be able to see a trend – driver releases continue to increase performance for a given GPU well past its launch. Today’s GTX 780 performance isn’t going to be next year’s GTX 780 performance, whereas the HD 7970 GHz has likely almost reached its peak (that family of cards has been available for 16 months, after all).

Still, is that worth $200 over the HD 7970 GHz? That’s for you to decide.

As it is, NVIDIA has a very attractive offering in its GeForce GTX 780. While it carries a $150 launch premium over last year’s GTX 680, it offers great performance, runs super-quiet even at full load, and has other niceties such as GPU Boost 2.0, with much-improved variables for overclocking. It’s also the best-looking card (again, to me) ever produced, if that counts for anything.
