NVIDIA’s next-gen GeForce series is here, and it brings with it a slew of new features and enhancements worth knowing about. Based on Maxwell, the GTX 900 series delivers much-improved performance-per-watt, with the GTX 980 in particular outperforming the 780 Ti – at a TDP that’s 85W lower. You read that right. Let’s dig in.
Introduction
Seven months ago, NVIDIA released a graphics card that captivated me. It was called the GeForce GTX 750 Ti, and it cost a mere $150. Admittedly, it looked like a card that cost even less, and it didn’t even have a power connector. So at first, I wasn’t sure what to expect, but after diving into testing, I quickly realized that the modest 750 Ti had some serious brawn.
Under the hood of that 750 Ti? A chip packed with NVIDIA’s latest architecture, Maxwell. After seeing that the lowly 750 Ti could handle all of today’s games at 1080p and with decent framerates, I couldn’t help but wonder what Maxwell could do for the higher-end part of NVIDIA’s product stack. I didn’t realize it’d take more than half a year to find out, but nonetheless, the wait is over.
Today, NVIDIA’s releasing two Maxwell-equipped graphics cards: the $329 GTX 970 and the $549 GTX 980. We’re going to kick things off with a look at the GTX 980, and you can expect our look at the GTX 970 to follow shortly.
Anyone familiar with NVIDIA’s last-gen high-end cards will immediately recognize the cooler used here. Without a doubt, I consider this to be the best graphics card cooler ever built, so I’m glad NVIDIA didn’t forgo it this round. While it’s sometimes nice to see a fresh design each year, one simply wasn’t needed here, and because of that decision, any NVIDIA fan who skipped the 700 series now has a second chance at owning a card featuring this hardcore-looking cooler.
But while the 900 series cooler looks like last-gen’s on the surface, there are a couple of improvements, both under the hood and on top of it. For starters, both the 980 and 970 have a backplate for improved cooling, and on this backplate is a removable section that further aids airflow when two cards are sandwiched together in SLI.
The photo below shows a new triangular design that graces the once-barren area of the panel, but that’s not what’s important here. What is important is that NVIDIA has culled one of the DVI ports on its reference card in order to make room for two more DisplayPort connectors. That gives the 980 a total of 3x DisplayPort, 1x HDMI, and 1x DVI.
For those planning to go the multi-monitor route with displays that all use DisplayPort, this is fine, but that’s clearly a small percentage of people. Because of this, I’d wager that most vendors will opt for a different collection of connectors. For those that do stick with the 3x DP configuration, though, it seems likely that a DisplayPort-to-DVI adapter would be included in the box.
It should come as a surprise to no one that Maxwell delivers higher performance-per-watt than Kepler, but because of that, the table below isn’t quite as useful this time around as it normally is. Based on core counts alone, it looks like the TITAN Black would blow the GTX 980 out of the water, but that’s not the case. Instead, the only “on paper” spec worth noting here is that both the 980 and 970 (finally) include a 4GB framebuffer. Don’t let that 256-bit interface fool you, either; I’ll cover on the next page why, despite its narrower bus, Maxwell is more memory-efficient than Kepler.
NVIDIA GeForce Series     Cores   Core MHz   Memory    Mem MHz   Mem Bus   TDP
GeForce GTX 980           2048    1126       4096MB    7000      256-bit   165W
GeForce GTX 970           1664    1050       4096MB    7000      256-bit   145W
GeForce GTX TITAN Black   2880    889        6144MB    7000      384-bit   250W
GeForce GTX 780 Ti        2880    875        3072MB    7000      384-bit   250W
GeForce GTX 780           2304    863        3072MB    6008      384-bit   250W
GeForce GTX 770           1536    1046       2048MB    7010      256-bit   230W
GeForce GTX 760           1152    980        2048MB    6008      256-bit   170W
GeForce GTX 750 Ti        640     1020       2048MB    5400      128-bit   60W
GeForce GTX 750           512     1020       2048MB    5000      128-bit   55W
Note that with this launch, NVIDIA is officially discontinuing the GTX 780 Ti, 780, and 770.
Here’s the table that really matters:
NVIDIA GeForce Series     GFLOPs   TDP    GFLOPs Per Watt
GeForce GTX 980           5,000    165W   30
GeForce GTX 970           4,000    145W   27.5
GeForce GTX TITAN Black   5,100    250W   20
GeForce GTX 780 Ti        5,050    250W   20
GeForce GTX 780           4,000    250W   16
GeForce GTX 770           4,000    230W   17
GeForce GTX 680           3,000    195W   15
With its peak performance of 5 TFLOPs and its power rating of 165W, the GTX 980 delivers 30 GFLOPs-per-watt. That might not seem like a big deal without some context, but it’s 50% more performance-per-watt than the GTX 780 Ti, and 100% more than the GTX 680. That’s quite an impressive gain in just two-and-a-half years.
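For anyone who wants to sanity-check those ratios, here’s a quick back-of-the-envelope calculation using the table’s numbers (rated peaks only, so treat the output as approximate):

```python
# Quick sanity check of the GFLOPs-per-watt figures above. Values are the
# rated peak GFLOPs and TDPs from the table; real-world numbers will vary
# with boost clocks and actual board power.
cards = {
    "GTX 980":    (5000, 165),
    "GTX 970":    (4000, 145),
    "GTX 780 Ti": (5050, 250),
    "GTX 680":    (3000, 195),
}

for name, (gflops, tdp) in cards.items():
    print(f"{name}: {gflops / tdp:.1f} GFLOPs per watt")

# GTX 980 vs 780 Ti: 30.3 / 20.2 ≈ 1.5  -> ~50% better perf-per-watt
# GTX 980 vs 680:    30.3 / 15.4 ≈ 2.0  -> ~100% better perf-per-watt
```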
Even a table like this doesn’t tell the whole performance story, though. Take, for example, the fact that the 780 Ti and 980 offer roughly the same peak throughput on paper. What we’ll actually see in our testing is that the 980 is about 10% faster than the 780 Ti in most cases. Some of that gain comes as a direct result of the Maxwell architecture, but the higher clocks of the 980 don’t hurt either.
Due to time constraints and having felt under the weather all week, I was unable to fully benchmark the GTX 970 in time for launch, but I wanted to give a tease of the model we’ll soon be talking about. It’s from ASUS, and it’s called the “STRIX” edition. The name might be a little strange (at least, it is to me), but this is one card I can’t wait to dig into:
Every single graphics card vendor on earth touts its cards as being “quiet”, but with STRIX, ASUS might actually be able to live up to that promise. The box mentions “0dB silent gaming”, and while Maxwell’s power efficiency no doubt deserves some of the credit for making that possible, the interesting-looking fans play a role as well, as does ASUS’ own blend of power optimizations.
You’re probably all desperate to see what the GTX 980 brings to the table in terms of performance, but first, we’ll take a look at Maxwell’s most important features. If you just want to see performance, hop on over to page four.
Yeah, after seeing what this is capable of, I am looking forward to wrapping up benchmarking on the 970. Two of those in SLI would run you about $660… and the power would be immense.
Models are popping up online all over the place. Most of what I see at the moment are cards that match the reference. I am sure we’ll get inundated with third-party cards soon.
JD Kane
I wonder how soon we’ll start to see full-coverage water blocks for these…
It’s been a while since I’ve been this excited over a new GPU.
The Focus Elf
Same here JD! Drooling over here.
Jay Jardin
4GB is bullshit and not enough to max out old crysis on 4k and more. This is going to be another case of wait a year to buy the 6GB card like they did with the 780’s. As long as there are only two companies on this business we are never going to get what we want. Potato resolution users rejoice.
I just ran Crysis 3 with Very High detail levels @ 4K and it used 2.3GB of VRAM. The same detail levels topped out at 2.0GB at the multi-monitor resolution of 5760×1080.
Jay Jardin
don’t be shy. Max all the settings and watch your GPU be bored and your screen do 1 fps because your vram chokes. They are going to release a 6gb card a year from now like they did with the 780. Those bastards.
As I said, I used Very High detail settings – as in the presets. In order to increase VRAM further, I’d have to move up to a different anti-aliasing setting (from FXAA), which isn’t worth it.
Also, at that 4K resolution, I was getting 20 FPS with that Very High detail. The VRAM isn’t the issue; available horsepower is.
Jay Jardin
It’s not worth it to you. You are making this more difficult than needs be. Grab crysis, take every setting and max it out individually. Your computer will choke and not because of GPU power but because you ran out of vram to power the 4k. Hence why they released 6GB cards. Now they are releasing 4GB cards again for people like you that don’t mind. When I spend 1100 for just two video cards I expect them to play at max settings not console preset.
I don’t believe in overkill just for the sake of overkill. If someone purchases a second GPU, it should give them some obvious gain, not just some e-peen satisfaction. With 4K, you’re already cramming an absurd number of pixels into a ~27-inch display – so why on earth would FXAA not suffice? That’s an effective AA mode at lower resolutions; it’d be extremely suitable for 4K.
I’d take 1440p displays in Surround with very high detail levels over a 4K display with absurd AA levels any day. At least then I’d be improving immersion and my experience overall.
Jay Jardin
Do we agree that 4gb of vram is not enough to handle old crysis 3 4k maxed out yet? What you call overkill I call making the best out of high end hardware. 27″ is too tiny for 4k I am guessing you noticed when you tried to read the first webpage and had to zoom in. If antialiasing is not worth to you I’ll have to take your word for it. I can tell the difference and it is glorious. Now I need for this bastards to stop messing around and ship 6gb cards like they did with the 780. In a few months they will release 6gb cards so we have to purchase twice. It’s playing dirty.
Alright – I ran the game maxed-out (4x TXAA) @ 4K and 3.4GB of VRAM was used (over a painful 10 minute run @ about 13 FPS average on a GTX 970 – ow). So, I concede that if you want to use the highest AA available at 4K, more than 4GB would be welcomed. But, it must be said that given I hit 20 FPS with FXAA @ Very High 4K with a 980, at least 3x 980 would be needed for 60 FPS – and that’s still not taking into account the TXAA-esque AA you’re after.
“It’s playing dirty.”
I can understand why you’d think that, but TITAN Black is still considered a current card. NVIDIA wasn’t about to bastardize sales of that, as unfortunate as it is. I wouldn’t be surprised to see a TITAN replacement in six months or so that gives us 8GB of VRAM and around 2,880 cores. Then maybe NVIDIA will give you the 6GB on the regular 980 that you want so much.
Jay Jardin
The 780 was released with 6GB. You are telling me that you do not find the “upgrade” 980 with 4GB strange?
You’re confusing NVIDIA with its partners. Companies like EVGA released 6GB 780s nearly a year after the launch of the 3GB model. NVIDIA prevented the same companies from releasing a 6GB 780 Ti because it’d essentially match the TITAN Black – not counting that card’s enhanced double-precision performance.
You’re comparing what NVIDIA’s done to what third-party partners did – that’s nonsense.
Jay Jardin
“third party” is the only party that releases nvidia cards. Nvidia doesn’t have a direct to consumer distribution. The fact that Asus, Palit, EVGA and maybe more released a 6GB card shows that I am not alone on the 4GB vram is too little spectrum. If you want non-sense look at the introduction of the Titan series. That should be the new x80 line. Adjusted for inflation to around 700-750 a piece.
As I said, those third-parties released 6GB cards long after the initial launch of the 780. It’s no surprise that the 980/970 didn’t launch with 6GB models since the demand is almost non-existent.
Jay Jardin
The demand is as big as the demand for 980’s. We are not paying 1000 plus just so we can play at stock settings.
Oh – I agree. But it’s not as though 6GB cards are not coming; they just were not available for launch. In a perfect world, the higher-density cards would have shown up at launch, but the demand is so low that it makes business sense to focus on the core models first.
I remember a rumor that said this generation should have 8GB cards as well. I wish that meant that PC games would actually start being able to take advantage of the higher VRAM. Anti-aliasing shouldn’t be the only reason higher VRAM is needed.
Jay Jardin
Here. Shadow of Mordor 6GB of vram. I just ordered the 4GB 980’s today. :/
Oh, but he goes into that technical data? I’m not huge on watching videos, and definitely didn’t want to spend 50 minutes looking through that one. Installing now; I’ll test later when I get a chance.
Jay Jardin
You have to install the 9GB ultra texture pack otherwise when you select ultra you are just playing on High.
Cheers, I would have overlooked that. Had to clear off some stuff on that drive before I could even install it. 35GB base install… damn.
Jay Jardin
What tool are you going to use to know the amount of vram use? Also can you do me a favor and put EVERY setting to max. I want to know how stupid my 1200 dollar purchase of 4gb cards was today.
I don’t have a second GTX 980 (yet), so testing will just be done with one. And I read somewhere that the game has a built-in benchmark, so I’ll just use that (I don’t have interest in the game, though it does look like it will be in our next GPU suite update, given the obvious). Also for VRAM usage I’ll use AIDA64, since it spits out the results in a nice format.
Jay Jardin
This game does not do SLI. Also vram doesn’t stack so it doesn’t matter. I already knew 4GB was not enough from playing Crysis 3. BTW I had your cpu, it was super fun. I sold it to get an 8 core xeon.
This CPU has served me well. I just inherit whatever I have in the GPU test rig after I retire a platform. I was going to upgrade the 4960X to the 5960X in that machine, but when it comes to game benchmarking, I’d rather stick to the 4.5GHz overclock on the 4960X to feel safer about the results.
Of course… that’s until games come out that saturate eight cores.
As for the game, what I find odd is that it’s NVIDIA-sponsored, but there’s no mention of it on GeForce.com. There’s a special AO setting, but that’s about it. I figured there’d be at least TXAA in here or something, but no.
I’ll soon be getting a Quadro K5200 in, and while it’s not a gaming card, it -does- have an 8GB framebuffer. Once it gets here I’ll revisit this game to see if it manages to saturate that as well.
Jay Jardin
After over 10 years of buying 1000 dollar CPU’s I will not do it again. I just bought parts for my x99 rig. I am getting the fastest CPU (that is the cheapest one) and using that extra money to start running 3 cards instead of 2. I rarely edit video so I’ll rather have the extra 10-30fps from a third GPU than another extreme/xeon.
$1,000 CPUs were a little easier to justify years ago, but nowadays, most people can get by with midrange parts just fine, and then overclock them if need be. The 5960X is the first $1K chip in a while to potentially be worth it, because you just can’t get that kind of throughput anywhere else. Before that, the 980X was arguably worth it too, since it was the first six-core. The ones in between were such minor advancements that they just weren’t exciting.
So, I’m with you – save $500 on the CPU and add another GPU. The CPU is not the bottleneck right now while the GPU is, so it makes all the sense in the world to maximize your graphics output.
TG’s FB link is on the right side of the site; if you mean my own, I can be found in the comments on that page – just click my name.
Jay Jardin
I also had a 980x but that is really old from the x58 days right? I meant your own fb. I am a bit thick I didn’t realize you worked for a review website. My current bottleneck is my poor decision of purchasing gtx 980’s instead of 6gb gtx 780’s. I also do not have a 60fps 4k since none of them exist at an acceptable size until next year when seiki releases one. On my 4k monitor I am stuck with 30fps that I use on console ports like watch dogs etc and 120fps if I switch to 1080p.
You didn’t realize the person talking to you had the same name as the author of the review? ;-)
Yes, 980X was X58. I can barely even remember it now, because tech moves so fast. I remember being stoked at the fact that it was a six-core CPU though. I punned it and said it delivered “Sick Scores”.
30 FPS on a 4K is harsh, as is the wait for larger 970/980s. I’d ask NVIDIA for a timeline, but I know I won’t get one (it doesn’t want to bastardize current models). Your best solution, even though it’s not ideal, is to offload your cards when the 6~8GB models finally do get pushed out the door. Chances are they will still be worth a pretty penny.
Jay Jardin
I did not realize you were the author. I read headline first. Then charts. Then scan for content pending to charts. Then I comment. 30fps for console ports is ideal, mostly because they tend to be cinematic and that is the frame rate they were designed for. At the same time when I got my 4k monitor it was during the reign of the 780 and two 780’s could barely reach 30fps on the games I like (metro last light for example). The 120/60fps I got on 1080p do not feel as good. I can’t sell the 980’s I just lost a lot of money by buying 3 290x cards that perform very poorly and had the resale value of an american car. I will never touch AMD again. P.S. I have over 20 used 290’s to sell I wanted to see if there is a market for them where you live. At a steep discount.
“I just lost a lot of money by buying 3 290x cards that perform very poorly and had the resale value of an american car.”
For what it’s worth, the last bit made me laugh. Also, over 20x 290? That is unbelievable. You might want to hit up Reddit’s /r/hardwareswap section to see if you can offload a bunch of them. Sadly, it’s harder than ever to sell those given this new NVIDIA series, but I obviously don’t need to tell you that…
Jay Jardin
I need to make a correction here. VRAM is stacking on Shadow of Mordor. @4k resolution/HD textures and every setting all the way up is using 11000mb out of my 12000mb vram. So it would take 3 980’s to load those textures or 2 780 6GB cards.
I talked to NVIDIA about this, and was told that Mordor is reporting its VRAM usage incorrectly. Something I didn’t realize is that tools like GPU-Z and AIDA64 do not actually tap into the GPU itself to get these readings; they instead base the VRAM usage on what the apps are telling them. NVIDIA compared it to a real-life reporter who can report on something incorrectly because their source is likewise incorrect.
I guess it does make sense, because even though the game supposedly hogs the entire 4GB (or close to it) on this 980, the game never stutters. If the memory was truly being fully saturated, that wouldn’t be the case. I had just figured that the game used some sort of special caching, but NVIDIA’s final answer is that the game is simply reporting the VRAM usage incorrectly.
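If anyone wants to double-check what the driver itself reports, rather than what the game claims, NVIDIA’s NVML library exposes that directly. A minimal sketch using the pynvml Python bindings (assuming you have them installed – this is illustrative, not something from our test suite):

```python
# Minimal sketch: poll VRAM usage as the driver itself reports it, via NVML
# (assumes the pynvml bindings are installed). This reads the driver's own
# accounting rather than whatever the game chooses to tell monitoring apps.
import time
from pynvml import (nvmlInit, nvmlShutdown,
                    nvmlDeviceGetHandleByIndex, nvmlDeviceGetMemoryInfo)

nvmlInit()
gpu = nvmlDeviceGetHandleByIndex(0)          # first GPU in the system

try:
    while True:
        mem = nvmlDeviceGetMemoryInfo(gpu)   # total/used/free, in bytes
        print(f"VRAM used: {mem.used / 1024**2:.0f} MB "
              f"of {mem.total / 1024**2:.0f} MB")
        time.sleep(2)                        # sample every couple of seconds
except KeyboardInterrupt:
    nvmlShutdown()
```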
Kayden
Makes sense because the Ultra textures pack says it requires 6GB of VRAM and I cannot see why that would be necessary. Hell I don’t see how 4GB is recommended for High. I can make Skyrim look much better than that game and I won’t hit 3GB even on multi-monitor. I don’t generally run AA at that resolution, just so you know.
I am wondering if there’d be a difference in visual detail at all if the game is run on a 4GB card or an 8GB one. This is the kind of thing that would be extremely difficult to get a good answer from the devs for. I had a hard enough time coming to a conclusion with NVIDIA.
Kayden
I don’t see how there would be any difference tbh. Unless there isn’t enough memory to buffer say 4k textures but even that is hard to swallow with how this game looks. TBH I think AC4 looks better.
Jay Jardin
Let’s all sit here and criticize how the game looks by all that have never seen it at 4k, with AA and with ultra textures. That makes sense. @RobWilliamsTG:disqus 8GB useless for “mobile gpu”. There is a wave of portable desktops (they look like laptops) and have minuscule performance differences vs desktops (mostly because desktop users tend to only have 1 GPU). In six months or less nvidia will go back to 6GB on the same cards selling now. The 880 was supposed to have 8GB on the desktop as well but we got the finger instead.
“Let’s all sit here and criticize how the game looks by all that have never seen it at 4k, with AA and with ultra textures.”
4K isn’t that big of a deal… it just isn’t (it’s still 16:9 like 1080p/1440p… except it requires way more GPU horsepower). And resolution has nothing to do with being able to figure out whether one game looks better than another; if you compare two games side-by-side at the same resolution, you’re going to be able to reach the same conclusion.
I haven’t played either game enough (and Mordor not at all) to throw an opinion at which looks better. I just know that the quick 30-second benchmark Mordor offers looks GOOD. Good enough that I might actually try the game out at some point in the future, despite not knowing anything about the LOTR series.
On the mobile front, I can understand if Maxwell mobile GPUs have 4GB, since their performance is supposed to be about 80% of the desktop cards – meaning 1440p will be a real option on notebooks with a 980M. 8GB is just… nonsense.
I haven’t disagreed that NVIDIA could have released higher-density cards at launch, but it’s not something I’d criticize the company for. I still haven’t had undeniable proof that 4GB isn’t enough. We’ve already established that 4GB isn’t a limiting factor in Mordor at 4K. I know that Skyrim mods can easily hit the VRAM hard, but will that even surpass a legitimate 4GB?
Jay Jardin
“We’ve already established that 4GB isn’t a limiting factor in Mordor at 4K.” Assumption: You never played the game. You think the benchmark looks playable so the game is playable. It is not. “I still haven’t had undeniable proof that 4GB isn’t enough.” (You are in denial) Please turn AA on and say goodbye to your vram. I don’t lower my settings and pretend the game looks the same. “but will that even surpass a legitimate 4GB?” This one comes up over and over (from you). Mordor can do at least 8GB, I know some guy at nvidia told you it reported wrong but not by how much. Crysis 3 also does over 4GB of vram at 4k with all settings up. Same with Metro Last Light. Benchmarks are great but the real test is the game. A benchmark won’t tell you when the textures don’t load because you have as much vram on a gpu as 3 years ago. “On the mobile front… 8GB is just nonsense.” There are people with 4k laptops and some that play surround. A $2000-$3000 laptop is no joke. … and last “4K isn’t that big of a deal…” I have done nothing in gaming but pursue resolution for over 10 years. Packing more pixels in less space is an incredible achievement and quite noticeable. I have been playing with 3 monitors since the GTX 295 and dropped almost a year ago when I could replace it with a single 4k panel. I can’t see myself “arguing” about video gaming with someone that doesn’t see the benefits of 4 times 1080p. That one just leaves me speechless.
“Mordor can do at least 8GB, I know some guy at nvidia told you it reported wrong but not by how much.”
Because I simply don’t have the time or desire to test the game further at the moment, I’m choosing to trust the expert at NVIDIA. You mentioned AA; where is that setting? Did you force it through the control panel? If you’re experiencing issues in Mordor at 4K, such as slow-loading textures, that’s something I’d like to test out in the near future.
“I can’t see myself “arguing” about video gaming with someone that doesn’t see the benefits of 4 times 1080p. That one just leaves me speechless.”
I run 1440p… so yes, I see the benefits of higher resolutions. But to me there’s a point when things become “good enough”, and moving higher is overkill. 4K, to me, is overkill – a territory where e-peen and bragging rights are more important than the pixels themselves.
If the GPU horsepower required for 4K wasn’t so high, I’d be far more accepting of it. As it is, I find it a little silly to opt for a solution that requires double or quadruple the processing power when the result is the exact same 16:9 scene. I’d rather stick with a single 1440p display and not have to spend $2K on GPUs to compensate for the huge performance drop. Even more, I’d rather have 3x 1440p, because at least then I’d be seeing up to 3x the visible scene.
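To put rough numbers on that “double or quadruple” bit – raw pixel counts only, so just a first-order approximation of GPU load:

```python
# Raw pixel counts only – GPU load doesn't scale perfectly linearly with
# pixel count, so treat these ratios as a first-order approximation.
resolutions = {
    "1080p":  (1920, 1080),
    "1440p":  (2560, 1440),
    "4K UHD": (3840, 2160),
}

pixels = {name: w * h for name, (w, h) in resolutions.items()}
for name, count in pixels.items():
    print(f"{name}: {count:,} pixels ({count / pixels['1080p']:.2f}x 1080p)")

# 4K UHD works out to 4x the pixels of 1080p and 2.25x the pixels of 1440p.
```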
If game developers start taking better advantage of 4K, meaning the graphics are -enhanced- and not just bumped in resolution, then my tune will change. I simply can’t justify requiring so much extra GPU horsepower when I consider 1440p to be suitable at half the GPU cost. The biggest beneficiaries of 4K are companies like AMD and NVIDIA.
“A benchmark won’t tell you when the textures don’t load because you have as much vram on a gpu as 3 years ago.”
I understand that, but that kind of issue is hard to quantify, especially when the game can’t report on VRAM usage correctly. You mentioned also that Crysis 3 can break past 4GB, but I wasn’t able to make that happen. I played the game for 20 minutes at 20 FPS just to see if I could saturate the entire VRAM, and couldn’t (I hit 3.3GB, though).
I think it’s obvious we’re not going to see eye-to-eye on this. I appreciate your perspective on the issue, and regret I just can’t agree on everything.
Jay Jardin
AA is Anti-Aliasing. If you turn it off you can play anything with a 3GB vram card. I think 3 GTX 980’s are not enough to max out this game, I won’t find out until they become available for purchase. I hope it takes months, in that way I will be able to buy the 6-8gb card.
Jay Jardin
“The game never stutters”, is it on the benchmark or during gameplay? I am using 3 290x cards and I had to lower my resolution to 1080p to make it playable. Ok so the size reported is wrong, kind of makes sense. It is showing 10gb+ at 1920×1080, hd textures everything turned all the way up and 11GB+ @ 4k. I still think nvidia is just being cheap on vram. The GTX 880M was released with 8GB of vram (laptop video card).
It was with the benchmark. I haven’t had the time or interest to get into the game itself, though I will if it ends up being a game worth adding to our next GPU test suite.
And yes, if it shows 10GB+ for 1080p, something is definitely up with how the app reports the VRAM. I am sure this isn’t totally uncommon; I normally don’t pay attention to VRAM at all.
8GB of VRAM on a mobile chip is about as pointless as it gets.
I need to ping NVIDIA about this, to understand the whys. I’m wondering if the game is designed to saturate all of the VRAM deliberately, or if the game genuinely requires that much. It’s a sharp game, but from what I could tell, it didn’t even have anti-aliasing enabled (there’s no option for it).