AMD Radeon R9 280X Graphics Card Review

Date: October 8, 2013
Author(s): Rob Williams

Most next-gen GPU launches are a simple affair: Launch one model, then another, and then another. AMD’s latest series is a bit different. In advance of its forthcoming flagship R9 290X, the company decided to push all of its mainstream parts off of the truck at once. So, let’s get started, first with a look at the $299 R9 280X.


Can you smell that? No, not me! I’m talking about that fresh stack of AMD GPUs sitting on the desk over there *points to the right*. Sheesh.

That’s right; it’s GPU launch time again, and AMD sure had no intention of easing into things. For the first time in quite a while (if not ever; my memory is horrible), AMD isn’t releasing just one or two GPUs today, but four: Radeon R9 280X, R9 270X, R7 260X and R7 250. Today, we’re going to be taking a look at the R9 280X, with looks at the others to come soon.

You might recall that AMD held a “Tech Day” in Hawaii a couple of weeks ago. Much was discussed there, so we’re not going to rehash what I’m sure you already know. Instead, we’ll wait for some of the technologies to become viable in the real world so that we can test and relay our experiences.

AMD Radeon R9 280X
AMD’s Radeon R9 280X

The reason AMD held its event in Hawaii is that “Hawaii” is the codename for its next high-end part, which is bringing quite a bit to the table (not all of which we can talk about at this point). The R9 290X, codenamed Hawaii XT, will be AMD’s highest-end card for this new generation; the R9 290, codenamed Hawaii Pro, will sit just below it.

AMD’s current and upcoming lineup can be seen in the table below:

AMD Radeon   Cores   Core MHz   Memory   Mem MHz   Mem Bus   TDP    Price
R9 290X      ???     ???        ???      ???       ???       ???    ???
R9 290       ???     ???        ???      ???       ???       ???    ???
R9 280X      2048    <1000      3072MB   6000      384-bit   250W   $299
R9 270X      1280    <1050      2048MB   5600      256-bit   180W   $199
R7 260X      896     <1100      2048MB   6500      128-bit   115W   $139
R7 250       384     <1050      1024MB   4600      128-bit   65W    $???

AMD is throttling the amount of information that can be revealed about the 290 and 290X, but what I can say is that they’re going to be pretty interesting in a couple of different ways. Whether what those cards bring to the table will make NVIDIA quake in its boots, I’m not quite sure. We’ll have to wait and see. One thing’s for sure: AMD is damned confident about its product.

While AMD sent us reference versions of its R7 260X and R9 270X to take a look at, the R9 280X we received came courtesy of PowerColor. Except for the PCB color and the rear ports, the card I received is identical to the one seen below. This particular card isn’t even listed on PowerColor’s site, but it runs at reference clocks. Instead of 2x DVI + HDMI + DP, my sample has 1x DVI + HDMI + 2x mini-DP.

PowerColor Radeon R9 280X

It’s not often that we hear (no pun) a GPU vendor talk about audio, but that was one of AMD’s major focuses at its event in Hawaii. With its DSP “TrueAudio”, the company (partnered with others) hopes to redefine our expectations of gaming audio. A common goal with audio is to allow us to pinpoint where a sound is coming from; that’s one of AMD’s goals here, but what makes it all the more interesting is that we’re meant to get that benefit even from stereo output. Color me intrigued.

AMD TrueAudio Advantages

I am not an audio guy – far from it – so there’s not a whole lot I can talk about at this point. There’s no way for anyone outside of AMD and its partner companies to test TrueAudio at this point, so right now, it’s just a waiting game until support gets here. Here’s the takeaway, though: Game developers have long had programmable shaders to work with; picture the same sort of flexibility with audio.

While not an issue on everyone’s mind, AMD is also taking advantage of this launch to bolster its 4K capabilities. The company is pushing for a new VESA standard that allows two outputs to drive two separate streams to a single 4K display. This would allow us to get over the hurdle of being stuck at 30Hz @ 4K, a typical problem at the moment (and obviously an issue that’s evident for gaming). Down the road, more advanced connectors should make this kind of work-around unnecessary.
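As a rough back-of-the-envelope check (my arithmetic, not AMD’s numbers), the raw pixel bandwidth shows why 4K panels of the day topped out at 30Hz over a single stream, and why splitting the panel across two streams gets to 60Hz:

```python
# Back-of-the-envelope: raw (uncompressed, 24-bit) pixel bandwidth for 4K.
# Blanking intervals are ignored, so real link bandwidth is higher still.

def raw_gbps(width, height, refresh_hz, bits_per_pixel=24):
    """Raw pixel data rate in gigabits per second."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

uhd_60 = raw_gbps(3840, 2160, 60)    # ~11.9 Gbit/s - too much for one
                                     # first-generation HDMI/DP stream
uhd_30 = raw_gbps(3840, 2160, 30)    # ~6.0 Gbit/s - hence the 30Hz ceiling
half_60 = raw_gbps(1920, 2160, 60)   # ~6.0 Gbit/s per half-panel; two such
                                     # streams together reach 4K at 60Hz

print(f"4K@60: {uhd_60:.1f} Gbit/s, 4K@30: {uhd_30:.1f} Gbit/s, "
      f"half-panel@60: {half_60:.1f} Gbit/s")
```

In other words, each half-panel stream at 60Hz needs about the same bandwidth as the whole panel at 30Hz, which is exactly the work-around the two-output approach exploits.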

Also on the display front, AMD’s latest GPUs continue to support 3 monitors off the same card, or 6 when a DisplayPort extender is used.

While there is much more we could talk about, such as AMD’s game push, this has been a rather intensive article to get out, and there are still other GPUs to be benchmarked. So, that said, let’s get right into a look at our (revised) GPU testing methodology, and then tackle our first results.

Test System & Methodology

At Techgage, we strive to make sure our results are as accurate as possible. Our testing is rigorous and time-consuming, but we feel the effort is worth it. In an attempt to leave no question unanswered, this page contains not only our testbed specifications, but also a detailed look at how we conduct our testing.

Test Machine

The table below lists our testing machine’s hardware, which remains unchanged throughout all GPU testing, apart from the graphics card itself. Each card used for comparison is also listed here, along with the driver version used.

  Graphics Card Test System
Processor Intel Core i7-4960X – Six-Core @ 4.50GHz
Motherboard ASUS P9X79-E WS
Memory Kingston HyperX Beast 32GB (4x8GB) – DDR3-2133 11-12-11
Graphics AMD Radeon R9 280X 3GB (Catalyst 13.11)
NVIDIA GeForce GTX 780 3GB (GeForce 331.40)
Audio Onboard
Storage Kingston HyperX 240GB SSD
Power Supply Cooler Master Silent Pro Hybrid 1300W
Chassis Cooler Master Storm Trooper Full-Tower
Cooling Thermaltake WATER3.0 Extreme Liquid Cooler
Displays ASUS PB278Q 27″ 2560×1440
Dell P2210H 22″ 1920×1080 x 3
Et cetera Windows 7 Professional 64-bit

When preparing our testbeds for any type of performance testing, we follow these guidelines:

General Guidelines

To keep results accurate and repeatable, we prevent certain Windows 7 services from starting at boot. These services have a tendency to start up in the background without notice, potentially skewing test results. For example, disabling “Windows Search” turns off the OS’ indexing, which can at times utilize the hard drive and memory more than we’d like.

The services we disable are:

For further fine-tuning, we also use Windows’ “Classic” desktop theme, which gets rid of the transparency that can sometimes utilize a GPU in the background.

Vendor Favoritism

Sometimes, either AMD or NVIDIA will work with a game studio to help their development process along. As history has proven, this often results in a game that is tuned better for one vendor over the other, although sometimes the tides can change over time, resulting in the opposite vendor offering the better experience.

One of our goals is to provide as neutral a benchmarking suite as possible, so while it’s impossible to avoid games sponsored by either of these companies, we can at least make an effort to achieve a blended list. As it stands, our current game list and their partners are:

(AMD) – BioShock Infinite
(AMD) – Sleeping Dogs
(AMD) – Crysis 3
(NVIDIA) – Battlefield 3
(NVIDIA) – Metro: Last Light
(NVIDIA) – Splinter Cell Blacklist
(Neutral) – Total War: SHOGUN 2
(Neutral) – GRID 2

With that, let’s move on to a quick look at the game settings we use in our testing:

Battlefield 3

Battlefield 3 Benchmark Settings

BioShock Infinite

BioShock Infinite Benchmark Settings

BioShock Infinite Benchmark Settings

BioShock Infinite Benchmark Settings

Crysis 3

Crysis 3 Benchmark Settings

Crysis 3 Benchmark Settings

GRID 2


GRID 2 Benchmark Settings

GRID 2 Benchmark Settings

GRID 2 Benchmark Settings

Metro Last Light

Metro Last Light Benchmark Settings

Sleeping Dogs

Sleeping Dogs Benchmark Settings

Sleeping Dogs Benchmark Settings

Splinter Cell Blacklist

Splinter Cell Blacklist Benchmark Settings

Splinter Cell Blacklist Benchmark Settings

Total War: SHOGUN 2

Total War SHOGUN 2 Benchmark Settings

Unigine Heaven

Unigine Heaven 4 Benchmark Settings

Game Tests: Battlefield 3, BioShock Infinite

Battlefield 4‘s launch is right around the corner, but despite there being a beta available, I’m sticking to Battlefield 3 for the sake of reliable benchmarking until the dust settles. Even though BF3 was released in 2011, it remains gorgeous at maxed-out detail settings, and a good challenge for today’s GPUs – though the real challenge begins above 1080p for mainstream and higher parts.

Battlefield 3 - 1920x1080 Single Monitor
1920×1080 (1 Monitor)

Battlefield 3 - 5760x1080 Triple Monitor
5760×1080 (3×1 Monitors)

Manual Run-through: Operation Guillotine (chapter 5) is the basis for my benchmarking here, as it features a lot of smoke, water, debris and is reliable for repeated benchmarking. The level starts us out at the top of a hill, and after descending it, we go over a fence and through a riverbed. I stop the play-through after reaching the first set of stairs.

AMD Radeon R9 280X - Battlefield 3 (1920x1080)

AMD Radeon R9 280X - Battlefield 3 (2560x1440)

AMD Radeon R9 280X - Battlefield 3 (4800x900)

AMD Radeon R9 280X - Battlefield 3 (5760x1080)

AMD is off to a great start here. Given that the R9 280X costs $50 more than NVIDIA’s GTX 760, it seemed certain that it’d be the better performer overall, and based on Battlefield 3 alone, that 20% increase in cost can net you a 30~40% boost in performance.
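To put that value claim in concrete terms (a quick sanity check using the ~$250 GTX 760 and $299 R9 280X street prices cited in this review):

```python
# Price/performance sanity check: ~$250 GTX 760 vs. $299 R9 280X,
# with the R9 280X running 30~40% faster in Battlefield 3.

def pct_increase(base, new):
    """Percentage increase from base to new."""
    return (new - base) / base * 100

price_premium = pct_increase(250, 299)   # ~19.6% more expensive

for perf_gain in (30, 40):
    # Performance-per-dollar relative to the GTX 760 (1.0 = identical value)
    value = (1 + perf_gain / 100) / (1 + price_premium / 100)
    print(f"+{perf_gain}% performance for +{price_premium:.0f}% price "
          f"-> {value:.2f}x the perf-per-dollar")
```

On those figures, the R9 280X works out to roughly 1.09~1.17x the performance per dollar of the GTX 760 in this game.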

At 1080p, both cards here perform quite well – NVIDIA sits behind, but still delivers 60+ FPS on average. And this, it must be mentioned, is on Ultra. Using our multi-monitor resolutions, even 4800×900 becomes a bit much for either of the cards. Your options here would be to drop from Ultra to High, or to turn off AO and AA in particular, to achieve respectable performance.

BioShock Infinite

Sometimes, the hype that follows a game to its launch can be a little ridiculous, and all too often, it doesn’t live up. BioShock Infinite (our review), though, is one of those rare instances where reality exceeded expectations. Infinite‘s world is immersive and chock-full of eye-candy, and its gameplay mechanics and AI help craft something truly special. This is a must-play game, it’s that simple.

BioShock Infinite - 1920x1080 Single Monitor
1920×1080 (1 Monitor)

BioShock Infinite - 5760x1080 Triple Monitor
5760×1080 (3×1 Monitors)

Built-in benchmark: Finding a “perfect” level for manual benchmarking proved difficult with this game, as numerous variables arose in each one I tested that made them less-than-ideal. Fortunately, the game’s built-in benchmark is effective, so while I hate to forego the manual route, I feel confident in the results generated with this tool.

AMD Radeon R9 280X - BioShock Infinite (1920x1080)

AMD Radeon R9 280X - BioShock Infinite (2560x1440)

AMD Radeon R9 280X - BioShock Infinite (4800x900)

AMD Radeon R9 280X - BioShock Infinite (5760x1080)

It’s almost as though both BF3 and BioShock Infinite share the same graphics engine, because both performed similarly across the board. Both cards once again are suitable at 1080p, while the R9 280X handles 1440p quite well. Moving into multi-monitor territory once again, this game proves to be a little too much for either card.

Game Tests: Crysis 3, GRID 2

When the original Crysis dropped in late 2007, it took no time at all for pundits to coin the phrase, “Can it run Crysis?“, almost to the point of self-parody. At the time, the game couldn’t have its graphics detail maxed-out on even top-of-the-line PCs, and in reality, that’s a great thing. I’d imagine few are opposed to knowing that a game could actually look better down the road as our PCs grow into them. As the series continued, Crytek knew it had a legend to live up to, and fortunately, Crysis 3 (our review) lives up to the original’s legacy.

Crysis 3 - 1920x1080 Single Monitor
1920×1080 (1 Monitor)

Crysis 3 - 5760x1080 Triple Monitor
5760×1080 (3×1 Monitors)

Manual Run-through: There’s no particular level in Crysis 3 that I could establish was “better” for benchmarking than another, but I settled on “Red Star Rising” based on the fact that I could perform a run-through with no chance of dying (a great thing in a challenging game like this one). The level starts us in a derelict building, where I traverse a broken pipe to make it over to one rooftop and then another. I eventually hit the ground after taking advantage of a zipline, and make my way down to a river, where I scurry past a number of enemies to the end spot beneath a building.

AMD Radeon R9 280X - Crysis 3 (1920x1080)

AMD Radeon R9 280X - Crysis 3 (2560x1440)

AMD Radeon R9 280X - Crysis 3 (4800x900)

AMD Radeon R9 280X - Crysis 3 (5760x1080)

In the first two games we tested, AMD’s card kept quite a bit ahead (30~40%) of NVIDIA’s, but that trend stops with Crysis 3. AMD’s card still comes out ahead, but only by 10~20% – scaling roughly in line with the 20% price increase.

That all said, at 1080p, either of these cards handles the game well. For higher than 1080p, a beefier GPU is going to be required to enjoy the game at these detail settings.

GRID 2


For those who appreciate racing games that are neither too realistic nor too arcade-like, there’s GRID. In GRID 2 (review), the ultimate goal is to build a racing empire, starting from square one. Unlike most racing titles that have some sort of career, the goal here isn’t to earn cash, but fans. Whether you’re racing around Abu Dhabi’s Yas Marina or tearing through a gorgeous Cote d’Azur coastline, your goal is simple: To impress.

GRID 2 - 1920x1080 Single Monitor
1920×1080 (1 Monitor)

GRID 2 - 5760x1080 Triple Monitor
5760×1080 (3×1 Monitors)

Manual Run-through: The track chosen for my benchmarking is Miami (Ocean Drive). It’s a simple track overall, which is one of the reasons I chose it, and also the reason I choose to do just a single lap (I crash, often, and that affects both the results and my patience). Unlike most games in the suite which I test twice over (save for an oddity in the results), I race this one lap three times over.

AMD Radeon R9 280X - GRID 2 (1920x1080)

AMD Radeon R9 280X - GRID 2 (2560x1440)

AMD Radeon R9 280X - GRID 2 (4800x900)

AMD Radeon R9 280X - GRID 2 (5760x1080)

At 1080p, AMD’s card came out quite a bit ahead; the results were much tighter at the other resolutions, however. At 1440p, the performance is suitable enough, but not ideal. Higher than that, the AMD card handled 4800×900 decently enough, but again it wasn’t ideal. For 5760×1080, lower settings or a beefier GPU will be required.

Game Tests: Metro Last Light, Sleeping Dogs

Crysis has become infamous for punishing even top-end systems, but let’s be fair: The Metro series matches, if not exceeds, Crysis’ appetite for graphical horsepower. That was proven by the fact that we used Metro 2033 in our testing for a staggering three years – only to be replaced by its sequel, Last Light. I’m not particularly a fan of this series, but I am in awe of its graphics, even at modest settings.

Metro Last Light - 1920x1080 Single Monitor
1920×1080 (1 Monitor)

Metro Last Light - 5760x1080 Triple Monitor
5760×1080 (3×1 Monitors)

Built-in benchmark: Because this game is a real challenge to benchmark manually – both because of variability in the results and its sheer difficulty – I choose to use the built-in benchmark here. Unfortunately, this benchmark doesn’t match some of the more intensive parts of the game, especially levels such as “The Chase”, so please bear that in mind.

AMD Radeon R9 280X - Metro Last Light (1920x1080)

AMD Radeon R9 280X - Metro Last Light (2560x1440)

AMD Radeon R9 280X - Metro Last Light (4800x900)

AMD Radeon R9 280X - Metro Last Light (5760x1080)

Continuing the now-established trend, both of our GPUs here handle the game at 1080p fine (at least in this benchmark; some parts of the game will see much lower FPS than what’s reported here), while both also pass rather well through 1440p. I am sure it will come as no surprise, but for some multi-monitor action, you’ll need even beefier cards to ensure optimal performance. Metro truly is a ruthless punisher when it comes to GPUs.

Sleeping Dogs

Many have called Sleeping Dogs (our review) the “Asian Grand Theft Auto“, but the game does a lot of things differently that help it stand out from the crowd. For example, in lieu of supplying the player with a gazillion guns, Sleeping Dogs focuses heavily on hand-to-hand combat. There are also many collectibles that can be found to help upgrade your character and unlock special fighting abilities – and if you happen to enjoy an Asian atmosphere, this game should fit the bill.

Sleeping Dogs - 1920x1080 Single Monitor
1920×1080 (1 Monitor)

Sleeping Dogs - 5760x1080 Triple Monitor
5760×1080 (3×1 Monitors)

Manual Run-through: The run here takes place during the chapter “Amanda”, on a dark, dank night. The saved game begins us at the first apartment in the game (in North Point), though that’s not where I begin capturing the framerate. Instead, I first request our motorcycle from the garage. Once set, I begin recording the framerate and drive along a specific path all the way to Aberdeen, taking about two minutes.

AMD Radeon R9 280X - Sleeping Dogs (1920x1080)

AMD Radeon R9 280X - Sleeping Dogs (2560x1440)

AMD Radeon R9 280X - Sleeping Dogs (4800x900)

AMD Radeon R9 280X - Sleeping Dogs (5760x1080)

Yet again, you can get excellent performance in this game from either AMD’s or NVIDIA’s card at 1080p with maxed-out detail settings (save for the rather pointless maximum AA setting). Even 2560×1440 is suitable, though I admit this is one game where the closer you are to 60 FPS, the better the experience is. That said, for multi-monitor, lower settings will be necessary for improved performance, though you might be able to get by with 4800×900 on the AMD card if you are not that fussy.

Game Tests: Splinter Cell: Blacklist, Total War: SHOGUN 2

Tom Clancy is responsible for countless video games, but his Splinter Cell series has become something special, with each release having been considered “great” overall. The latest in the series, Blacklist, is no exception; thankfully for us, its graphics are fantastic, not to mention intensive. For those who love a stealth element in their games, this is one that shouldn’t be skipped.

RIP, Tom Clancy.

Splinter Cell Blacklist - 1920x1080 Single Monitor
1920×1080 (1 Monitor)

Splinter Cell Blacklist - 5760x1080 Triple Monitor
5760×1080 (3×1 Monitors)

Manual Run-through: From the start of the ‘Safehouse’ level in Benghazi, Libya, we progress through until we reach an apartment building that must be entered – this is where we end the FPS recording.

AMD Radeon R9 280X - Splinter Cell: Blacklist (1920x1080)

AMD Radeon R9 280X - Splinter Cell: Blacklist (2560x1440)

AMD Radeon R9 280X - Splinter Cell: Blacklist (4800x900)

AMD Radeon R9 280X - Splinter Cell: Blacklist (5760x1080)

At worst, AMD’s card keeps 15% ahead of NVIDIA’s; at best, 27%. Both cards are ideal at 1080p; at anything higher, AMD’s card gets the obvious nod.

Total War: SHOGUN 2

Strategy games are well-known for pushing the limits of any system, and few others do this as well as Total War: SHOGUN 2. It fully supports DX11, has huge battlefields to oversee with hundreds or thousands of units, and a ton of graphics options to adjust. It’s quite simply a beast of a game.

Total War: SHOGUN 2 - 1920x1080 Single Monitor
1920×1080 (1 Monitor)

Total War: SHOGUN 2 - 5760x1080 Triple Monitor
5760×1080 (3×1 Monitors)

Built-in benchmark: SHOGUN 2 is one of the few games in our suite where the built-in benchmark is opted for. Strategy games in particular are very difficult to benchmark, so this is where I become thankful to have the option of using a built-in benchmark.

AMD Radeon R9 280X - Total War: SHOGUN 2 (1920x1080)

AMD Radeon R9 280X - Total War: SHOGUN 2 (2560x1440)

AMD Radeon R9 280X - Total War: SHOGUN 2 (4800x900)

AMD Radeon R9 280X - Total War: SHOGUN 2 (5760x1080)

Wrapping up our real-world testing, AMD’s card comfortably comes out ahead once again. Up to this point, NVIDIA’s card hasn’t managed to surpass AMD’s in any test – not entirely surprising given the $50 price premium on the AMD card, but even so, for NVIDIA to not come within 10% of AMD’s card speaks volumes about AMD’s attractive price-point here.

Synthetic Tests: Futuremark 3DMark, 3DMark 11, Unigine Heaven 4.0

We don’t make it a point to seek out automated gaming benchmarks, but we do like to get a couple in that anyone reading this can run themselves. Of these, Futuremark’s name leads the pack, as its benchmarks have become synonymous with the activity. Plus, it does help that the company’s benchmarks stress PCs to their limit – and beyond.


While Futuremark’s latest GPU test suite is 3DMark, I’m also including results from 3DMark 11 as it’s still a common choice among benchmarkers.

AMD Radeon R9 280X - Futuremark 3DMark

AMD Radeon R9 280X - Futuremark 3DMark 11 - Performance

AMD Radeon R9 280X - Futuremark 3DMark 11 - Extreme

For the most part, we saw AMD’s card outperform NVIDIA’s by 10~30% in our real-world and timedemo testing; 3DMark puts the advantage at 25%, while 3DMark 11 puts it at 18%.

Unigine Heaven 4.0

Unigine might not have as established a name as Futuremark, but its products are nothing short of “awesome”. The company’s main focus is its game engine, but a by-product of that is its benchmarks, which give benchmarkers another great tool to take advantage of while showing off what the engine is capable of. It’s a win-win all-around.

Unigine Heaven 4.0

The biggest reason that the company’s “Heaven” benchmark is so relied-upon by benchmarkers is that both AMD and NVIDIA promote it for its heavy use of tessellation. Like 3DMark, the benchmark here is overkill by design, so results are not going to directly correlate with real gameplay. Rather, they showcase which card models can better handle both DX11 and its GPU-bogging features.

AMD Radeon R9 280X - Unigine Heaven 4.0 (1920x1080)

AMD Radeon R9 280X - Unigine Heaven 4.0 (2560x1440)

AMD’s card gains a 24% advantage at 1080p, and a 27% one at 1440p – both fantastic results, given that the performance advantage exceeds the 20% price premium.

Temperatures, Power & Final Thoughts

To test graphics cards for both their power consumption and temperature at load, we utilize a couple of different tools. On the hardware side, we use a trusty Kill-a-Watt power monitor which our GPU test machine plugs into directly. For software, we use Futuremark’s 3DMark to stress-test the card, and AIDA64 to monitor and record the temperatures.

To test, the general area around the chassis is checked with a temperature gun, and the average temperature recorded. Once that’s established, the PC is turned on and left to sit idle for ten minutes. At this point, AIDA64 is opened along with 3DMark. We then kick off a full suite run, and watch the Kill-a-Watt when the test reaches its most intensive interval (GT 1) to get the load wattage.
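For the curious, the post-processing on the temperature side is simple. The sketch below works through a hypothetical sensor log (AIDA64 can export logged readings, but its actual layout and column names differ; the `time`/`gpu_temp` columns here are illustrative) to pull out the idle baseline and load peak:

```python
import csv
from io import StringIO

# Hypothetical sensor log: time in seconds and GPU diode temperature in C.
# AIDA64's real export format differs; adjust the column names to match.
SAMPLE_LOG = """time,gpu_temp
0,34
300,35
600,61
900,78
1200,80
"""

def summarize(log_text, idle_cutoff_s=600):
    """Return (idle average, load peak) from a time/temperature log.
    Samples before idle_cutoff_s are treated as the idle period."""
    idle, load = [], []
    for row in csv.DictReader(StringIO(log_text)):
        t, temp = float(row["time"]), float(row["gpu_temp"])
        (idle if t < idle_cutoff_s else load).append(temp)
    return sum(idle) / len(idle), max(load)

idle_avg, load_peak = summarize(SAMPLE_LOG)
print(f"Idle: {idle_avg:.1f}C, Load peak: {load_peak:.0f}C")
```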

AMD Radeon R9 280X - Temperatures

AMD Radeon R9 280X - Power Consumption

On the temperatures front, both GPUs might as well be considered equals. It’s worth noting, though, that NVIDIA’s card utilized a reference cooler, which typically doesn’t perform as well as the coolers found on vendor cards. AMD’s card, by contrast, utilizes a much more efficient-looking cooler (seen below). Given this, it seems we might be able to say that AMD’s GPU runs hotter than NVIDIA’s, but it’s hard to settle on that until we get other cards in to test.

PowerColor Radeon R9 280X

Where power draw is concerned, both cards idle about the same, but AMD’s is a little more power-hungry at full-tilt – hardly a surprise given its 10~30% gain in performance.

Final Thoughts

It’s a little unfortunate that I was unable to include the R9 270X and R7 260X in this article, because it’d make it easier to get a grasp on the entire lineup. Alas, the fact that I had just revised our GPU test suite and decided to test 4 different resolutions sure didn’t help with the time factor. Nor did the fact that I received the cards on Friday. Such is life sometimes, living in Canada, behind seemingly the world’s worst delivery services.

That said, I am glad we started off with the R9 280X, because it’s quite a good product. At the moment, NVIDIA’s GeForce GTX 760 retails for $250; AMD’s Radeon R9 280X retails for $300. In every single one of our tests, AMD’s card surpassed NVIDIA’s by at least 10% – the average was 20%, matching the 20% price premium. I’m confident in saying that, given the performance of both cards, the 280X is priced right.

AMD Radeon R9 280X

The unfortunate thing about the R9 280X – alright, the entire lineup underneath the 290 – is that it’s based on last year’s silicon. The R9 280X is in effect an HD 7970 GHz Edition. That makes things a little boring here, outside of pricing. While it’s not too uncommon to see GPU vendors rehash silicon across generations, it is uncommon to see that affect almost the entire lineup.

That said, it’s going to be pricing that makes these cards most interesting. As mentioned above, the pricing of the R9 280X makes it quite attractive compared to the GTX 760, and the only option NVIDIA has to beat out the 280X is the +$100 GTX 770. Simply put, AMD knew it had to keep price-competitive, and at least with the R9 280X, it certainly has. Even on the power consumption front, AMD does quite well, so I have nary a real complaint.

Should you jump on the R9 280X right this moment? It’s not a bad idea, but a better one at this point in time might be to keep an eye out for deals on the HD 7000 series which are inevitably going to plummet in price to be cleared out. It’s also worth noting that AMD’s latest GPUs do not qualify for its Never Settle promotion (a major downer, if you ask me), so there are some definite trade-offs here.

It’ll be interesting to see how NVIDIA counteracts AMD here, because at least on the 280X front, the GTX 760 no longer looks like a homerun card. NVIDIA does have price-scaling on its side, however.

Copyright © 2005-2021 Techgage Networks Inc. - All Rights Reserved.