
AMD Radeon R9 290X & NVIDIA GeForce GTX 780 Ti Review

Date: March 3, 2014
Author(s): Rob Williams

Nearly every battle in the GPU Wars is hotly contested. From performance to appraisals of value, AMD and NVIDIA engage in apparent mortal combat with each generation of GPU. This current generation, though, sees a clear-cut winner in most categories. So did Team Red win, or did Team Green? Read on to find out!



Introduction

At the moment, both AMD’s and NVIDIA’s top enthusiast offerings hover around the $700 price-point, and as the title of this article suggests, I’m going to be taking a look at both of them in some depth here.

But before I go further, there are a couple of things that need tackling. First, the GTX 780 Ti is not NVIDIA’s highest-end offering; that distinction belongs to the TITAN Black. However, that card isn’t targeted at the same sort of consumer as the 780 Ti; those who opt for TITAN Black will be using multiple monitors (or a 4K monitor) and perhaps be able to take advantage of the vastly improved double-precision performance the card offers. So, ignoring TITAN Black entirely, both AMD’s and NVIDIA’s top-end cards cost about $700; and thus begins the theme for this article.

There’s a reason I didn’t use the word “versus” in the title. The suggested retail price for AMD’s R9 290X is $549, but that’s been inflated to ~$700 in response to coin-mining enthusiasts snatching up AMD’s GCN-based cards as if one lucky card contained a map to the fountain of youth. Etailers don’t want to run out of cards (or, perhaps more accurately, they want to price gouge given the insatiable desire for the cards), and so actual gamers that happen to be AMD fans are the ones feeling the pain.

The fact of the matter is, though, both cards I’ll be taking a look at here can be had right now for about $700. So while I’m being careful to not call this a “versus” article, it can be taken as such. Just bear in mind that if SRPs had been kept intact, this would be a $549 vs. $700 head-to-head.

AMD Radeon R9 290X and NVIDIA GeForce GTX 780 Ti - Glamour Pose

From the red team, we have the Radeon R9 290X. On paper, this card’s biggest advantage over its main competitor is a 4GB framebuffer – something that those running big resolutions are going to like the look of. Further, being part of the Volcanic Islands series, the R9 290X supports TrueAudio and the Mantle API. Admittedly, both of these technologies are in their infancy, but both have good potential to make a splash when game developers jump on them down the road.

Carried-over features include support for Eyefinity (multi-monitor), CrossFire (multi-GPU), HD3D (3D gaming), PowerTune (advanced power control), and ZeroCore (ultra-low power consumption when idle). Unique to the R9 290/X is the ability to use CrossFire without connecting the cards with a bridge.

Another feature that’s easy to miss unless you’re told about it is the “Uber” and “Quiet” mode switch. In Uber mode, the card’s fan is allowed to ramp up to very noticeable noise levels when stressed, whereas Quiet mode relaxes the fan curve to keep things quieter, with the caveat of the card running hotter. I’ll talk more about the temperature situation on this card later.

It’s not too often that a GPU vendor will release a new model that completely disrupts the position of another one in the same lineup, but AMD did just that when it released the R9 290. At $399 (SRP), this card costs $150 less than the 290X. The difference between them? The R9 290X has 10% more cores and a 53MHz higher core clock. In this respect, the R9 290 is a no-brainer… that’s a lot of savings for so little loss. Due to the same inflation the R9 290X has suffered, the R9 290 can be had for about $150 above SRP, with some models available for about $550.
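The 290-vs-290X value argument can be put into rough numbers. A back-of-the-envelope sketch, assuming the reference specs from the table below and the common 2-FLOPs-per-core-per-clock convention for peak throughput (real gaming gaps are typically smaller than the theoretical one):

```python
# Hypothetical peak-throughput comparison using the specs quoted in this
# article; the 2 FLOPs/core/clock figure is the usual FMA-based convention.
cards = {
    "R9 290X": {"cores": 2816, "mhz": 1000, "srp": 549},
    "R9 290":  {"cores": 2560, "mhz": 947,  "srp": 399},
}

def gflops(card):
    # cores * MHz * 2 FLOPs per clock, converted from MFLOPS to GFLOPS
    return 2 * card["cores"] * card["mhz"] / 1000.0

x, r = cards["R9 290X"], cards["R9 290"]
print(round(gflops(x)))  # 5632
print(round(gflops(r)))  # 4849
print(f"theoretical gap: {gflops(x) / gflops(r) - 1:.1%}")  # 16.2%
print(f"SRP gap:         {x['srp'] / r['srp'] - 1:.1%}")    # 37.6%
```

A ~16% theoretical edge for a ~38% SRP premium is what makes the R9 290 look like the no-brainer described above.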

AMD Radeon Series | Cores | Core MHz | Memory | Mem MHz | Mem Bus | TDP
Radeon R9 290X | 2816 | 1000 | 4096MB | 5000 | 512-bit | 250W
Radeon R9 290 | 2560 | 947 | 4096MB | 5000 | 512-bit | 250W
Radeon R9 280X | 2048 | <1000 | 3072MB | 6000 | 384-bit | 250W
Radeon R9 270X | 1280 | <1050 | 2048MB | 5600 | 256-bit | 180W
Radeon R9 270 | 1280 | <925 | 2048MB | 5600 | 256-bit | 150W
Radeon R9 265 | 1024 | <925 | 2048MB | 5600 | 256-bit | 150W
Radeon R7 260X | 896 | <1100 | 2048MB | 6500 | 128-bit | 115W
Radeon R7 260 | 768 | <1000 | 1024MB | 6000 | 128-bit | 95W
Radeon R7 250X | 640 | <1000 | 1024MB | 4500 | 128-bit | 95W
Radeon R7 250 | 384 | <1050 | 1024MB | 4600 | 128-bit | 65W

Due to reference PCIe power limits, both AMD and NVIDIA avoid releasing a card with a stated TDP higher than 250W, and so, that’s where both the R9 290X and GTX 780 Ti sit. However, it doesn’t take much imagination to realize that true TDPs would be higher if these vendors were a bit more honest; it’s nonsensical that three cards with wildly varying specs share the exact same TDP, yet don’t share the same power draw numbers when stressed.
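For context, the in-spec power budget can be tallied from the PCI Express specification’s per-source limits, assuming the 6-pin + 8-pin auxiliary connector layout both of these reference cards use:

```python
# Per-source power limits from the PCIe specification (assumed reference
# 6-pin + 8-pin connector configuration for both the 290X and 780 Ti).
SLOT_W = 75        # x16 slot
SIX_PIN_W = 75     # 6-pin auxiliary connector
EIGHT_PIN_W = 150  # 8-pin auxiliary connector

ceiling = SLOT_W + SIX_PIN_W + EIGHT_PIN_W
print(f"in-spec ceiling: {ceiling}W")        # 300W
print(f"headroom above 250W TDP: {ceiling - 250}W")  # 50W
```

In other words, a 250W label leaves roughly 50W of in-spec wiggle room, which is consistent with the observation above that real-world draw under stress doesn’t track the stated TDPs.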

One thing that might be worth mentioning is that for both of AMD’s and NVIDIA’s current lineups, only the top-tier cards in the lineup feature a new or improved architecture, although all include updated features (which are in effect added through the drivers). On AMD’s side, the R9 290 and R9 290X are the only cards based on a brand-new architecture, Hawaii. For NVIDIA, everything below the GTX 770 is based on the previous generation of GK10X GPUs.

Speaking of NVIDIA, let’s see where the 780 Ti settles itself into the green team’s lineup:

NVIDIA GeForce Series | Cores | Core MHz | Memory | Mem MHz | Mem Bus | TDP
GeForce GTX TITAN Black | 2880 | 889 | 6144MB | 7000 | 384-bit | 250W
GeForce GTX TITAN | 2688 | 837 | 6144MB | 6008 | 384-bit | 250W
GeForce GTX 780 Ti | 2880 | 875 | 3072MB | 7000 | 384-bit | 250W
GeForce GTX 780 | 2304 | 863 | 3072MB | 6008 | 384-bit | 250W
GeForce GTX 770 | 1536 | 1046 | 2048MB | 7010 | 256-bit | 230W
GeForce GTX 760 | 1152 | 980 | 2048MB | 6008 | 256-bit | 170W
GeForce GTX 750 Ti | 640 | 1020 | 2048MB | 5400 | 128-bit | 60W
GeForce GTX 750 | 512 | 1020 | 2048MB | 5000 | 128-bit | 55W
GeForce GTX 660 | 960 | 980 | 2048MB | 6000 | 192-bit | 140W
GeForce GTX 650 | 384 | 1058 | 1024MB | 5000 | 128-bit | 64W

I mentioned above that the release of AMD’s R9 290 disrupted the overall appeal of the R9 290X, but NVIDIA’s lineup suffered the same sort of thing when the 780 Ti came along; after all, how tempting is a $1,000 GPU that gets outperformed by a $700 one? For most enthusiasts, a TITAN would have been hard to justify, though the model itself remained justifiable for those who needed such a massive framebuffer and / or improved double-precision performance. NVIDIA corrected this issue with the TITAN Black, which about matches the 780 Ti in gaming performance while still offering the other perks that help set the card apart.

While the 700 series has introduced a bunch of tweaks under the hood, none of them result in clear features that a gamer might want to take advantage of. Fortunately for NVIDIA, it’s offered a rich featureset on its cards for some time; the carried-over features here include support for CUDA (compute enhancements; apps need to support it explicitly), PhysX (advanced physics in games that support the API), SLI (multi-GPU), 3D Vision (3D gaming), Surround (multi-monitor), Adaptive VSync (vertical sync with vastly reduced tearing), GPU Boost 2.0 (automatic boosts to the GPU clock if the card’s not hitting its peak temperature), and two technologies I’ve become a particular fan of: GameStream (the ability to stream a game from your PC to the SHIELD handheld) and ShadowPlay (game recording with virtually no performance hit).

Which Offers the Better Experience?

If I had to choose either the 780 Ti or 290X based on their featuresets alone, I’d have to tip my hat towards NVIDIA’s offering. Over the years, I’ve used both AMD’s and NVIDIA’s cards extensively, and I’ve come to prefer NVIDIA’s featureset, as well as its drivers. This is especially true on the multi-monitor front; I’ve found configuring multiple monitors on AMD’s cards to be an exercise in patience, whereas on NVIDIA, it’s an absolute breeze in comparison.

None of that is to say that NVIDIA’s multi-monitor implementation is perfect, as some oddities can still arise, but none of those oddities have matched the hassles I’ve experienced with AMD’s cards. In particular, while testing both the 290X and 780 Ti out in multi-monitor configurations, I experienced an issue with AMD where the driver refused to place my taskbar on the center display; instead, it kept it on the right one. This glitch required me to do a thorough uninstall of the driver.

There are other features NVIDIA offers that keep me tied to its cards, but some or all of them might not matter to you. In particular, I appreciate the ability on NVIDIA’s cards to configure games on a per-title basis via the driver (Note: I managed to overlook the fact that this can be done in AMD’s driver as well; thanks to Sean from the comments for pointing it out), something I couldn’t really live without at this point. Borderlands, for example, does not have a VSync option, which results in obvious tearing during gameplay. With NVIDIA’s Control Panel, I’m able to force VSync through the driver. Another example is my classic MMO Asheron’s Call: it offers no anti-aliasing options in-game, but with NVIDIA’s driver, I can force them.

AMD Radeon R9 290X and NVIDIA GeForce GTX 780 Ti

Then there’s ShadowPlay, the technology that allows you to record gameplay with a minimal hit to performance. As I’ve mentioned before, this is a technology that I’ve just about fallen in love with, because it works incredibly well and is very convenient. Because ShadowPlay can record continually, you’re able to save video to your hard drive even if you originally had no intention to (useful if something awesome happens and you’d like to relive the experience). This is a feature I’ve taken advantage of three times in the past week. While some ShadowPlay competitors also offer continual recording, their performance hit is generally very noticeable (Fraps, for example, locks the framerate).

I feel that ShadowPlay is such a good feature that AMD must have its own solution in the works; I think it kind of has to. With the unbelievable growth of online game streaming, gamers who partake in such activities are going to be attracted to the GPU that can take the most load off of their CPU while recording gameplay. And don’t forget: ShadowPlay doesn’t lock the game’s framerate like Fraps does, so during gameplay, you’re not likely to realize it’s recording.

I’ve harped quite a bit on NVIDIA’s pluses, so what about AMD’s? Admittedly, in terms of features that can be taken advantage of right now, there’s not a whole lot to talk about. Both AMD and NVIDIA offer multi-monitor and multi-GPU support, for example, and other things like 3D gaming, overclocking flexibility, and power-related technologies.

That being said, Mantle could become a literal game-changer down the road, though it’s not likely to benefit R9 290X owners much unless the PC the card’s installed in uses a very modest CPU, since Mantle’s biggest goal is easing the load on the CPU. TrueAudio is another piece of technology that some could come to prefer once games start supporting it.

Of course, we can’t ignore one of the reasons AMD’s cards are flying off the shelves lately: The GCN architecture proves far more efficient at cryptocurrency mining, and it also outperforms NVIDIA at other similar compute tasks, such as Folding@home. If either of those things is important to you, AMD can’t be beat (for now, at least).

Ultimately, chances are you could go with either AMD or NVIDIA and not notice missing features. Personally, I like NVIDIA’s featureset enough to keep one of its cards in my personal rig, but you might not see value in the same features I do. From a stability standpoint, while I’ve experienced more hassles with AMD’s cards and drivers (even recently), I wouldn’t expect that the regular user would run into a roadblock. When you install a GPU every other day, you’re bound to run into more issues than is normal.

With all of that said, let’s take a quick look at our test system configuration and methodology, and then get into the important stuff: Gaming performance.

Test System & Methodology

At Techgage, we strive to make sure our results are as accurate as possible. Our testing is rigorous and time-consuming, but we feel the effort is worth it. In an attempt to leave no question unanswered, this page contains not only our test-bed specifications, but also a detailed look at how we conduct our testing.

Test Machine

The table below lists our test machine’s hardware, which remains unchanged throughout all GPU testing, aside from the GPU itself. Each card used for comparison is also listed here, along with the driver version used.

Graphics Card Test System
Processor | Intel Core i7-4960X – Six-Core @ 4.50GHz
Motherboard | ASUS P9X79-E WS
Memory | Kingston HyperX Beast 32GB (4x8GB) – DDR3-2133 11-12-11
Graphics | AMD Radeon R9 290X 4GB – Catalyst 13.12 (GPU-Z)
| NVIDIA GeForce GTX 780 Ti 3GB – GeForce 331.93 (GPU-Z)
Audio | Onboard
Storage | Kingston HyperX 240GB SSD
Power Supply | Cooler Master Silent Pro Hybrid 1300W
Chassis | Cooler Master Storm Trooper Full-Tower
Cooling | Thermaltake WATER3.0 Extreme Liquid Cooler
Displays | ASUS PB278Q 27″ 2560×1440
| Dell P2210H 22″ 1920×1080 x 3
Et cetera | Windows 7 Professional 64-bit

Notes About Our High-end System

The goal of our performance content is to show you as accurately as possible how one product compares to another – after all, you’re coming to us for advice, so we want to make sure we’re giving you the best possible information. Typically, one major step we take in ensuring that our performance results are accurate is to make sure that our test systems are free of all possible bottlenecks, and for that, high-end components must be used.

In the case of our graphics card test system, the processor chosen has six cores and is overclocked far beyond reference clocks. Most games nowadays are not heavily CPU-bound, but by using such a chip, we feel that we completely rule it out as a potential bottleneck. The same can be said for the use of an SSD (as opposed to latency-ridden mechanical storage), and even our memory, which is clocked at the comfortable speed of DDR3-2133.

Why this matters to you: Our test PC is high-end, and it’s very likely that you’d encounter a bottleneck sooner than we would. Our goal is to eliminate all possible bottlenecks, whereas yours is to build the PC you need. In our case, we need to go overboard to attain as accurate a representation of a graphics card’s performance as possible.

If your PC has at least a modern (~2-year-old) quad-core or better processor and at least 8GB of fast memory (DDR3-1866+), the chances of you running into a bottleneck with today’s hottest games are admittedly low. If you’re using lower-end gear, you can absolutely expect that the rest of your system could be a bottleneck. It should be noted, though, that if you’re seeking out a lower-end graphics card, the importance of a bottleneck would of course be lessened.

Unfortunately, we’re not able to test a single card on multiple PC configurations; each single card we test takes at least 3 hours to test, with another 2 hours added on for each additional resolution, and at least another 1~2 hours for our Best Playable results (for up to 11 hours of mostly hands-on testing for a high-end model).
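The per-card time quoted above can be tallied quickly. A sketch assuming four tested resolutions (the three beyond 1080p each add two hours):

```python
# Sanity check on the per-card testing time: a base run at one resolution,
# three additional resolutions, and a Best Playable pass of 1~2 hours.
base_hours = 3
extra_resolutions = 3       # 2560x1440, 4800x900, 5760x1080 beyond 1080p
hours_per_resolution = 2
best_playable_hours = (1, 2)

low = base_hours + extra_resolutions * hours_per_resolution + best_playable_hours[0]
high = base_hours + extra_resolutions * hours_per_resolution + best_playable_hours[1]
print(f"{low}~{high} hours per high-end card")  # 10~11 hours
```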

Please bear all of this in mind. If you’re unsure if your PC could prove to be a bottleneck, our comments section exists for such questions.

When preparing our test-beds for any type of performance testing, we follow these guidelines:

General Guidelines

To aid with the goal of keeping accurate and repeatable results, we prevent certain Windows 7 services from starting up at boot. These services have a tendency to start up in the background without notice, potentially skewing test results. For example, disabling “Windows Search” turns off the OS’ indexing, which can at times utilize the hard drive and memory more than we’d like.

The services we disable are:

For further fine-tuning, we also use Windows’ “Classic” desktop theme, which gets rid of the transparency that can sometimes utilize a GPU in the background.

Vendor Favoritism

Sometimes, either AMD or NVIDIA will work with a game studio to help its development process along. As history has proven, this often results in a game that is tuned better for one vendor over the other, although the tides can change over time, resulting in the competing vendor offering the better experience.

One of our goals is to provide as neutral a benchmarking suite as possible, so while it’s impossible to avoid games sponsored by either of these companies, we can at least make an effort to achieve a blended list. As it stands, our current game list and their partners are:

(AMD) – Battlefield 4
(AMD) – Crysis 3
(AMD) – Sleeping Dogs
(NVIDIA) – Assassin’s Creed IV: Black Flag
(NVIDIA) – Metro: Last Light
(NVIDIA) – Splinter Cell Blacklist
(Neutral) – GRID 2
(Neutral) – Total War: SHOGUN 2

With that, let’s move on to a quick look at the game settings we use in our testing:

Assassin’s Creed IV: Black Flag

Assassin's Creed IV Black Flag Benchmark Settings

Battlefield 4

Battlefield 4 Benchmark Settings

Note: The “High” preset is used for multi-monitor configurations.

Crysis 3

Crysis 3 Benchmark Settings

Crysis 3 Benchmark Settings

Note: The “Medium” preset is used for multi-monitor configurations.

GRID 2

GRID 2 Benchmark Settings

GRID 2 Benchmark Settings

GRID 2 Benchmark Settings

Metro Last Light

Metro Last Light Benchmark Settings

Sleeping Dogs

Sleeping Dogs Benchmark Settings

Sleeping Dogs Benchmark Settings

Splinter Cell Blacklist

Splinter Cell Blacklist Benchmark Settings

Splinter Cell Blacklist Benchmark Settings

Total War: SHOGUN 2

Total War SHOGUN 2 Benchmark Settings

Unigine Heaven

Unigine Heaven 4 Benchmark Settings

Game Tests: Assassin’s Creed IV: Black Flag, Battlefield 4

Given the sheer number of titles in the Assassin’s Creed series, it’s a little hard to believe that the first game came out a mere six years ago. You could definitely say that Ubisoft hit the ball out of the park with this one. To date, we’ve never considered an AC game for benchmarking, but given the number of graphical goodies featured in the PC version of Black Flag, that trend now ends.

Assassin's Creed IV Black Flag - 1920x1080

Manual Run-through: The saved game starts us not far from the beginning of the game under a small church which can be climbed to synchronize with the environment. To kick things off, I scale this church and rotate the camera around once, making sure to take in the beautiful landscape; then, I climb back down and run all the way to the water (the top of this small church and the water can be seen in the above screenshot).

Note: For some reason, Ubisoft decided to cap the framerate at 60 FPS in Black Flag even when VSync is turned off. For most games, this would ruin the chance of it appearing in our benchmarking, but because the game is graphically intensive, I’ve chosen to stick with it; at higher resolutions, reaching 60 FPS is a perk that will belong only to high-end graphics cards.

AMD Radeon R9 290X and NVIDIA GeForce GTX 780 Ti - Assassin's Creed IV: Black Flag (1920x1080)

AMD Radeon R9 290X and NVIDIA GeForce GTX 780 Ti - Assassin's Creed IV: Black Flag (2560x1440)

AMD Radeon R9 290X and NVIDIA GeForce GTX 780 Ti - Assassin's Creed IV: Black Flag (4800x900)

AMD Radeon R9 290X and NVIDIA GeForce GTX 780 Ti - Assassin's Creed IV: Black Flag (5760x1080)

While 1080p might be the de facto resolution, it’s almost an insult to run it on a GPU like the 290X or 780 Ti. In fact, even 1440p proved to be no issue for either card – and believe me, at maxed-out settings, this game is gorgeous. When we move into multi-monitor territory, we can begin to see a change; 4800×900 performance would be suitable enough for most, but at 5760×1080, lowering a setting or two will be necessary to bring the game back up to ~60 FPS.

Battlefield 4

Thanks to the fact that DICE cares more about PC gaming than a lot of developers, the Battlefield series tends to give us titles that are well worth benchmarking. Battlefield 3 offered incredible graphics and became a de facto benchmark immediately, so it’s no surprise, then, that BF4 follows right in its footsteps.

Battlefield 4 - 1920x1080

Manual Run-through: The Singapore level is the target here, with the saved game starting us on an airboat that must be driven to shore, where a massive battle is set to take place. I stop recording the framerate once the tank makes its way to the end of this small patch of beach; in all, the run takes about 3 minutes.

AMD Radeon R9 290X and NVIDIA GeForce GTX 780 Ti - Battlefield 4 (1920x1080)

AMD Radeon R9 290X and NVIDIA GeForce GTX 780 Ti - Battlefield 4 (2560x1440)

AMD Radeon R9 290X and NVIDIA GeForce GTX 780 Ti - Battlefield 4 (4800x900)

AMD Radeon R9 290X and NVIDIA GeForce GTX 780 Ti - Battlefield 4 (5760x1080)

Once again, 1080p is an absolute breeze for both GPUs, and in this particular case, the same can be said for our multi-monitor resolution of 4800×900. For most games, 1080p is the easiest resolution to render (of those we test), while 4800×900 and 1440p place about the same. Then there’s 5760×1080, which tends to be in a league of its own. However, with Battlefield 4, 2560×1440 and 5760×1080 perform just about the same – definitely not something I see happen too often.

Game Tests: Crysis 3, GRID 2

When the original Crysis dropped in late 2007, it took no time at all for pundits to coin the phrase, “Can it run Crysis?“, almost to the point of self-parody. At the time, the game couldn’t have its graphics detail maxed-out on even top-of-the-line PCs, and in reality, that’s a great thing. I’d imagine few are opposed to knowing that a game could actually look better down the road as our PCs grow into them. As the series continued, Crytek knew it had a legend to live up to, and fortunately, Crysis 3 (our review) lives up to the original’s legacy.

Crysis 3 - 1920x1080 Single Monitor

Manual Run-through: There’s no particular level in Crysis 3 that I could establish was “better” for benchmarking than another, but I settled on “Red Star Rising” based on the fact that I could perform a run-through with no chance of dying (a great thing in a challenging game like this one). The level starts us in a derelict building, where I traverse a broken pipe to make it over to one rooftop and then another. I eventually hit the ground after taking advantage of a zipline, and make my way down to a river, where I scurry past a number of enemies to the end spot beneath a building.

AMD Radeon R9 290X and NVIDIA GeForce GTX 780 Ti - Crysis 3 (1920x1080)

AMD Radeon R9 290X and NVIDIA GeForce GTX 780 Ti - Crysis 3 (2560x1440)

AMD Radeon R9 290X and NVIDIA GeForce GTX 780 Ti - Crysis 3 (4800x900)

AMD Radeon R9 290X and NVIDIA GeForce GTX 780 Ti - Crysis 3 (5760x1080)

At 1080p, there’s an ample delta between the two cards, but as the resolution increases, that delta tightens quickly. At 5760×1080, the difference is a mere 2 FPS for the average, and 5 FPS for the minimum. It’s important to note, though, that while multi-monitor performance was very good on both cards, I use the “Medium” preset for those resolutions; at “High”, the framerates will definitely suffer.

GRID 2

For those who appreciate racing games that are neither too realistic nor too arcade-like, there’s GRID. In GRID 2 (review), the ultimate goal is to build a racing empire, starting from square one. Unlike most racing titles that have some sort of career, the goal here isn’t to earn cash, but fans. Whether you’re racing around Abu Dhabi’s Yas Marina or tearing through a gorgeous Cote d’Azur coastline, your goal is simple: To impress.

GRID 2 - 1920x1080 Single Monitor

Manual Run-through: The track chosen for my benchmarking is Miami (Ocean Drive). It’s a simple track overall, which is one of the reasons I chose it, and also the reason I choose to do just a single lap (I crash, often, and that affects both the results and my patience). Unlike most games in the suite which I test twice over (save for an oddity in the results), I race this one lap three times over.

AMD Radeon R9 290X and NVIDIA GeForce GTX 780 Ti - GRID 2 (1920x1080)

AMD Radeon R9 290X and NVIDIA GeForce GTX 780 Ti - GRID 2 (2560x1440)

AMD Radeon R9 290X and NVIDIA GeForce GTX 780 Ti - GRID 2 (4800x900)

AMD Radeon R9 290X and NVIDIA GeForce GTX 780 Ti - GRID 2 (5760x1080)

In a bit of a shakeup, AMD surpassed NVIDIA slightly at 5760×1080, while NVIDIA won the other rounds. Overall, though, 55~57 FPS at 5760×1080 with maximum detail? It’s hard to complain about that.

Game Tests: Metro Last Light, Sleeping Dogs

Crysis has become infamous for punishing even top-end systems, but let’s be fair: The Metro series matches, if not exceeds, that appetite for graphical horsepower. That was proven by the fact that we used Metro 2033 in our testing for a staggering three years – only to replace it with its sequel, Last Light. I’m not particularly a fan of this series, but I am in awe of its graphics, even at modest settings.

Metro Last Light - 1920x1080 Single Monitor

Manual Run-through: Because this game is a real challenge to benchmark, both due to variability in the results and its raw difficulty, I choose to use the built-in benchmark here, but rely on Fraps to give me more accurate results.

Note: Metro Last Light‘s built-in benchmark is not representative of the entire game; some levels will punish a GPU much worse than this benchmark will (namely, “The Chase”, which has lots of smoke and explosions). What this means is that while these settings might suffice for much of the game, there might be instances where the performance degrades enough during a certain chapter or portion of a chapter to force a graphics setting tweak.

AMD Radeon R9 290X and NVIDIA GeForce GTX 780 Ti - Metro Last Light (1920x1080)

AMD Radeon R9 290X and NVIDIA GeForce GTX 780 Ti - Metro Last Light (2560x1440)

AMD Radeon R9 290X and NVIDIA GeForce GTX 780 Ti - Metro Last Light (4800x900)

AMD Radeon R9 290X and NVIDIA GeForce GTX 780 Ti - Metro Last Light (5760x1080)

The performance similarities between these two cards in this title are almost eerie. We might as well call them exact.

Sleeping Dogs

Many have called Sleeping Dogs (our review) the “Asian Grand Theft Auto“, but the game does a lot of things differently that help it stand out from the crowd. For example, in lieu of supplying the player with a gazillion guns, Sleeping Dogs focuses heavily on hand-to-hand combat. There are also many collectibles that can be found to help upgrade your character and unlock special fighting abilities – and if you happen to enjoy an Asian atmosphere, this game should fit the bill.

Sleeping Dogs - 1920x1080 Single Monitor

Manual Run-through: The run here takes place during the chapter “Amanda”, on a dark, dank night. The saved game begins at the first apartment in the game (in North Point), though that’s not where I begin capturing the framerate. Instead, I first request my motorcycle from the garage. Once set, I begin recording the framerate and drive along a specific path all the way to Aberdeen, taking about two minutes.

AMD Radeon R9 290X and NVIDIA GeForce GTX 780 Ti - Sleeping Dogs (1920x1080)

AMD Radeon R9 290X and NVIDIA GeForce GTX 780 Ti - Sleeping Dogs (2560x1440)

AMD Radeon R9 290X and NVIDIA GeForce GTX 780 Ti - Sleeping Dogs (4800x900)

AMD Radeon R9 290X and NVIDIA GeForce GTX 780 Ti - Sleeping Dogs (5760x1080)

Despite being an AMD-targeted game, NVIDIA’s card performs very well here. And, it’s worth noting that a GeForce driver that came out after this testing was conducted promised a performance boost of “up to 18%” in this particular title.

Game Tests: Splinter Cell: Blacklist, Total War: SHOGUN 2

Tom Clancy is responsible for countless video games, but his Splinter Cell series has become something special, with each entry released having been considered “great” overall. The latest in the series, Blacklist, is no exception; thankfully for us, its graphics are fantastic, not to mention intensive. For those who love a stealth element in their games, this is one that shouldn’t be skipped.

RIP, Tom Clancy.

Splinter Cell Blacklist - 1920x1080 Single Monitor

Manual Run-through: From the start of the ‘Safehouse’ level in Benghazi, Libya, we progress through until we reach an apartment building that must be entered – this is where we end the FPS recording.

AMD Radeon R9 290X and NVIDIA GeForce GTX 780 Ti - Splinter Cell: Blacklist (1920x1080)

AMD Radeon R9 290X and NVIDIA GeForce GTX 780 Ti - Splinter Cell: Blacklist (2560x1440)

AMD Radeon R9 290X and NVIDIA GeForce GTX 780 Ti - Splinter Cell: Blacklist (4800x900)

AMD Radeon R9 290X and NVIDIA GeForce GTX 780 Ti - Splinter Cell: Blacklist (5760x1080)

In a repeat of what we saw earlier, NVIDIA comes out ahead in each resolution here except for 5760×1080 – but few would take a difference of 1 FPS too seriously. At the single display resolutions, both cards run the game beautifully.

Total War: SHOGUN 2

Strategy games are well-known for pushing the limits of any system, and few others do this as well as Total War: SHOGUN 2. It fully supports DX11, has huge battlefields to oversee with hundreds or thousands of units, and a ton of graphics options to adjust. It’s quite simply a beast of a game.

Total War: SHOGUN 2 - 1920x1080 Single Monitor

Manual Run-through: SHOGUN 2 is one of the few games in our suite where the built-in benchmark is opted for. Strategy games in particular are very difficult to benchmark, so this is where I become thankful to have the option of using a built-in benchmark.

AMD Radeon R9 290X and NVIDIA GeForce GTX 780 Ti - Total War: SHOGUN 2 (1920x1080)

AMD Radeon R9 290X and NVIDIA GeForce GTX 780 Ti - Total War: SHOGUN 2 (2560x1440)

AMD Radeon R9 290X and NVIDIA GeForce GTX 780 Ti - Total War: SHOGUN 2 (4800x900)

AMD Radeon R9 290X and NVIDIA GeForce GTX 780 Ti - Total War: SHOGUN 2 (5760x1080)

SHOGUN 2 continues what’s largely been a trend for the entire article.

Next up, I’ll tackle our “Best Playable” results.

Best Playable: Single Display

For about as long as GPU-accelerated games have existed, the ideal performance target has been 60 frames-per-second. We owe that to the standard 60Hz monitor, which delivers its best result when the framerate matches its refresh rate. To keep the monitor’s refresh rate and the game’s framerate aligned, and thus avoid visible tearing, VSync should be enabled.
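Another way to look at the 60 FPS target is as a frame-time budget; a quick sketch:

```python
# A 60Hz panel refreshes every ~16.7ms, so a game must finish rendering
# each frame inside that window to avoid dropped or torn frames.
def frame_budget_ms(refresh_hz):
    """Milliseconds available to render each frame at a given refresh rate."""
    return 1000.0 / refresh_hz

for hz in (60, 120, 144):
    print(f"{hz}Hz -> {frame_budget_ms(hz):.2f}ms per frame")
# 60Hz -> 16.67ms per frame
```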

While I believe our Best Playable results will appeal to any gamer, they could prove especially useful to those intrigued by living-room gaming or console replacements. The goal here is simple: With each game, the graphics settings are tweaked to deliver the best possible detail while keeping us as close to 60 FPS on average as possible.

Because our Metro Last Light and Total War: SHOGUN 2 tests are timedemos, and because this kind of testing is time-consuming, I am sticking to six out of the eight games I test with for inclusion here.

Our regular benchmark tests showed that the R9 290X and 780 Ti perform on an equal level, with NVIDIA getting a slight edge in most tests. That being the case, most of the Best Playable settings are going to be identical between the cards; this is especially true on this page, since we’re dealing with only a single display. Because we’re dealing with just two cards, one from AMD and the other from NVIDIA, the results here are in alphabetical order, as in the rest of our results.

Assassin’s Creed IV: Black Flag – Best Playable
2560×1440 | Minimum | Average
AMD Radeon R9 290X | 51 | 61
NVIDIA GeForce GTX 780 Ti | 52 | 61

Graphics settings (both cards): Environment: Very High; Shadow: High; Texture: High; Reflection: High; Anti-aliasing: FXAA; God Rays: High; Ambient Occlusion: SSAO; Volumetric Fog: On; Motion Blur: On

Assassin's Creed IV Black Flag - Best Playable - AMD Radeon R9 290X

Assassin's Creed IV Black Flag - Best Playable - NVIDIA GeForce GTX 780 Ti

With AC IV's foolish design decision of a 60 FPS cap, it should come as no surprise that we're able to retain the settings used in our normal benchmarking and still achieve playable framerates. Given the results here, neither card gets an edge, though it must be said that NVIDIA's screenshot has a cat in it. So there's that.

 Battlefield 4
2560×1440 (Minimum / Average FPS)

AMD Radeon R9 290X: 43 / 60
Graphics Settings & Screenshot:
Texture Quality: Ultra | Texture Filtering: Ultra
Lighting: Ultra | Effects: Ultra
Post Processing: Ultra | Mesh: Ultra
Terrain: Ultra | Terrain Decoration: Ultra
Anti-aliasing Deferred: 2x MSAA | Anti-aliasing Post: Medium
Ambient Occlusion: SSAO
Battlefield 4 - Best Playable - AMD Radeon R9 290X

NVIDIA GeForce GTX 780 Ti: 43 / 62
Graphics Settings & Screenshot:
Texture Quality: Ultra | Texture Filtering: Ultra
Lighting: Ultra | Effects: Ultra
Post Processing: Ultra | Mesh: Ultra
Terrain: Ultra | Terrain Decoration: Ultra
Anti-aliasing Deferred: 2x MSAA | Anti-aliasing Post: Medium
Ambient Occlusion: SSAO
Battlefield 4 - Best Playable - NVIDIA GeForce GTX 780 Ti

The R9 290X and 780 Ti might be extremely powerful, but 2560×1440 can still prevent you from maxing out in-game settings – such is the case here, with Battlefield 4. Fortunately, the tweaks needed to hit 60 FPS are minimal: Reduce the anti-aliasing slightly, and move from HBAO to SSAO.

 Crysis 3
2560×1440 (Minimum / Average FPS)

AMD Radeon R9 290X: 41 / 61
Graphics Settings & Screenshot:
Anti-aliasing: FXAA | Texture: High
Effects: High | Object: High
Particles: High | Post Processing: High
Shading: High | Shadows: High
Water: High | Anisotropic Filtering: x16
Motion Blur: Medium | Lens Flares: Yes
Crysis 3 - Best Playable - AMD Radeon R9 290X

NVIDIA GeForce GTX 780 Ti: 42 / 66
Graphics Settings & Screenshot:
Anti-aliasing: FXAA | Texture: High
Effects: High | Object: High
Particles: High | Post Processing: High
Shading: High | Shadows: High
Water: High | Anisotropic Filtering: x16
Motion Blur: Medium | Lens Flares: Yes
Crysis 3 - Best Playable - NVIDIA GeForce GTX 780 Ti

As with Assassin’s Creed IV: Black Flag, the “Best Playable” results for Crysis 3 match those from the regular benchmarking. These are not the maximum settings the game can utilize, but for the Ultra profile to be playable at this resolution, multi-GPU is required.

 GRID 2
2560×1440 (Minimum / Average FPS)

AMD Radeon R9 290X: 67 / 76
Graphics Settings & Screenshot:
Multisampling: 4x MSAA | Night Lighting: High
Shadows: Ultra | Advanced Fog: On
Particles: Ultra | Crowd: Ultra
Cloth: High | Ambient Occlusion: Ultra
Soft Ambient Occlusion: On | Ground Cover: High
Vehicle Details: High | Trees: Ultra
Objects: Ultra | Vehicle Reflections: Ultra
Water: High | Post Process: High
Skidmarks: On | Advanced Lighting: On
Global Illumination: On | Anisotropic Filtering: Ultra
GRID 2 - Best Playable - AMD Radeon R9 290X

NVIDIA GeForce GTX 780 Ti: 73 / 83
Graphics Settings & Screenshot:
Multisampling: 4x MSAA | Night Lighting: High
Shadows: Ultra | Advanced Fog: On
Particles: Ultra | Crowd: Ultra
Cloth: High | Ambient Occlusion: Ultra
Soft Ambient Occlusion: On | Ground Cover: High
Vehicle Details: High | Trees: Ultra
Objects: Ultra | Vehicle Reflections: Ultra
Water: High | Post Process: High
Skidmarks: On | Advanced Lighting: On
Global Illumination: On | Anisotropic Filtering: Ultra
GRID 2 - Best Playable - NVIDIA GeForce GTX 780 Ti

Once again, the settings used in the normal benchmarking get carried over here, and overall, the results are impressive for both cards.

 Sleeping Dogs
2560×1440 (Minimum / Average FPS)

AMD Radeon R9 290X: 59 / 70
Graphics Settings & Screenshot:
Anti-aliasing: High | High-res Textures: On
Shadow Resolution: High | Shadow Filtering: High
Ambient Occlusion: High | Motion Blur: High
World Density: Extreme
Sleeping Dogs - Best Playable - AMD Radeon R9 290X

NVIDIA GeForce GTX 780 Ti: 54 / 71
Graphics Settings & Screenshot:
Anti-aliasing: High | High-res Textures: On
Shadow Resolution: High | Shadow Filtering: High
Ambient Occlusion: High | Motion Blur: High
World Density: Extreme
Sleeping Dogs - Best Playable - NVIDIA GeForce GTX 780 Ti

I hate to sound like a broken record, but with Sleeping Dogs, the same settings as used in the normal benchmarking can be retained. It could be noted that AMD’s card proved 5 FPS better at the minimum side of things, but overall both cards offer nearly identical performance.

 Tom Clancy’s Splinter Cell: Blacklist
2560×1440 (Minimum / Average FPS)

AMD Radeon R9 290X: 54 / 65
Graphics Settings & Screenshot:
Texture Detail: Ultra | Shadow: Ultra
Parallax: On | Tessellation: On
Texture Filtering: 16x | Ambient Occlusion: Field AO & HBAO+
Anti-aliasing: FXAA
Tom Clancy's Splinter Cell Blacklist - Best Playable - AMD Radeon R9 290X

NVIDIA GeForce GTX 780 Ti: 58 / 71
Graphics Settings & Screenshot:
Texture Detail: Ultra | Shadow: Ultra
Parallax: On | Tessellation: On
Texture Filtering: 16x | Ambient Occlusion: Field AO & HBAO+
Anti-aliasing: FXAA
Tom Clancy's Splinter Cell Blacklist - Best Playable - NVIDIA GeForce GTX 780 Ti

I can't wait to get to the next page, because it's a little boring to say the same thing over and over. So with that, let's move on to what's considered "best" for 5760×1080.

Best Playable: Multi-Display

With the results seen on the previous page, we found that both AMD's Radeon R9 290X and NVIDIA's GeForce GTX 780 Ti handle most of today's games at 1440p, with great detail, just fine. With either card at that resolution, you can basically expect to max out a game's graphics settings and still get livable framerates.

But, at 2560×1440, we’re dealing with 3.68 megapixels. What happens if we widen the viewport, and then bump the resolution up to 6.22 megapixels? The goal of this page is to figure that out.
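The pixel counts quoted above are easy to verify; a quick sanity-check sketch (the helper name is my own):

```python
# Pixel counts behind the two test resolutions discussed above.
def megapixels(width: int, height: int) -> float:
    return width * height / 1_000_000

print(megapixels(2560, 1440))  # 3.6864 MP for a single 1440p display
print(megapixels(5760, 1080))  # 6.2208 MP across three 1080p panels
print(megapixels(5760, 1080) / megapixels(2560, 1440))  # 1.6875x the pixels to push
```

In other words, the jump to 3×1 1080p asks each card to shade roughly 69% more pixels per frame, which is why detail levels have to give.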

To re-state something said on the previous page: Our regular benchmark tests showed that the R9 290X and 780 Ti perform on an equal level, with NVIDIA getting the slight edge in most tests. That being the case, most of the Best Playable settings are going to be identical between both cards. Because we’re dealing with just two cards; one from AMD, and the other from NVIDIA, the results here are in alphabetical order like in the rest of our results.

 Assassin’s Creed IV: Black Flag
3×1 Monitor (5760×1080) (Minimum / Average FPS)

AMD Radeon R9 290X: 44 / 56
Graphics Settings & Screenshot:
Environment: High | Shadow: Normal
Texture: High | Reflection: Normal
Anti-aliasing: FXAA | God Rays: Low
Ambient Occlusion: Off | Volumetric Fog: On
Motion Blur: On
Assassin's Creed IV Black Flag - Best Playable Multi-Monitor - AMD Radeon R9 290X

NVIDIA GeForce GTX 780 Ti: 48 / 59
Graphics Settings & Screenshot:
Environment: High | Shadow: Normal
Texture: High | Reflection: Normal
Anti-aliasing: FXAA | God Rays: Low
Ambient Occlusion: Off | Volumetric Fog: On
Motion Blur: On
Assassin's Creed IV Black Flag - Best Playable Multi-Monitor - NVIDIA GeForce GTX 780 Ti

With this game at 1440p, I was able to stick with the same settings I use in normal benchmarking, but to achieve truly playable framerates at 5760×1080, some detail had to be sacrificed. On both cards, AO was cut, God Rays was decreased to Low, and Environment detail was reduced from Very High to High. When all is said and done, we get fairly close to our 60 FPS goal; actually hitting it would mean cutting detail levels that are nice to have, and given how smooth these framerates felt, it wouldn't be worth it.

 Battlefield 4
3×1 Monitor (5760×1080) (Minimum / Average FPS)

AMD Radeon R9 290X: 49 / 61
Graphics Settings & Screenshot:
Texture Quality: High | Texture Filtering: High
Lighting: High | Effects: High
Post Processing: High | Mesh: High
Terrain: High | Terrain Decoration: High
Anti-aliasing Deferred: Off | Anti-aliasing Post: Medium
Ambient Occlusion: SSAO
Battlefield 4 - Best Playable Multi-Monitor - AMD Radeon R9 290X

NVIDIA GeForce GTX 780 Ti: 43 / 55
Graphics Settings & Screenshot:
Texture Quality: High | Texture Filtering: High
Lighting: High | Effects: High
Post Processing: High | Mesh: High
Terrain: High | Terrain Decoration: High
Anti-aliasing Deferred: Off | Anti-aliasing Post: Medium
Ambient Occlusion: HBAO
Battlefield 4 - Best Playable Multi-Monitor - NVIDIA GeForce GTX 780 Ti

These results are a bit interesting, because it appears that AMD comes out ahead. In actuality, I retained HBAO on the NVIDIA card, something I was unable to do on the AMD card. That does sacrifice some frames, but like with AC IV, I didn’t actually feel the difference in real gameplay. If that magical 60 FPS must be hit, SSAO can be used. Note also that for these framerates, anti-aliasing had to be disabled.

 Crysis 3
3×1 Monitor (5760×1080) (Minimum / Average FPS)

AMD Radeon R9 290X: 37 / 55
Graphics Settings & Screenshot:
Anti-aliasing: FXAA | Texture: Medium
Effects: Medium | Object: Medium
Particles: Medium | Post Processing: Medium
Shading: Medium | Shadows: Medium
Water: Medium | Anisotropic Filtering: x16
Motion Blur: Medium | Lens Flares: Yes
Crysis 3 - Best Playable Multi-Monitor - AMD Radeon R9 290X

NVIDIA GeForce GTX 780 Ti: 42 / 57
Graphics Settings & Screenshot:
Anti-aliasing: FXAA | Texture: Medium
Effects: Medium | Object: Medium
Particles: Medium | Post Processing: Medium
Shading: Medium | Shadows: Medium
Water: Medium | Anisotropic Filtering: x16
Motion Blur: Medium | Lens Flares: Yes
Crysis 3 - Best Playable Multi-Monitor - NVIDIA GeForce GTX 780 Ti

It about makes me want to cry whenever I have to choose "Medium" as a detail level, but in Crysis 3, Medium actually looks like "High" would in many other games; a quick look at either of the screenshots above verifies that. In the end, both cards handled the game very well at these detail levels. We didn't quite reach the magical 60 FPS, but the difference will be hard to notice in real gameplay – especially with motion blur in use.

 GRID 2
3×1 Monitor (5760×1080) (Minimum / Average FPS)

AMD Radeon R9 290X: 54 / 60
Graphics Settings & Screenshot:
Multisampling: 4x MSAA | Night Lighting: High
Shadows: Ultra | Advanced Fog: On
Particles: Ultra | Crowd: Ultra
Cloth: High | Ambient Occlusion: Ultra
Soft Ambient Occlusion: On | Ground Cover: High
Vehicle Details: High | Trees: Ultra
Objects: Ultra | Vehicle Reflections: Ultra
Water: High | Post Process: High
Skidmarks: On | Advanced Lighting: On
Global Illumination: Off | Anisotropic Filtering: Ultra
GRID 2 - Best Playable Multi-Monitor - AMD Radeon R9 290X

NVIDIA GeForce GTX 780 Ti: 57 / 63
Graphics Settings & Screenshot:
Multisampling: 4x MSAA | Night Lighting: High
Shadows: Ultra | Advanced Fog: On
Particles: Ultra | Crowd: Ultra
Cloth: High | Ambient Occlusion: High
Soft Ambient Occlusion: On | Ground Cover: High
Vehicle Details: High | Trees: Ultra
Objects: Ultra | Vehicle Reflections: Ultra
Water: High | Post Process: High
Skidmarks: On | Advanced Lighting: On
Global Illumination: Off | Anisotropic Filtering: Ultra
GRID 2 - Best Playable Multi-Monitor - NVIDIA GeForce GTX 780 Ti

Seeing as GRID 2 isn't quite as graphically impressive as the other games in our suite, not much had to be tweaked here. The R9 290X in particular came out ahead, as only Global Illumination had to be disabled. The 780 Ti, on the other hand, had to take that same change, along with a decrease in Ambient Occlusion from Ultra to High. Modest changes overall.

 Sleeping Dogs
3×1 Monitor (5760×1080) (Minimum / Average FPS)

AMD Radeon R9 290X: 59 / 73
Graphics Settings & Screenshot:
Anti-aliasing: Normal | High-res Textures: On
Shadow Resolution: High | Shadow Filtering: High
Ambient Occlusion: High | Motion Blur: High
World Density: Extreme
Sleeping Dogs - Best Playable Multi-Monitor - AMD Radeon R9 290X

NVIDIA GeForce GTX 780 Ti: 51 / 78
Graphics Settings & Screenshot:
Anti-aliasing: Normal | High-res Textures: On
Shadow Resolution: High | Shadow Filtering: High
Ambient Occlusion: High | Motion Blur: High
World Density: Extreme
Sleeping Dogs - Best Playable Multi-Monitor - NVIDIA GeForce GTX 780 Ti

With Sleeping Dogs, there’s very little that can be tweaked graphically – at least, if we’re talking about settings that will dramatically affect framerate. The #1 cause of bogged-down gameplay here is with the anti-aliasing, and that reared its ugly head here. Honestly though, the drop from High to Normal AA is highly unlikely to even be noticed, thanks to the game’s bizarre AA implementation (even Extreme AA shows jaggies on many surfaces). If only an FXAA option were available.

 Tom Clancy’s Splinter Cell: Blacklist
3×1 Monitor (5760×1080) (Minimum / Average FPS)

AMD Radeon R9 290X: 50 / 65
Graphics Settings & Screenshot:
Texture Detail: Ultra | Shadow: Ultra
Parallax: On | Tessellation: On
Texture Filtering: 16x | Ambient Occlusion: Field AO
Anti-aliasing: FXAA
Tom Clancy's Splinter Cell Blacklist - Best Playable Multi-Monitor - AMD Radeon R9 290X

NVIDIA GeForce GTX 780 Ti: 53 / 71
Graphics Settings & Screenshot:
Texture Detail: Ultra | Shadow: Ultra
Parallax: On | Tessellation: On
Texture Filtering: 16x | Ambient Occlusion: Field AO
Anti-aliasing: FXAA
Tom Clancy's Splinter Cell Blacklist - Best Playable Multi-Monitor - NVIDIA GeForce GTX 780 Ti

For Blacklist, both cards had to drop HBAO. Admittedly, this doesn’t make what I’d call a drastic difference to the aesthetic (though it’s obvious when compared back-to-back), and when sub-60 FPS framerates are the reality, it’s certainly not worth keeping.

Synthetic Tests: Futuremark 3DMark, 3DMark 11, Unigine Heaven 4.0

We don’t make it a point to seek out automated gaming benchmarks, but we do like to get a couple in that anyone reading this can run themselves. Of these, Futuremark’s name leads the pack, as its benchmarks have become synonymous with the activity. Plus, it does help that the company’s benchmarks stress PCs to their limit – and beyond.

3DMark

While Futuremark’s latest GPU test suite is 3DMark, I’m also including results from 3DMark 11 as it’s still a common choice among benchmarkers.

AMD Radeon R9 290X and NVIDIA GeForce GTX 780 Ti - Futuremark 3DMark

AMD Radeon R9 290X and NVIDIA GeForce GTX 780 Ti - Futuremark 3DMark 11 - Performance

AMD Radeon R9 290X and NVIDIA GeForce GTX 780 Ti - Futuremark 3DMark 11 - Extreme

The three different 3DMark tests have a difficult time agreeing on whether it’s the R9 290X or 780 Ti that’s superior. In 3DMark (2013), the 780 Ti gets the smallest of edges, while the Performance test in 3DMark 11 shows the 290X as the better card. Then, things get changed-up once again with the Extreme test in 3DMark 11, with the 780 Ti suddenly in front. Pfft, synthetics. Can’t live with them, can’t live without them.

Unigine Heaven 4.0

Unigine might not have as established a name as Futuremark, but its products are nothing short of “awesome”. The company’s main focus is its game engine, but a by-product of that is its benchmarks, which are used to both give benchmarkers another great tool to take advantage of, and also to show-off what its engine is capable of. It’s a win-win all-around.

Unigine Heaven 4.0

The biggest reason that the company’s “Heaven” benchmark is so relied-upon by benchmarkers is that both AMD and NVIDIA promote it for its heavy use of tessellation. Like 3DMark, the benchmark here is overkill by design, so results are not going to directly correlate with real gameplay. Rather, they showcase which card models can better handle both DX11 and its GPU-bogging features.

AMD Radeon R9 290X and NVIDIA GeForce GTX 780 Ti - Unigine Heaven 4.0 (1920x1080)

The 290X managed a slightly better minimum, but NVIDIA’s strength in tessellation really shines here: 86 vs. 72 FPS. That’s hardly a small difference.
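To put that Heaven gap in relative terms, a quick calculation on the two averages quoted above:

```python
# Relative gap between the two Unigine Heaven 4.0 averages above.
nvidia_avg, amd_avg = 86, 72
gap_pct = (nvidia_avg - amd_avg) / amd_avg * 100
print(round(gap_pct, 1))  # 19.4 -> the 780 Ti averages ~19% higher
```

A near-20% lead in a tessellation-heavy workload is well outside the run-to-run noise of this benchmark.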

Power & Temperatures, Final Thoughts

To test graphics cards for both their power consumption and temperature at load, we utilize a couple of different tools. On the hardware side, we use a trusty Kill-a-Watt power monitor which our GPU test machine plugs into directly. For software, we use Futuremark’s 3DMark to stress-test the card, and AIDA64 to monitor and record the temperatures.

To test, the general area around the chassis is checked with a temperature gun, and the average temperature is recorded. Once that's established, the PC is turned on and left to sit idle for ten minutes. At that point, AIDA64 is opened along with 3DMark. We then kick off a full suite run and watch the Kill-a-Watt as the test reaches its most intensive interval (GT 1) to record the load wattage.

AMD Radeon R9 290X and NVIDIA GeForce GTX 780 Ti - Temperatures

AMD Radeon R9 290X and NVIDIA GeForce GTX 780 Ti - Power Consumption

As with the performance results we've seen throughout this article, NVIDIA edges AMD out here. On the power front, the differences are small enough to call unimportant, but it's hard to ignore the 91°C peak of the R9 290X. While AMD states that 90°C+ is no problem as far as hardware stability goes, it's not a value that inspires the utmost confidence – even 80°C is pushing it, and makes liquid cooling look all the more tempting.

There are some things that neither of these charts can show, however, such as noise. The R9 290X in particular not only runs hot, but it runs loud as well. So loud, in fact, that it gave me flashbacks of the GTX 480 – both cards can sound like miniature jet engines while gaming. For those who game without headphones, or at low volume, that sound is going to get annoying, believe me.

AMD understands this as well, which is why the “Quiet” mode switch is included. But, given the fact that this switch is on the card itself, it’s not exactly the most convenient, and past that, it causes the card to run even hotter. That’s hardly ideal.

The thing to bear in mind with this issue though is that it’s exclusive to R9 290Xs that utilize the reference cooler. I’d never expect these sorts of temperature / noise level issues with third-party models. In a matchup between AMD’s R9 290X and NVIDIA’s GTX 780 Ti reference coolers, NVIDIA wins hands-down. R9 290Xs with the reference cooler are common, so I’d recommend steering clear.

Don't care about noise or heat? Note that an inefficient cooler has one more downfall: dropped performance. If the R9 290X doesn't have much room to move air around – even outside the chassis – it can overheat, resulting in reduced clocks. Normally, the test PC I use is situated nearly a foot in front of a wall, and in between two desks – something I wouldn't consider too atypical of a normal setup.

In that configuration, there wasn’t enough room for air to move outside of the chassis, so the 1GHz target clock speed of the 290X sometimes dropped to 970MHz or 937MHz. In the exact same chassis placement, the 780 Ti did not experience this issue; instead, it sometimes went above its normal clock speed. When I hauled the PC out a couple of feet further, the 290X never dropped below 1GHz – and at the same time, it didn’t get quite as loud. Food for thought, especially if you’re planning to go the 290X route, but again, I wouldn’t expect these issues to creep up with third-party cards.
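Assuming performance scales roughly linearly with core clock (a reasonable first-order approximation for a GPU-bound game), the cost of that throttling is easy to estimate:

```python
# Worst-case clock drop observed on the 290X in the cramped placement.
target_mhz, throttled_mhz = 1000, 937
loss_pct = (target_mhz - throttled_mhz) / target_mhz * 100
print(round(loss_pct, 1))  # 6.3 -> up to ~6% of clock (and roughly that much framerate) lost
```

Not catastrophic, but it means two identically configured 290X systems can benchmark measurably differently based on nothing more than chassis placement.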

Final Thoughts

Some articles are difficult to wrap up, and this one is a perfect example. If the R9 290X were priced as it should be, drumming up a conclusion would be a simple affair. Of course, that's not the case. The R9 290X costs $150 more than it should, and given all we've learned through the course of this article, that puts the card in an uncomfortable spot – which means NVIDIA's ahead. If we all woke up tomorrow and the R9 290X were back at its $549 price-point, though, the conclusion would change.

While it almost feels unfair to AMD to even compare these two cards, the fact of the matter is that both retail for around $700 right now. Things could change, but we can’t expect them to. If something happens and coin miners begin unloading their cards en masse, that might make AMD’s offerings look even more attractive than they do right now, since you’d likely be able to score a card for cheap (on the flipside, this wouldn’t be an ideal situation for AMD).

AMD Radeon R9 290X and NVIDIA GeForce GTX 780 Ti - Domination

At this $700 price-point, it's hard to argue that NVIDIA doesn't come out clean ahead. Things change if we're talking about mining efficiency, but on every other front, NVIDIA's package is better: The 780 Ti performs better overall, runs cooler and quieter, and boasts a couple of features I consider to be awesome (ShadowPlay, for starters). At the moment, AMD's best unique features are going to take some time to establish themselves, and even then, Mantle is likely to be of limited use to someone running a high-end AMD card (because they likely also have a decent CPU).

Looking at the flipside, imagine if the R9 290X were being sold at this moment for $549. If that were the case, it would be a heck of a lot more appealing: we'd be talking about a card that costs $150 less than the 780 Ti. The card might then run hotter and louder, but its overall performance would help negate those cons.

At that point, the 780 Ti would still look impressive based on its efficient cooler, lower noise, and robust featureset – but that might not matter to some people. For me, it does, and I’d wager that the extra $150 would be worth it for the 780 Ti. You might disagree, and if you don’t care about NVIDIA’s unique features, you probably would.

At the end of the day, NVIDIA gets the definite nod here – the 780 Ti is simply a fantastic card. AMD’s is no slouch, but it has enough caveats to make its current pricing a major turn-off – unless you’re a coin miner, of course.

AMD Radeon R9 290X


Pros

Cons

NVIDIA GeForce GTX 780 Ti


Pros

Cons

NVIDIA GeForce GTX 780 Ti - Techgage Editor's Choice

Copyright © 2005-2020 Techgage Networks Inc. - All Rights Reserved.