AMD Radeon R9 290X & NVIDIA GeForce GTX 780 Ti Review

AMD Radeon R9 290X and NVIDIA GeForce 780 Ti
by Rob Williams on March 3, 2014 in Video Cards

Nearly every battle in the GPU Wars is hotly contested. From performance to appraisals of value, AMD and NVIDIA engage in apparent mortal combat with each generation of GPU. The current generation, though, sees a clear-cut winner in most categories. So did Team Red win, or did Team Green? Read on to find out!


At the moment, both AMD’s and NVIDIA’s top enthusiast offerings hover around the $700 price-point, and as the title of this article suggests, I’m going to be taking a look at both of them in some depth here.

But before I go further, there are a couple of things that need tackling. First, the GTX 780 Ti is not NVIDIA’s highest-end offering; that distinction belongs to the TITAN Black. However, that card isn’t targeted at the same sort of consumer that the 780 Ti is; those who opt for TITAN Black will be using multiple monitors (or a 4K monitor) and perhaps be able to take advantage of the vastly-improved double-precision performance the card offers. So, ignoring TITAN Black entirely, both AMD’s and NVIDIA’s top-end cards cost about $700; and thus begins the theme for this article.

There’s a reason I didn’t use the word “versus” in the title. The suggested retail price for AMD’s R9 290X is $549, but that’s been inflated to ~$700 in response to coin-mining enthusiasts snatching up AMD’s GCN-based cards as if one lucky card contained a map to the fountain of youth. Etailers don’t want to run out of cards (or, perhaps more accurately, they want to price gouge given the insatiable desire for the cards), and so actual gamers that happen to be AMD fans are the ones feeling the pain.

The fact of the matter is, though, both cards I’ll be taking a look at here can be had right now for about $700. So while I’m being careful not to call this a “versus” article, it can be taken as such. Just bear in mind that if SRP pricing had remained intact, this would be a $549 vs. $700 head-to-head.

AMD Radeon R9 290X and NVIDIA GeForce GTX 780 Ti - Glamour Pose

From the red team, we have the Radeon R9 290X. On paper, this card’s biggest advantage over its main competitor is its 4GB framebuffer – something that those running big resolutions are going to like the look of. Further, as part of the Volcanic Islands series, the R9 290X supports TrueAudio and the Mantle API. Admittedly, both of these technologies are in their infancy, but both have good potential to make a splash once game developers jump on them down the road.

Carried-over features include support for Eyefinity (multi-monitor), CrossFire (multi-GPU), HD3D (3D gaming), PowerTune (advanced power control), and ZeroCore (ultra-low power consumption when idle). Unique to the R9 290/X is the ability to use CrossFire without connecting the cards with a bridge.

Another feature that’s almost invisible unless you’re told about it is the “Uber” and “Quiet” mode switch. In Uber mode, the card’s fan settings allow it to ramp up to very noticeable noise levels when stressed, whereas Quiet mode relaxes the fan curve to keep things quieter, with the caveat of the card running hotter. I’ll talk more about the temperature situation on this card later.

It’s not too often that a GPU vendor will release a new model that completely disrupts the position of another one in the same lineup, but AMD did just that when it released the R9 290. At $399 (SRP), this card costs $150 less than the 290X. The difference between them? The R9 290X has 10% more cores and a 53MHz higher core clock. In this respect, the R9 290 is a no-brainer… that’s a lot of savings for so little loss. Due to the same inflation the R9 290X has suffered, the R9 290 can be had for about $150 above SRP, with some models available for about $550.

| AMD Radeon Series | Cores | Core MHz | Memory | Mem MHz | Mem Bus | TDP |
|---|---|---|---|---|---|---|
| Radeon R9 290X | 2816 | 1000 | 4096MB | 5000 | 512-bit | 250W |
| Radeon R9 290 | 2560 | 947 | 4096MB | 5000 | 512-bit | 250W |
| Radeon R9 280X | 2048 | <1000 | 3072MB | 6000 | 384-bit | 250W |
| Radeon R9 270X | 1280 | <1050 | 2048MB | 5600 | 256-bit | 180W |
| Radeon R9 270 | 1280 | <925 | 2048MB | 5600 | 256-bit | 150W |
| Radeon R9 265 | 1024 | <925 | 2048MB | 5600 | 256-bit | 150W |
| Radeon R7 260X | 896 | <1100 | 2048MB | 6500 | 128-bit | 115W |
| Radeon R7 260 | 768 | <1000 | 1024MB | 6000 | 128-bit | 95W |
| Radeon R7 250X | 640 | <1000 | 1024MB | 4500 | 128-bit | 95W |
| Radeon R7 250 | 384 | <1050 | 1024MB | 4600 | 128-bit | 65W |
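A rough sense of how these parts compare on paper can be had by estimating peak single-precision throughput as cores × clock × 2 (one fused multiply-add per core per cycle). This is a back-of-the-envelope sketch built from the table’s numbers, not a measured result:

```python
# Peak FP32 throughput estimate: cores * core clock (MHz) * 2 ops per FMA.
def peak_gflops(cores, core_mhz):
    return cores * core_mhz * 2 / 1000  # convert M(FL)OPS to GFLOPS

r9_290x = peak_gflops(2816, 1000)  # 5632 GFLOPS
r9_290 = peak_gflops(2560, 947)    # ~4849 GFLOPS
print(f"R9 290 delivers {r9_290 / r9_290x:.0%} of the 290X's peak throughput")
```

That works out to roughly 86% of the 290X’s theoretical throughput for $150 less at SRP – the “no-brainer” math behind the R9 290’s appeal.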

Due to reference PCIe power limits, neither AMD nor NVIDIA will release a card with a stated TDP higher than 250W, and so that’s where both the R9 290X and GTX 780 Ti sit. However, it doesn’t take much imagination to realize that the true TDPs would be higher if these vendors were a bit more honest; it’s nonsensical that three cards with wildly varying specs share the exact same TDP (and, not to mention, don’t share the same power draw numbers when stressed).
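The 250W ceiling lines up with the PCIe power-delivery budget: the slot itself is specified for 75W, a 6-pin auxiliary connector adds 75W, and an 8-pin adds 150W. Both reference cards here use one 6-pin plus one 8-pin, so the arithmetic can be sketched as follows (connector figures are the PCIe spec limits):

```python
# PCIe power budget: the slot supplies 75 W; each auxiliary connector adds more.
CONNECTOR_W = {"6-pin": 75, "8-pin": 150}
SLOT_W = 75

def board_power_ceiling(aux_connectors):
    # Total spec-compliant power available to the board.
    return SLOT_W + sum(CONNECTOR_W[c] for c in aux_connectors)

# Both the R9 290X and GTX 780 Ti reference boards use 6-pin + 8-pin:
print(board_power_ceiling(["6-pin", "8-pin"]))  # 300 W of headroom for a 250 W TDP
```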

One thing that might be worth mentioning is that in both AMD’s and NVIDIA’s current lineups, only a handful of cards feature a new or improved architecture, although all include updated features (which are, in effect, added through the drivers). On AMD’s side, the R9 290 and R9 290X are the only cards based on the brand-new Hawaii architecture. For NVIDIA, the GTX 750 and 750 Ti are built on the new Maxwell architecture, while everything else below the GTX 770 is based on the previous generation of GK10X GPUs.

Speaking of NVIDIA, let’s see where the 780 Ti settles itself into the green team’s lineup:

| NVIDIA GeForce Series | Cores | Core MHz | Memory | Mem MHz | Mem Bus | TDP |
|---|---|---|---|---|---|---|
| GeForce GTX TITAN Black | 2880 | 889 | 6144MB | 7000 | 384-bit | 250W |
| GeForce GTX TITAN | 2688 | 837 | 6144MB | 6008 | 384-bit | 250W |
| GeForce GTX 780 Ti | 2880 | 875 | 3072MB | 7000 | 384-bit | 250W |
| GeForce GTX 780 | 2304 | 863 | 3072MB | 6008 | 384-bit | 250W |
| GeForce GTX 770 | 1536 | 1046 | 2048MB | 7010 | 256-bit | 230W |
| GeForce GTX 760 | 1152 | 980 | 2048MB | 6008 | 256-bit | 170W |
| GeForce GTX 750 Ti | 640 | 1020 | 2048MB | 5400 | 128-bit | 60W |
| GeForce GTX 750 | 512 | 1020 | 2048MB | 5000 | 128-bit | 55W |
| GeForce GTX 660 | 960 | 980 | 2048MB | 6000 | 192-bit | 140W |
| GeForce GTX 650 | 384 | 1058 | 1024MB | 5000 | 128-bit | 64W |

I mentioned above that the release of AMD’s R9 290 disrupted the overall appeal of the R9 290X, and NVIDIA’s lineup suffered the same sort of thing when the 780 Ti came along; after all, how tempting is a $1,000 GPU that gets outperformed by a $700 one? For most enthusiasts, a TITAN would have been hard to justify, but the model itself was of course still justified by those who needed such a massive framebuffer and / or improved double-precision performance. NVIDIA corrected this issue with the TITAN Black, which about matches the 780 Ti in gaming performance, and still offers the other perks that help set the card apart.

While the 700 series introduced a bunch of tweaks under the hood, none of them result in clear features that a gamer might want to take advantage of. Fortunately for NVIDIA, it’s offered a rich featureset on its cards for some time. The carried-over features here include support for CUDA (compute enhancements; apps need to support it explicitly), PhysX (advanced physics in games that support the API), SLI (multi-GPU), 3D Vision (3D gaming), Surround (multi-monitor), Adaptive VSync (vertical sync with vastly reduced tearing), and GPU Boost 2.0 (automatic boosts to the GPU clock when the card isn’t hitting its peak temperature), along with two technologies I’ve become a particular fan of: GameStream (the ability to stream a game from your PC to the SHIELD handheld) and ShadowPlay (game recording with virtually no performance hit).

Which Offers the Better Experience?

If I had to choose either the 780 Ti or 290X based on their featuresets alone, I’d have to tip my hat towards NVIDIA’s offering. Over the years, I’ve used both AMD’s and NVIDIA’s cards extensively, and I’ve come to find myself preferring NVIDIA’s featureset, and also its drivers. This is especially true on the multi-monitor front; I’ve found configuring multiple monitors on AMD’s cards to be an exercise in patience, whereas on NVIDIA, it’s an absolute breeze in comparison.

None of that is to say that NVIDIA’s multi-monitor implementation is perfect, as some oddities can still arise, but none of those oddities have matched the hassles I’ve experienced with AMD’s cards. In particular, while testing both the 290X and 780 Ti out in multi-monitor configurations, I experienced an issue with AMD where the driver refused to place my taskbar on the center display; instead, it kept it on the right one. This glitch required me to do a thorough uninstall of the driver.

There are other features NVIDIA offers that keep me tied to its cards, but some or all of them might not matter to you. In particular, I appreciate the ability on NVIDIA’s cards to configure games on a per-title basis via the driver (Note: I managed to overlook the fact that this can be done in AMD’s driver as well; thanks to Sean from the comments for pointing it out), something I couldn’t really live without at this point. Borderlands, for example, does not have a VSync option, which results in obvious tearing during gameplay. With NVIDIA’s Control Panel, I’m able to force VSync through the driver. Another example is my classic MMO Asheron’s Call. It offers no anti-aliasing options in-game, but with NVIDIA’s driver, I can force it.

AMD Radeon R9 290X and NVIDIA GeForce 780 Ti

Then there’s ShadowPlay, the technology that allows you to record gameplay with a minimal hit to performance. As I’ve mentioned above and elsewhere, this is a technology I’ve just about fallen in love with, because it works incredibly well and is very convenient. Because ShadowPlay can record continually, you’re able to save video to your hard drive even if you originally had no intention to (useful if something awesome happens and you’d like to relive the experience). This is a feature I’ve taken advantage of three times in the past week. While some ShadowPlay competitors also offer continual recording, their performance hit is generally very noticeable (Fraps, for example, locks the framerate).

I feel that ShadowPlay is such a good feature that AMD must have its own solution in the works; I think it kind of has to. With the unbelievable growth of online game streaming, gamers who partake in such activities are going to be attracted to the GPU that can vastly reduce the load on their CPU while recording gameplay. And don’t forget: ShadowPlay doesn’t lock the game’s framerate like Fraps does, so during gameplay, you’re not likely to realize it’s recording.
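Conceptually, the “record retroactively” trick boils down to a ring buffer: the recorder keeps only the last few minutes of frames, discarding the oldest as new ones arrive. A minimal sketch of that data structure in Python (an illustration of the idea, not NVIDIA’s actual implementation):

```python
from collections import deque

class ShadowBuffer:
    """Keep only the most recent frames so a clip can be saved after the fact."""
    def __init__(self, max_frames):
        self.frames = deque(maxlen=max_frames)  # oldest frames drop off automatically

    def capture(self, frame):
        self.frames.append(frame)

    def save_clip(self):
        # Hand the buffered window off (e.g. to an encoder) on demand.
        return list(self.frames)

buf = ShadowBuffer(max_frames=3)
for frame in ["f1", "f2", "f3", "f4", "f5"]:
    buf.capture(frame)
print(buf.save_clip())  # ['f3', 'f4', 'f5']
```

Capture runs continuously, but nothing is written to disk until the user asks; that is why a moment can be saved after it has already happened.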

I’ve harped quite a bit on NVIDIA’s pluses, so what about AMD’s? Admittedly, in terms of features that can be taken advantage of right now, there’s not a whole lot to talk about. Both AMD and NVIDIA offer multi-monitor and multi-GPU support, for example, and other things like 3D gaming, overclocking flexibility, and power-related technologies.

That being said, Mantle could become a literal game-changer down the road, though it’s not likely to benefit R9 290X owners much unless the PC the card’s installed in uses a very modest CPU, since Mantle’s biggest goal is easing the load on the CPU. TrueAudio is another piece of technology that some could come to prefer once games start supporting it.

Of course, we can’t ignore one of the reasons AMD’s cards are flying off the shelves lately: The GCN architecture proves far more efficient at cryptocurrency mining, and it performs better than NVIDIA at other similar tasks as well, such as with Folding@home. If either of those things are important to you, AMD can’t be beat (for now, at least).

Ultimately, chances are you could go with either AMD or NVIDIA and not notice missing features. Personally, I like NVIDIA’s featureset enough to keep one of its cards in my personal rig, but you might not see value in the same features I do. From a stability standpoint, while I’ve experienced more hassles with AMD’s cards and drivers (even recently), I wouldn’t expect the regular user to run into a roadblock. When you install a GPU every other day, you’re bound to run into more issues than normal.

With all of that said, let’s take a quick look at our test system configuration and methodology, and then get into the important stuff: Gaming performance.

  • Sean

    You can configure games on a per-title basis on AMD as well. It’s located under 3D Applications: just add the title you want to modify and select your customizations. You can add VSync, anisotropic filtering, multiple types of AA, tessellation, etc.

    • Rob Williams

      Cheers! Either I’ve managed to overlook that page all this time, or the ability to optimize on a per-title basis is a fairly recent addition. I knew about the ability to tweak on a global basis, but not per-app.

      • Sean

        Yeah, it’s been in there since before the 7970 launch, if I remember correctly. AMD just did a poor job of advertising it. While I do agree that Nvidia has the better driver suite, AMD has definitely come a long way. They’re still not quite as seamless an experience as Nvidia, but they’ve started to close the gap.

  • Casecutter

    Well, the only reason not to buy a 290X over Nvidia is resale down the road. Nobody will want your 290X if you/they can’t have assurance that it wasn’t in some 24/7 Litecoin mining rig.

    The only reason your interpretation has merit is the lop-sided price equation. It’s like imagining these cards had traditionally been built only for mining, and that Nvidia and AMD amassed their products purely for hashing: $700 for AMD or Nvidia, based on the amount of work per watt. Now these “gamers” come along and use them to play graphically intensive games; is there a 25% difference in the FPS Nvidia’s product produces, or are we judging it only from a mining / hashing perspective?
    If you could get a nice AIB-cooled version of a 290X for what would normally, by this time, be $520-530 (if all was right with the world), up against an obstinate Nvidia still holding to MSRP with hardly even a rebate, this would be a totally different article. Nvidia is just lucky that AMD got hit by mining; it doesn’t need to compete on price now (like it ever does). When it’s asking 25% more for what is, at best, not even a marginally more compelling offering in performance, why is Nvidia getting away with charging so much in the first place? Could, or would, Nvidia even compete with its some-30%-larger die? It’s sad that Nvidia gets to ride safely into the “Kepler sunset” without needing to compete price-wise against Hawaii, because if it had to, I don’t think it could’ve made money selling a full-fledged GK110 at anything close to $500.

    Hawaii as a chip, on its own merits (not its cooler or inflated price), is a more cost-effectively designed product for gaming; it’s just that mining, for which it’s also the superior product, wants it more.

    • shadus

      I don’t think Nvidia really cares as much about high-end consumers as everyone thinks they do. Sure, from a publicity standpoint they want the most powerful card, but I don’t think they want to be considered cost-effective in that product range. The 750 Ti seems more indicative of Nvidia’s overall strategy: a power-efficient chip offered at a reasonable price, aimed at the average consumer, plus you throw in GameStream, G-SYNC, and ShadowPlay as additional features. Hopefully coin mining doesn’t inflate its price either.

      • Rob Williams

        As strange as it might seem, the 750 Ti is one of the most impressive GPUs that has come out in a while, I feel. Sure, high-end cards are nice and all, but they’re just brute force. The 750 Ti on the other hand offers a ton of performance and doesn’t even require a power connector. That’s impressive.

    • Rob Williams

      “Nobody will want your 290X if you/they can’t have assurance that it wasn’t in some 24/7 Litecoin mining rig.”

      I couldn’t agree more, and I suppose that’s something I should have added in the article. I’d never purchase a GPU (or CPU, for that matter) if I knew it lived that kind of stressful life. While I’m sure AMD and NVIDIA would claim that these cards could be stressed as such if run at reference clocks, I don’t have as much faith.

      “Nvidia is just lucky that AMD got hit by mining”

      You said it.

      “Why is Nvidia getting away with charging so much in the first place?”

      Because it can, much like how Intel can get away with charging $1,000 for a CPU that could be performance-matched by overclocking a $300 one. In this particular match-up, I’d personally feel inclined to shell out the extra $150 for NVIDIA’s card; it’s not just because of the performance boost (which is admittedly minor), but a featureset I’ve come to appreciate a lot (ShadowPlay being the biggest one, but also things like GameStream). It’d be a hard premium to pay, but over time, I’d appreciate opting for the card that best suits me. That said, unless I had cash raining on me, I wouldn’t go that high-end to begin with. A more interesting match-up to me would be 780 vs. 290, but I don’t have the latter. And we’d still run into the same caveat list thanks to coin miners; the 290 would only be in that matchup because of the inflation. From a pure performance standpoint, AMD would win that round if SRP pricing had been kept intact.

      Thanks for the detailed comment!

      • Casecutter

        “Because it can”… They’re so lucky they “can” ride AMD’s coat-tails right now, because if this was a normal gaming market, we all realize an AMD 290X could easily be seeing $500-530 pricing. Nvidia and its AIBs would have a hard time reducing a 780 Ti much below its MSRP of $650.

        Hawaii was sized and meant to “go to war” with GK110, but this “mining craze” side-tracked AMD’s assault. AMD sized the die to counter Nvidia’s costly chip, while Nvidia thought AMD couldn’t achieve what it did with a die that size. It was hinted when Nvidia scrapped the original GK100 that the redesign was meant to increase the die size. They couldn’t control the heat even with the cooler they had designed; that’s what brought about the original idea of implementing Boost for it, and even then they couldn’t control it. When Nvidia no longer had a big chip, it brought up GK104 and implemented Boost, which got it to compete with (or best) Tahiti. That’s when Nvidia was spinning the whole “we expected more from Tahiti” line, and why the GTX 680 finally hit shelves some 5 months later. Both companies got caught with new architectures on a new process; even TSMC came up short first out of the gate. A bunch of things they all claimed would deliver ended up flopping and floundering.

        Hawaii is said to be smaller than the original GK100, so yes, it’s very hard to move that much thermal dissipation through that small a surface area. AMD really stretched the boundary of 28nm production versus cost, and in the process found it could barely contain the beast within the bounds of the “prerequisite reference rear-exhaust cooler” it had figured for it (it didn’t have the time, or want to spend the money). And consider that Hawaii went from a design concept AMD perhaps “didn’t really need” to something it had to build, once Nvidia actually found a market with Titan, in less than a year.

        Even slapping the Titan cooler onto a 290X would prove unwieldy, except for noise, where Nvidia has a lock with its specific and costly radial fan design that does provide lower noise. Even then, the Nvidia cooler wouldn’t work all that differently: the card would still run hot, and the fan would spin faster and be louder, though not like AMD’s.

        If this article had been two MSI Gaming versions, and the AMD had a price of $520 and the Nvidia had a price of $670 there would be a whole different perspective with 25% between them.

        • Rob Williams

          I can’t disagree on a single point, very well said.

          “If this article had been two MSI Gaming versions, and the AMD had a price of $520 and the Nvidia had a price of $670 there would be a whole different perspective with 25% between them.”

          That’s just it. It’d be a hard sell for NVIDIA at $150 more, but when the cards are priced equally, or even if NVIDIA’s card costs $50 more, the Ti becomes the more attractive of the two. Of course, if noise / heat were ruled out thanks to an R9 290 sporting a better cooler, then NVIDIA’s pluses are lessened (I still lean towards the NVIDIA side, though, because I like its featureset and drivers, but obviously not everyone cares about the same things I do).

          As it is, the price inflation AMD’s lineup is suffering is bad for basically everyone involved. It’s obviously a problem for gamers looking to purchase a card, and it’s also rough on AMD itself, given the points you mentioned. And for reviewers, it’s really hard to review a new card that we -know- costs way more than it’s actually meant to. AMD just released the R9 280, and I’ll be taking a look at one soon, but it’s going to be a hard card to sum up, knowing that its pricing might not be stable.

          I cannot wait for this mine craze to pass.

          • Casecutter

            “I cannot wait for this mine craze to pass.” I’ll second that…
            Remember, AMD did all gamers a favor, even if people complain the reference cooler was hot and loud. I’ll take it as healthy competition! If it wasn’t for Hawaii, you wouldn’t have an article to write, and Nvidia might be asking $1,200 for the 780 Ti… because they could!

      • Blockchains

        Just thought I’d pitch in, as it’s been a while since the article was written: I did buy an R9 290 for cheap that was used in mining, and my other two R9 290s were used for mining for 5 months straight. No problems here whatsoever. All cards are working absolutely great.

        That being said I’m glad people aren’t picking them up as quickly as they otherwise might because it means I can get more cards for ultra cheap. Going for my fourth one soon, it would seem.

        • Rob Williams

          Are you using the cards for mining mostly, or also for gaming? That’s some serious horsepower.

          • Blockchains

            Not much mining anymore, but I’m pushing triple 1440p (7680×1440) so I need as much horsepower as I can get, haha. Mining hasn’t been profitable for a while as ASICs have demolished the Scrypt GPU mining scene. Few people are still hanging on by mining X11 coins though.

            I’m doing gaming, game development, and some rendering with LuxRender (these cards are *incredibly* fast at rendering. It’s a shame there’s no proper OpenCL support in Blender Cycles, but this seems to be on AMD’s side.)

          • Rob Williams

            I run a single 1440p monitor but LOVE the idea of expanding that to 3 (room prevents it). How are you finding most games to handle that kind of massive resolution? Have you run smaller 3×1 setups before? I am just wondering if any games that run well at 5760×1080 might have an issue with a resolution with twice the number of pixels (I don’t mean performance, but technical issues).

            Your setup is genuinely drool-worthy. I love it when people go all-out, but it seems like you actually have reasons for it.

            Sorry for such a slow response, I didn’t see the notification until now.

          • Ross Bishop

            I’m using a 290X I got from eBay for £250. It had no specification listed, but I think it was in a mining rig. Came with the box; took it out; great nick. It was only 2 months old when I got it, and the seller had great feedback.

            Chucked it in my system with a £70 WC setup (Kraken G10/H75) and it works like a dream. Absolutely chuffed. Everything runs at max settings at 40FPS or above (avg) @ 2560×1440, with most titles hitting 60; it’s really just Crysis 3 that doesn’t.

            I cannot see why anyone would want a 780Ti when you can get this value for money. As you can see in this article, modern benchmarks prove these two cards have negligible performance differences between them, and yet there is a HUGE price disparity.

            We now have Game DVR, which seems fairly similar to ShadowPlay, so the Nvidia guys have… GameStream… and PhysX? Better cooling on reference and lower power consumption. The second point is null: it’s a high-end gaming rig, so I really don’t care about power, and cooling I fixed for a small price (you could go cheaper, but I wanted it to stay dual-slot).

          • Blockchains

            Yeah, it’s amazing to me that someone would buy a 780 Ti when you could just get two R9 290s for less at the moment, while getting near double the performance. The only issue is the cooler. Other than that, they’re fantastic cards.

          • Rob Williams

            TrueAudio is something we’re in the middle of thoroughly researching (an article is in the works). It really, REALLY needs developer support to pick up though.

          • Blockchains

            Yup. I’m hoping middleware developers will take a greater interest in it. Apparently AMD also put something very similar to a TrueAudio block in the PS4, which could make things interesting.


          • Rob Williams

            Great comments, thanks a ton. I agree with you for the most part; a lot has changed since this article was written back in March (I don’t think AMD’s video recorder existed at that time, and that’s a solution I need to test out soon). I am currently under a mountain of content, so I’d say I’d like to follow-up to this soon, but it’s unlikely to happen. It’s definitely something I’d like to do, though.

          • Vriff

            I’d love to see an aftermarket cooler 290x vs aftermarket cooler 780 ti.

          • Rob Williams

            Same. That might be a little hard at this point in time though, since rumors peg the GTX 800 series release at September. Soon enough people will be looking at THOSE cards. I hope AMD won’t take too long to follow-up as well.

          • Blockchains

            It’s basically what you’d get with 4k, but a bit slower, perhaps.
            I can *try* to match your benchmarks with my resolution and tests, but it seems very, very close to pure 4K. I believe it’s 10 MP vs 8 MP for triple 1440p vs 4K.
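For reference, the exact pixel counts can be worked out directly; triple 1440p actually lands a bit above the 10 MP estimate:

```python
# Compare total pixels: triple 1440p (3 x 2560x1440 side by side) vs single 4K UHD.
tri_1440 = 7680 * 1440   # 11,059,200 pixels (~11.1 MP)
uhd_4k = 3840 * 2160     # 8,294,400 pixels (~8.3 MP)
print(f"{tri_1440 / 1e6:.1f} MP vs {uhd_4k / 1e6:.1f} MP "
      f"({tri_1440 / uhd_4k:.2f}x the pixels of 4K)")
```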

            These benchmarks mimic essentially what I’ve been getting, but I only currently have two cards in the machine so I can’t give you 3 or 4-way results.


            Just wish they included the frame times, haha.

            Anyway, here’s some screenshots of what I’m getting in some games:

            Crysis 3: Getting ~25 fps with AA disabled @ tri-1440p and 2 cards:


            Splinter Cell Black List:


            These are with WSGS’ widescreen fixer program, which hacks some games to accept different FOVs / different aspect ratios.

            It can look fairly bad without proper FOV settings, in Far Cry 3, for example:


            The extra horizontal resolution is probably not as useful as this setup would be in portrait mode, but I don’t currently have a stand, and two of the three panels don’t match (2x U2711s and 1x U2713HM). This is also my first Eyefinity setup. Before this configuration, I was driving two 1440p displays and a 22″ 1680×1050 display with a single GTX 260 Core 216, haha. Bit of a step up. (Couldn’t even get a speedup in rendering from it over an old quad-core :/)

            For your setup, have you thought of having a display on either side in portrait? Also, AMD eyefinity can now handle displays of different resolutions, sizes, etc. Much more versatile with the latest beta drivers, from what I can tell.

          • Rob Williams

            “Crysis 3: Getting ~25 fps with AA disabled @ tri-1440p and 2 cards:”

            The sick thing is that even if you doubled the number of GPUs in your rig, and things scaled perfectly according to that, that’d STILL be less than 60 FPS – and here you have AA disabled! That’s some serious workload.

            Thanks a ton for those screenshots, looks quite similar to our resolution of 5760×1080 for the most part. Interestingly, none of the games I currently benchmark require any fixes – they all support multi-monitor perfectly. The only game I actually play that has issues is Borderlands 2 – but thankfully that’s only limited to the menus.

            As far as portrait goes, I admit I don’t really like it. I don’t like bezels to begin with, and portrait just makes things worse. To me, you have one of the best possible setups going – large monitors and a wide, huge resolution. I can’t imagine what’d be better than that. Performance is crap of course, or at least hard to make up for, but I’d rather deal with that than have larger 1080p monitors where the pixel widths are so large. The real problem is the $$$ involved in not just getting the monitors, but the GPU horsepower to make for a smooth experience.

          • Blockchains

            Oh yes. However, that is Crysis 3… Back in ’08 I was getting roughly the same performance, but on 1x 1080p display, and in Crysis 1 and with a GTX 260 core 216. Though, it’s definitely still a huge improvement. ;)

            BF4 numbers look a lot better, though.

            I’ve found a few games had somewhat weird FOV issues, and I only noticed they looked better once I used the widescreen fixer app. I know I had serious problems with Mirror’s Edge before I used it. There could have been other factors screwing with these settings as well, however.

            At one point I was *sort of* playing with the idea of adding another 2 x 1440p displays, but it just wouldn’t work for development purposes at all, and thus a negative. I already have trouble utilizing the third display, as photoshop, level editing software, and modeling software do *not* span across screens well, especially when you have huge bezels, haha :p

          • Rob Williams

            Crysis is a tank, that’s for sure. I forget if it was 1080p or 1440p, but I once tossed a couple of GPUs into the rig just so I could set the game to Very High and get playable framerates, and damn, it’s just gorgeous with those settings. As for BF4, that doesn’t surprise me. I could run BF4 at ‘best playable’ with a GTX 760 at 1440p, but Crysis absolutely required 1080p:


            5×1 is just extreme unless you’re into financial monitoring or something. And in games, you’d just be adding an even greater GPU requirement (if the games would even work right at all… I’m betting most wouldn’t – I did run BF BC2 at 5×1 once though and it was actually pretty good).

            You have a photo of your setup?

          • Blockchains

            Yeah and you end up with double the bezels, fewer acceptable use cases, and a brutal upgrade schedule if you want to play games at reasonable settings in the future. When I was more heavily into analyzing cryptocurrency markets having additional screens did help though, as you suggested. Having five displays could be helpful for that, but you’d have to split the vertical space in half to make reasonable use of it in portrait. The other issue is the IPS aspect. If you monitor financial data, it’s just better to get cheaper TN panels, haha. Anyway, at that point you’re seeing some seriously diminishing returns, though.

            Anyway, as for long term goals: At some point I might switch over to some huge 4k TV as a display, at least for certain applications. Other interests for future, insane fun is a 24/7 phase-change cooling system (thinking like 6-10 years in the future here, though, haha.). Since it’s so loud, I was thinking it may be possible to simply have it in a different room, and run the primary loop through the walls. I think this may develop into quite the hobby. :|

            Currently I’m moving the whole system into the basement, so it’s a bit too messy to take a picture, but I will in a few days. Even with just two cards, this thing is producing a stupid amount of heat at idle. Even in the basement it raises the temperature a fair bit, and can even get rather warm if I game for a few hours.

          • Rob Williams

            This tardiness is just embarrassing.

            I agree with you on everything, and love your forward-thinking. I’m not as stoked about 4K as some are though… it’s just more pixels that the GPU has to churn. I favor your 3×1 1440p way, way over a single 4K.

          • Blockchains

            Here are the pics:

            (Currently using an old dining room table, haha. My usual desk is actually a bit bigger. :/)

            Case is also a bit dusty. Can’t wait to change it out for something better though. (Was thinking an Obsidian 900D, but a friend of mine is recommending the Cooler Master Stacker… Hard choice.)


          • Rob Williams

            Man, that setup is killer! It kind of looks like you’re a pure headphone user like me? As for dust in the chassis, don’t feel so bad… mine is a lot worse (though to be fair, I can’t even see inside of it from where it sits next to the desk ;-)).

            Your monitors look a tad low; are you a shorter person in general, or do you not mind looking down at them a little bit? I recently got a new mount for my 3×1, and at first I had the displays way too low and it drove me nuts. Had to fix that right away, haha.

            Thanks a ton for the pictures. I wish Disqus let me “save” comments in case I ever felt like doing a “reader PCs” post down the road or something. Glad to meet someone else that’s truly of the Master Race ;-)

          • Blockchains

            I’ve had the Sennheisers for a few years now (HD 598), and they’re by far the best headphones I’ve ever had, haha. That being said I wouldn’t mind beefing up my sound system. Right now I only use it if I get tired of wearing the headphones. Just one more thing on my rather long list of upgrades that I’ll hopefully get to eventually.

            I’m actually about 6′ 1″, but for some reason I found that having the screens higher was actually putting strain on my neck, so I lowered them.

            Anyway, I added you on Steam (different nick). Perhaps we could play a game sometime, haha.

          • Rob Williams

            I’ve always been more of a headphone guy than a speaker guy, though a large part of that might be the fact that I’ve always been an apartment-dweller, and bass tends to penetrate walls. I do love the ‘intimacy’ of the sound with headphones, though.

            I wasn’t sure who that was that added me, but seeing as they were a part of the TG group I figured it was a safe accept ;-)

  • nobodyspecial

    A whole article without mentioning G-Sync? Did I miss it? I couldn’t care less about GameStream (spam the world with my gaming vids? Who cares), but G-Sync has the power to change all my games and make my card live longer, as it slows down or I ratchet up settings. I won’t be buying a monitor next time without G-Sync inside. I’d pass until they get IPS etc. out with it, but it works now, whereas FreeSync is currently vaporware.

    I also have my doubts as to AMD ever intending a $400/$550 price on the 290/290X. I think that was just fake, since I’ve never seen either for sale anywhere near that, even at release. They were overpriced from day one, which makes me think they NEVER really were $400/$550. The fact that AMD didn’t make any money on GPUs other than 10mil x $12 (120mil) on consoles means there is a shortage that isn’t due to MINING; it is a shortage due to a lack of production that can hit that magical 1GHz (hence PNY saying they can’t get chips at all, and AMD’s reference retail cards not doing 1GHz either, as tomshardware and others have shown). It also explains the fan RPMs getting upped right at launch and AMD having to ask people to re-run the benchmarks. All of it points to an inability to get 1GHz chips out the door reliably. Maybe I’m being harsh here and they did really intend to sell at $400/$550, but at best they changed to higher pricing to help offset the cherry-picking they must be doing just to get SOME chips that do 1GHz. They probably also had to pay companies like Asus to swap fans (as tomshardware said, the upped RPMs put Asus’ fan OUT of spec) at the last minute, or perhaps even after the cards were boxed up. We didn’t find out they couldn’t do 1GHz until sites like Tom’s tested retail cards and found they dropped to ~750MHz.

    Clearly AMD had some costs here, or they’d be making more than 120mil, which is covered by consoles alone; no GPU profits. AMD says low-double-digit margins on console chips (and they moved 10mil during the quarter: 8mil sold at retail and 2mil in transit or at MS’/Sony’s doorsteps getting boxed up, roughly speaking), so if cards are selling non-stop, where are the profits? Why would AMD price cards at breakeven? Clearly the botched launch cost them money or something, right? They also gained no share, so again: shortage. I see no data showing they are selling droves of these things. Their quarterly report doesn’t lie, and they made NOTHING on GPUs aside from consoles. I don’t believe there is a mining craze buying these all up, or profits would be UP too. If you’re selling out, YOU MAKE MONEY, unless you are NOT making many at all due to heat, noise, fan re-tooling, etc. Nothing else makes sense.

    One last point: there wasn’t much on Mantle, or how it’s dead once GDC 2014 hits in 10 days. OpenGL already has 10x savings on calls, just like Mantle, and DX has it coming soon. Mantle, which runs on very few cards (1% of the market?), will never be used by more than a few games AMD pays for (BF4, etc.). They can’t afford to dislodge DX/OpenGL, which both have ~15 years of entrenchment in the market and already work with every engine. AMD should have spent all Mantle resources on GPUs/CPUs/drivers/FreeSync. Instead we get hot/noisy GPUs, the CPU race lost, drivers in phase 3, and FreeSync not even a marketable product yet (AMD’s words) vs. G-Sync already here. I could go on, but you should get the point. Never mind what happens when you OC both cards, where NV leaves AMD even further behind. If AMD ever gets the 290X down to $550, I’d bet dollars to donuts NV just cuts the 780 Ti to $600 the second it happens.

    Not sure how AMD will battle in the ARM arena either (no LTE modem? How do you get into phones without one? Nothing but a Seattle server chip this year; how long until an ARM phone/tablet SoC comes?). I wouldn’t own their stock. A quick look at the company history shows that in the last 10 years they’ve lost $6 billion+, and over its life the company has never made a dime (billions in losses overall). They wouldn’t be in this position if it weren’t for the overpaid price of ATI (3x what they should have paid); their debt is directly attributable to the ATI purchase. They shouldn’t have spent any R&D on consoles either, as consoles are dying (see GDC 2013, Wii U sales, Vita, 3DS, etc., all down going forward). GDC 2014 will look worse for consoles than 2013 did as mobile takes over (60% of developers making mobile games, 40% PC, less than 15% for ALL consoles, and Xbox One/PS4 drew less interest than Xbox 360/PS3!). We need AMD to get bought before they fall so far behind in everything that they are NOT worth buying at all.

    • Rob Williams

      I meant to respond to this long ago. First off, fantastic comment, thanks for the detail.

      I messed up by not mentioning G-Sync, I admit, and I’m really unsure as to why it slipped my mind (it’s something I’m looking forward to, as well). I am not entirely sure it’s going to become a “must-have” for everyone, but it’s definitely going to have its fans. I’m hoping to get a G-Sync monitor in sooner rather than later to take it for a much longer spin (I’ve only seen it in action at an NVIDIA event).

      As for 290/290X pricing, I admit that I’m not sure -what- it was at launch, because by the time it finally hit e-tail, I forgot to look. I do believe that pricing is intended though, because AMD knew that they’d sell extremely well (and I guess they have regardless… just not to their target audience).

      “I don’t believe there is a mining craze buying these all up, or profits would be UP too.”

      That’s true, but the problem could stem from e-tailers charging out the ass for each card. Gamers who want to buy a card for gaming have limited options. As a regular consumer, I’d end up going NVIDIA even if I were an AMD fan, because I -hate- getting gouged. I’d go with whatever the better value is. In this respect, I don’t think e-tailers are doing AMD much of a favor.

      “One last point, there wasn’t much on mantle, or how it’s dead once GDC 2014 hits in 10 days.”

      I mostly wanted to wait and see what GDC and perhaps GTC would bring. That said, at this point I’m with you. The same applies to where AMD should have put its focus… I couldn’t agree more. I feel even more strongly about that regarding TrueAudio than I do about Mantle, though I think Mantle was a potential misstep too. Who buys a graphics card for audio? You’d have to be a huge audio nerd to have that sway your decision.

      Again, very good insight, thanks a ton for the comment. I am sure April 1st will bring many AMD acquisition stories…

      • victor

        I am surprised that you became TG staff when you cannot stand in the middle without taking sides, like you have done with NVIDIA.

        • Rob Williams

          That’s quite an ignorant statement. What did I say that was inaccurate? I’ve been reviewing graphics cards for over 8 years… I’ve dealt with at least 50 of them. I think I have the right to an opinion.

          • victor

            You have the right to give an opinion, no doubt, but that opinion should be an honest one. Is the statement below accurate? How could a staff member comment like that:

            “As for 290/290X pricing, I admit that I’m not sure -what- it was at launch, because by the time it finally hit e-tail, I forgot to look. I do believe that pricing is intended though, because AMD knew that they’d sell extremely well (and I guess they have regardless… just not to their target audience).”

            Where is your data to back it up? Also, what about the GTX 780 and GTX TITAN, launched at $700 and $1,000 respectively? Was NVIDIA not playing monopoly there?

            Your whole comment is inaccurate and shows an NVIDIA fanboy. Also, G-Sync is not nuclear technology; it is easy for any other company to get hold of it or use it. How did you come to the conclusion that Mantle and TrueAudio are missteps when they are at the starting stage?

          • Rob Williams

            “Where is your data to back it up.”

            You should have read through the post I was responding to. nobodyspecial said that they have doubts about AMD’s intent to sell the cards at SRP, and I replied that I -did- think the SRPs were intended. The next statement I made should have clued you in; I am not going to claim that AMD would think over-inflated pricing would make the cards “sell extremely well”.

            “Also G-sync is not nuclear technology. It is easy for any other company to get a hold or use it.”

            AMD could produce the same sort of solution, absolutely. But the same can be said about Mantle, since DirectX 12 is supposed to take care of those low-level API needs. That means NVIDIA cards and AMD cards alike would both gain the same performance improvements, and developers would not need to target just one GPU architecture.

            As for TrueAudio, that’s not a feature that interests me. I never cared about EAX, either. I think AMD is dividing its focus too much; gamers don’t choose a GPU because of some audio enhancement. I commend the idea, but AMD doesn’t have as many resources to throw around like NVIDIA does, so I think it should be choosier about the areas it focuses on. Instead of audio, I think AMD should have created a ShadowPlay competitor, because that’s a feature I -do- see people choosing a GPU for (it’s not just for streamers… it’s also for those like me who like to record game video and archive it).

            That all aside, if you read my conclusion, I state that if AMD’s pricing were kept at SRP levels like it should be, the entire conclusion would dramatically change. My job as a reviewer is to be honest, like you say, and I called it like I saw it in this article. I even went so far as to state that it was kind of unfair for me to compare these two cards due to the inflated pricing.

            I am really not sure where you’re sniffing my fanboyism from.

            Go read my 750 Ti review where I show that AMD’s current solutions are favorable compared to that card when performance matters. Or the R7 260 review where I rave about AMD’s card that can deliver 1080p gaming at such a good price-point.

          • victor

            I agree. Thanks for clarifying.

          • Nuruddin Peters

            Great rebuttal! You guys are all very informative!

      • Nuruddin Peters

        This is very enlightening… Thanks for the read!

        I’m a proud owner of NVIDIA’s GTX 780 Ti ACX, while also having been a huge fan of AMD’s CPUs for over a decade now. (I run the FX-8350 overclocked to 4.4GHz.)

        I’m always curious as to what the hell AMD is doing…

        • Rob Williams

          It’s good to see dedicated AMD CPU fans nowadays! And that’s a gorgeous GPU. I love EVGA’s ACX coolers. Thanks for the nice comments!

    • victor

      What is your problem with AMD? You are clearly an NVIDIA fan, a cat with its eyes closed. Prices are already down to $549. Prices were jacked up due to huge demand and the Chinese industrial holiday; AMD has nothing to do with it. Don’t feed junk to people. NVIDIA is always overpriced. Yes, AMD’s reference cooler is not good, and it is because of the reference cooler that the cards cannot hold the standard clock, but custom cards solved all of the problems mentioned in the article. For $579 I can get an excellent custom R9 290X. Also, for an idiot like you, NVIDIA is the only option.

  • victor

    This isn’t the best way to compare the R9 290X. The comparison should be between custom cards, something like the ASUS R9 290X DC2 and the ASUS 780 Ti DC2. We all know that AMD’s reference cooler is bad; there is no point in comparing it. If you had done that, the R9 290X would have got the nod. Also, the ASUS R9 290X DC2 is selling for $599 on Newegg. Except for Sapphire, all the cards have gone down in price; I don’t know where you got this price tag. A PowerColor custom card is selling for $579, equipped with 4GB of RAM and performance almost equal to the 780 Ti. Heat, noise, and power issues are nullified with the custom cards. Don’t use a reference card when comparing an AMD GPU next time; it does not show the real value.

    • Rob Williams

      Do you mean don’t show AMD cards with reference coolers when it doesn’t work out in AMD’s favor? I consider showing a versus between two cards with reference coolers to be completely fair when they’re available to the public that way (not to mention, that’s how they were sent to us by AMD/NVIDIA). If I were to test an aftermarket cooler on the AMD card, I would have done so on the NVIDIA card as well. Alas, all I have are the cards with their reference coolers.

      I wrote this article at the start of March, and at the start of March, the least-expensive 290X I could find was nowhere near $579.

      • victor

        I just said that using an aftermarket cooler on the R9 290X almost nullifies all of the issues you mentioned in this article. I have never seen a GPU do so much better with a custom cooler. In this particular comparison, using a custom card would have shown us the real value.

        • Rob Williams

          I agree 100%, but as I said, cards with reference coolers ARE being sold for both vendors, so I still consider this fair. It’s not as though I went out and picked up a card with an unfavorable cooler… this is the exact card AMD sent me. This entire argument wouldn’t exist if the 290X had a cooler to match the one on the higher-end 700 series.

          • victor

            True, AMD should have packed a better cooler with the 290X.

  • elesi168

    I don’t know much about computers, but I am about to buy a new laptop, and the GPU I am debating between is the GTX 870M or the R9 M290X. Based on your review, I can see that the R9 is a good card. I was wondering: is your R9 290X exactly the same as an R9 M290X in a laptop? And if so, will it be better than an 870M? Thank you.

    • Rob Williams

      Desktop GPUs cannot be compared to mobile ones; as a general rule, a mobile card that shares a name with a desktop part will be half as fast (or less). Nonetheless, the AMD card you chose should be faster than the NVIDIA one. The M290X would be best compared to the 880M.