Sapphire Radeon R7 260X OC 2GB Graphics Card Review

Sapphire Radeon R7 260X OC
by Rob Williams on February 17, 2014 in Graphics & Displays

Both AMD and NVIDIA are in the midst of launching new budget GPUs, but neither of them affects the position of the Radeon R7 260X – a great thing, as I have Sapphire’s take on the model to tackle. With a boost of 50MHz to the core clock and an improved cooler, should Sapphire’s R7 260X OC be on your sub-$150 shortlist?


In December, I took AMD’s $99 Radeon R7 260 for a spin. Overall, I was left impressed with the 1080p gaming it could deliver, and even went as far as calling it a “console-killer”. Little did I realize at the time, though, that AMD wasn’t targeting that particular card at the North American market. In fact, it hasn’t been available on these shores at all since its launch, as far as I can tell. Gee, thanks AMD.

Given the lack of the R7 260 in North America, the cheapest model in AMD’s latest Radeon series is the R7 260X. Or, at least it was up until last week. You see, Sapphire sent me its take on the 260X about a month ago, and while I hoped to get the review up a couple of weeks ago, it couldn’t happen. So of course, AMD decided to release not one, but two new budget Radeons since then. Fortunately for me though, neither of those models affects this one; instead, it just makes the purchasing decision at that price-point a little more interesting.

Sapphire Radeon R7 260X - Overview

As the table below highlights, AMD offers five Radeon models at $150 or under. As we established above, the R7 260 is non-existent on these shores, and because the R7 250 is too much of a weak link, let’s just ignore it. That leaves us with the R7 250X and R7 260X for those looking to spend $99~$120.

Well, that’s assuming market prices reflected the SRPs AMD has given out. As of the time of writing, I found a couple of R7 250Xs in stock, but all were at least $10 above the SRP, and most were $20 above. Likewise, while the R7 260X was meant to drop to $119 (from $139) given the 250X’s introduction, that hasn’t happened yet. Inflated pricing has plagued AMD’s entire product-line over the past couple of months thanks to digital coin miners, but it’s hard to imagine that’s what’s affecting the low-end segment as well.

AMD Radeon | Cores | Core MHz | Memory | Mem MHz | Mem Bus | TDP  | Price
R9 290X    | 2816  | 1000     | 4096MB | 5000    | 512-bit | 250W | $549
R9 290     | 2560  | 947      | 4096MB | 5000    | 512-bit | 250W | $399
R9 280X    | 2048  | <1000    | 3072MB | 6000    | 384-bit | 250W | $299
R9 270X    | 1280  | <1050    | 2048MB | 5600    | 256-bit | 180W | $199
R9 270     | 1280  | <925     | 2048MB | 5600    | 256-bit | 150W | $179
R9 265     | 1024  | <925     | 2048MB | 5600    | 256-bit | 150W | $149
R7 260X    | 896   | <1100    | 2048MB | 6500    | 128-bit | 115W | $119
R7 260     | 768   | <1000    | 1024MB | 6000    | 128-bit | 95W  | $109
R7 250X    | 640   | <1000    | 1024MB | 4500    | 128-bit | 95W  | $99
R7 250     | 384   | <1050    | 1024MB | 4600    | 128-bit | 65W  | $89

For simplicity’s sake, let’s keep the SRPs in mind for a moment. At $99, there’s the R7 250X. Add $10, and it adds 128 cores to give us the R7 260. Add another $10, and it adds another 128 cores to give us the 260X. It’s quite an interesting ladder we have going on there; I kind of wish all product-lines were so linear. It’s worth noting though that the 260X insists on 2GB being standard, and of the three models, it also has the fastest GDDR5 speed.

As I established above, the R7 260 is a great card for 1080p gaming, so given the boosts to GDDR5 speed and core count, what we should see is similar graphics settings allowances, but overall improved smoothness – the minor jump in cores just isn’t going to be a game-changer (no pun intended).

What Sapphire does to set this particular R7 260X apart is add 50MHz to the GPU clock (for 1150MHz), and a mere 25MHz to the memory (for 6600MHz effective). These are minor enhancements, but most people aren’t going to complain about a free performance boost. As I’ll be comparing this card against the reference 260X, we’ll see just how far that 50MHz goes.

Sapphire’s take on the R7 260X is an attractive one, with a more advanced cooler versus the reference model (which, not surprisingly, does result in a temperature drop, as we’ll see later).

Sapphire Radeon R7 260X - Video Connectors

Sapphire offers a couple of different R7 260Xs, but it’s this “OC” model that includes dual DVI ports – something that could prove important to those planning on running a 3×1 monitor configuration. In addition to the DVIs, Sapphire offers DisplayPort and HDMI.

Taking a look at the top of the card, a single PCIe power connector is revealed, along with a CrossFire bridge nearer to the back. We also get a little glimpse at the cooler, which makes use of dual heatpipes.

Sapphire Radeon R7 260X - Power and CrossFire Connectors

Overall, the cooler design is pretty simple, as further evidenced in the shot below. It bundles two heatpipes and a bunch of fins and pushes warm air out the back with a quiet fan. The end of the card is a little interesting, as it’s hollowed-out, and the PCB runs right to the end. When’s the last time you saw a graphics card with so much bare PCB space?

Sapphire Radeon R7 260X - Cooler Intake

Not pictured, Sapphire includes a software CD in the package, along with some minor paper materials, a DVI-to-VGA adapter, and a 4-pin (Molex) to 6-pin PCIe power cable. There’s nothing too exotic here, but we shouldn’t expect anything else at a <$140 price-point.

With that all taken care of, let’s move on to a look at our testing methodology, and then jump right into testing.

  • xOptix78

    So how would this card compare to a 7850? Yes, I’m thinking of upgrading if the processing power is there, since I’d like to step up to about a 24″ 1080p-capable monitor.

    • Rob Williams

      Both cards are about even. You’d need to go with at least an R7 270X to have it feel like an upgrade. Unfortunately though, those cards like many others of AMD’s at the moment are priced outside of reason. As sad as it is, any AMD fan wanting to upgrade at the moment is best to go with NVIDIA. Damned miners…

      • xOptix78

        Have you had a chance to get down and dirty with a GTX 750 Ti yet? Or have the samples even gone out yet?

        $150 GPU has me wondering about performance.

        • Rob Williams

          It’s in the test rig, just running behind. Hoping to have a review up by Thursday.

  • The Focus Elf

    Man, I still can’t even get RAGE to play with any decency on my 7970 without artifacts and tearing… =/ This card shows much promise!

    • Rob Williams

      Artifacts? Is the card dying?

      • The Focus Elf

        I hope not, I just think it is RAGE still being a crappy PC port. I don’t have any issues like that with Borderlands II or any of the others that I play… Bottom line though, I think I am leaving ATI on my next build.

        • Rob Williams

          Ahh, that’s too bad about Rage. I feel like I’m going to pick it up on some future Steam sale just because… it’s still an id game.

          As much as I hate to admit it, I’ve had enough issues with AMD cards over the years that I just couldn’t imagine moving off of NVIDIA. But admittedly, I deal with many more cards than the norm, and in much more aggressive ways (benchmarking). I’m bound to run into more than the usual number of issues. But in the end, AMD has given me many more hassles than NVIDIA, so it is what it is.

          • Paul Robertson

            RAGE plays fine on my 7950. Give it a try, it’s cheap, and Carmack made it.

          • Rob Williams

            I’ll definitely be giving it a go at some point. It goes on sale a lot, and you’re right… it’s a Carmack game, so it’s kind of hard to resist, haha.

      • The Focus Elf

        Almost entirely artifacts and stutter.

  • Brian Blair

    I have heard a lot about AMD cards, even the new ones, that for no reason artifact in some games or all games. This has me a little concerned, since I want to get the R7 260X 2GB to replace my 1GB 650 Ti. It is selling at a decent price right now, $139. I had a 5850 that started artifacting in my rig after installing Windows 8, but with Windows 7 and an older driver it worked fine. Kinda has me scared it has something to do with the newer drivers being mixed with certain computer hardware.

    • Rob Williams

      This is the first I’ve heard of artifacting Radeons. That’s not an issue I dealt with across any of the latest Radeons, and I am going to have to jump to conclusions and say that it’d be a very rare thing to experience. Sometimes cards can just show up borked, but that’d again be rare. It could happen on the NVIDIA side just as easily.

  • abrbabr

    what about noise level?