NVIDIA GeForce GTX 750 Ti Review: 1080p Gaming without a Power Connector
by Rob Williams on February 24, 2014 in NVIDIA-Based GPU

It’s often hard to get excited about a new $149 graphics card, but NVIDIA’s GeForce GTX 750 Ti becomes one of the rare exceptions. For starters, it doesn’t require a power connector, and it has half the TDP of its nearest competitor – all while still promising performance improvements. What more can be said? Read on!

Best Playable: 1080p Single Display

For about as long as GPU-accelerated games have existed, the ideal performance target has been 60 frames per second. That owes to the standard 60Hz monitor, which delivers its best result when the game’s framerate matches its refresh rate. To keep the two aligned and avoid visible tearing, VSync should be enabled.
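
To put rough numbers on that relationship, here’s a minimal sketch of the math; it assumes simple double-buffered VSync, where a frame that misses the refresh deadline waits for the next one (real games often triple-buffer, so treat this purely as an illustration):

# Why 60 FPS is the magic number on a 60Hz panel: with double-buffered
# VSync, a finished frame waits for the next refresh, so the displayed
# rate quantizes to 60/n FPS. (Simplified model for illustration.)
refresh_hz = 60
budget_ms = 1000 / refresh_hz  # ~16.7ms to render each frame

for render_ms in (10, 16, 17, 25, 34):
    intervals = -(-render_ms // budget_ms)  # refreshes consumed (ceiling)
    print(f"{render_ms}ms render -> {refresh_hz / intervals:.0f} FPS displayed")
# 10ms and 16ms hold 60 FPS; 17ms and 25ms drop to 30; 34ms drops to 20.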

While I believe our Best Playable results will appeal to any gamer, they could prove especially useful to those intrigued by living-room gaming or console replacements. The goal here is simple: with each game, the graphics settings are tweaked to deliver the best possible detail while keeping us as close to a 60 FPS average as possible.

Because our Metro Last Light and Total War: SHOGUN 2 tests are timedemos, and because this kind of testing is time-consuming, I am sticking with six of the eight games from our regular suite here.

Our regular benchmark tests showed that the R7 260X and 750 Ti perform about the same, with NVIDIA getting a slight edge. The differences are so minor that the Best Playable settings for each game have been kept identical for both cards – giving us both Best Playable and apples-to-apples results.

Assassin’s Creed IV: Black Flag (1920×1080, FPS)

                            Minimum   Average
NVIDIA GeForce GTX 750 Ti      46        56
AMD Radeon R7 260X             51        60

Settings (both cards): Environment: High; Shadow: Normal; Texture: High; Reflection: Normal; Anti-aliasing: FXAA; God Rays: Off; Ambient Occlusion: Off; Volumetric Fog: On; Motion Blur: On

With the standard settings I use for AC IV, NVIDIA’s card came out a little ahead of AMD’s. But something odd happened when I tested the cards with the Best Playable settings: the roles reversed. I can’t summon the logic to explain why, but multiple runs proved these results to be consistent.

Battlefield 4 (1920×1080, FPS)

                            Minimum   Average
NVIDIA GeForce GTX 750 Ti      51        68
AMD Radeon R7 260X             49        62

Settings (both cards): Texture Quality: High; Texture Filtering: High; Lighting: High; Effects: High; Post Processing: High; Mesh: High; Terrain: High; Terrain Decoration: High; Anti-aliasing Deferred: Off; Anti-aliasing Post: Off; Ambient Occlusion: Off

NVIDIA’s 750 Ti somehow lagged behind the R7 260X in AC IV, but it manages the opposite with BF 4.

Crysis 3 (1920×1080, FPS)

                            Minimum   Average
NVIDIA GeForce GTX 750 Ti      36        55
AMD Radeon R7 260X             36        54

Settings (both cards): Anti-aliasing: Off; Texture: Medium; Effects: Medium; Object: Medium; Particles: Medium; Post Processing: Medium; Shading: Medium; Shadows: Low; Water: Low; Anisotropic Filtering: 16x; Motion Blur: Medium; Lens Flares: Yes

AC IV and BF 4 saw the tested cards swap places, but with Crysis 3, the results are what I’d consider identical.

GRID 2 (1920×1080, FPS)

                            Minimum   Average
NVIDIA GeForce GTX 750 Ti      49        59
AMD Radeon R7 260X             50        58

Settings (both cards): Multisampling: 4x MSAA; Night Lighting: High; Shadows: Ultra; Advanced Fog: On; Particles: Ultra; Crowd: Ultra; Cloth: High; Ambient Occlusion: Low; Soft Ambient Occlusion: Off; Ground Cover: High; Vehicle Details: High; Trees: Ultra; Objects: Ultra; Vehicle Reflections: Ultra; Water: High; Post Process: High; Skidmarks: On; Advanced Lighting: On; Global Illumination: Off; Anisotropic Filtering: Ultra

Once again, we see a difference of a mere 1 FPS. I don’t want to think of the caffeine kick required to notice that difference in the real world.

Sleeping Dogs (1920×1080, FPS)

                            Minimum   Average
NVIDIA GeForce GTX 750 Ti      62        75
AMD Radeon R7 260X             62        71

Settings (both cards): Anti-aliasing: Normal; High-res Textures: On; Shadow Resolution: High; Shadow Filtering: High; Ambient Occlusion: High; Motion Blur: High; World Density: Extreme

Sleeping Dogs is AMD’s game, but the latest optimizations in NVIDIA’s GeForce drivers helped give the 750 Ti a lead.

Tom Clancy’s Splinter Cell: Blacklist (1920×1080, FPS)

                            Minimum   Average
NVIDIA GeForce GTX 750 Ti      60        72
AMD Radeon R7 260X             51        77

Settings (both cards): Texture Detail: Medium; Shadow: Medium; Parallax: On; Tessellation: Off; Texture Filtering: 16x; Ambient Occlusion: Field AO; Anti-aliasing: Off

It looks like we couldn’t wrap up the Best Playable results without some oddball findings. For whatever reason, AMD’s card performed better on average, but NVIDIA’s performed better at the minimum. Regardless of how these results are interpreted, though, one thing’s for sure: both cards average above 60 FPS, which is important.


  • Casecutter

    Rob, very nice write-up. Nice to see the “playable settings” approach; it’s something I’d like to see more of at this level of card. Also, I’ve never seen a review state the “vendor favoritism” – which group (AMD/NVIDIA) assisted in backing the release… kudos! One thing: I know the reasons you work from the i7 (and OC’d), but I believe it should be stated that those max-settings results, especially the minimum frames from either card, would in actuality take a notable hit, and in some cases not offer playable results. Most buying this level of card would be working from an i5-4440 at best, while plenty have older i3 and Phenom II X4 setups, like the 945 Deneb at 3.0GHz.

    As to power, I’m surprised that the delta under load wasn’t more; it’s really only a 10-12% difference. It’s good, but IDK – considering the R7 260X gives you ZeroCore, and in today’s world most people “sleep” their computer, that 3-5W drop adds up over days… more than the 10% saved when gaming a few times a week.

    Another is the fact that the basic versions that held the $150 price point have evaporated (some say “sold out”, but either way I’d see them as rare birds now), and all that’s out there are the AIB customs with, as you put it, the “beefier-looking coolers”. The problem with that is Newegg is pricing them at $170-180 now. Sure, perhaps that’s the newness factor; let’s hope the price is tempered a little over the next few weeks. Heck, with NVIDIA working from a 7% smaller die, it should be able to be more value-oriented than the R7 260X, which today is like $120-130, with even one at $110 after a $20 rebate. That 40% difference pays for a lot of electricity.

    Here’s my thinking: you’d be better off dumping the old and most likely inefficient 300W (or less) PSU for something like the Corsair CX430M 80+ Bronze modular Active PFC PSU, which is $30 ($20 after rebate). Then, if it’s really entry-level gaming, like for a young teen, a 260X is acceptable; if you more often want higher settings or some AA, see about an R7 265 or a good deal on a GTX 660. Spending $170 and compromising on power just to avoid buying a PSU is throwing good money after bad. This 750 Ti is most sensible for building an HTPC, but it falls a little short on price for any gaming machine/upgrade.

    • Rob Williams

      Thanks for the detailed comment, once again!

      “kudos”

      You deserve the kudos for actually noticing :-)

      “I know the reasons you work from the i7 (and OC’d)”

      I agree. I’ll add a note to the page soon about that, and keep that mention in future content. I’ve been questioned about the decision to use high-end gear like that, but at the end of the day, the goal is to remove all bottlenecks (as you’re clearly aware). I actually think we’re reaching a time where the CPU can be more of a bottleneck to a game than some people give it credit for, so that sounds like the premise for an article down the road.

      “As to power, I’m surprised that the delta under load wasn’t more”

      You’re not alone; basic logic would suggest that with the 260X being a 115W TDP card, and the 750 Ti a 60W one, we’d see more than a 31W delta, but not so. The reason could be that the reported TDPs are inaccurate, or there’s simply something else at play. Admittedly, I report the maximum value spotted during testing (twice over to verify), so we might very well see larger deltas if I were to record the wattage-over-time from a real-world game, and not a benchmark. Of course this would be in a perfect world; in my world I have a Kill-a-Watt.
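
      For a rough sense of scale, here’s a back-of-the-envelope sketch of what that delta (and the ZeroCore angle) amounts to on a power bill; the gaming hours, sleep-state saving, and electricity rate are all assumptions for illustration:

      # Rough yearly cost of the measured 31W load delta, plus an assumed
      # ZeroCore-style saving while the PC sleeps. All inputs except the
      # 31W figure are assumptions, not measurements.
      load_delta_w = 31        # peak load delta measured between the two cards
      sleep_delta_w = 4        # assumed ZeroCore-style saving during sleep
      gaming_h_per_week = 10   # assumed gaming time
      sleep_h_per_day = 16     # assumed time the machine spends asleep
      rate_usd_kwh = 0.12      # assumed electricity rate (USD/kWh)

      gaming_kwh = load_delta_w / 1000 * gaming_h_per_week * 52
      sleep_kwh = sleep_delta_w / 1000 * sleep_h_per_day * 365

      for label, kwh in (("load delta", gaming_kwh), ("sleep delta", sleep_kwh)):
          print(f"{label}: {kwh:.1f} kWh/yr = ${kwh * rate_usd_kwh:.2f}/yr")
      # load delta: 16.1 kWh/yr = $1.93/yr; sleep delta: 23.4 kWh/yr = $2.80/yr.
      # Either way, a year of electricity is small next to a $30+ price gap.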

      “Another is the fact that the basic versions that held the $150 price point have evaporated”

      Ahh, fantastic =/ I looked at EVGA’s site and all of them have changed to “Auto-notify”. I’ll check with NVIDIA to see if I can get a reason for it, and see if a solution is en route, but I expect them to play coy as usual.

      With AMD’s inflation and now this, the GPU market has truly been put into a blender lately.

      I like your analysis at the end. It’s kind of frustrating just how much constant research people have to do to find the perfect GPU… things seem to change on a daily basis. When this article was pubbed, a $150 Ti was great; things are skewed when it becomes $180. Granted, the cards I see at Newegg are all overclocked, but even so… why on earth would there be stock of those and not the regular variants? Honestly, I find it odd that there’s OC variants of such a card at all… at $150 it’s already a bit overpriced; it just happens to offer unparalleled power consumption which helps negate that premium.

  • Casecutter

    Here’s how I see this: the 750 Ti is what comprises the “entry, no 6-pin, plug-n-play” market – no different than the 5670 was back in the beginning of 2010… so 4 years ago. Similarly, for that time, 1680×1050 was the resolution of the day for the category, and the 5670 could give you most titles on medium settings, but it was a $75-80 upgrade with 1GB of GDDR5. In four years that’s at minimum a 100% increase; that doesn’t fly!

    Against the 650 Ti, which had an MSRP of $150, I suppose it seems good, but that was overpriced, as it used a 221mm² die. Sure it was hard for NVIDIA to get that down much more, but now it’s like 33% smaller and can’t provide some relief?

    If we look at what PC Perspective learned in their upgrade story, we find that they couldn’t, or didn’t feel they could, provide the best graphics/playable experience except most often at low settings (although GRID and Skyrim managed medium), and that was with the best OEM box, a Gateway DX4885 with an i5-4440. I’d like to see someone work from that i5 machine, or a Phenom II X4 setup like the 945 Deneb at 3.0GHz, use an R7 250/7750 (no 6-pin) and the R7 260X, and then find the best playable. I don’t consider the experience that comes across on the screen much different between an R7 250 and a GTX 750 Ti, while I’d say the GTX 750 Ti and R7 260X would basically spar with the same settings and FPS. The difference is that the R7 260X leaves money for a nice Bronze+ PSU, and has ZeroCore. If two twin machines slept, browsed, and gamed identically over a month, what would either Kill-a-Watt record as total power used? That’s the story…

    • Rob Williams

      You certainly remember things a lot better than I do; I curse my horrible memory sometimes. Once a new series comes out, I quickly forget about the one before it.

      “Sure it was hard for NVIDIA to get that down much more, but now it’s like 33% smaller and can’t provide some relief?”

      I think this comes back to the “because it can” scenario, where it doesn’t feel compelled to lower its prices because people are paying what it’s asking. It’s better for the bottom line, after all, to not discount prices when it’s not needed. Such a stance should prove to be a great thing for AMD – if not for the inflation issues. Once those pass, I’m sure NVIDIA will become price-competitive once again out of nowhere, as if nothing happened.

      I hadn’t heard about that PC Per article until now; it’s quite a good angle from which to tackle a card like this. Given the way Ryan tested the systems, it’s pretty hard to compare his results to mine. That Gateway machine packs a pretty decent modern Intel quad-core (3.0GHz) with 8GB of 1600MHz RAM, so to me that shouldn’t prove to be too much of a bottleneck. But despite that, Crysis 3 was benchmarked at Low, whereas I found Medium to be playable; likewise, GRID 2 was tested at Medium, whereas I found it completely playable with almost maxed-out settings.

      Ryan might have been stuck between a rock and a hard place, though, choosing presets that could be run across each system. I’m not sure that gives the consumer a great idea of what the card can do when manual tweaking is involved. The problem with using presets is that certain settings can be applied that cripple a game. In the case of a game like GRID 2, Ambient Occlusion and Global Illumination are some real killers; so which would you prefer? GRID 2 at Medium, or nearly maxed with 4xAA and GI/AO disabled? The same could be said for Crysis 3; I found Medium to be playable when Water and Shadows were put to Low and AA was disabled, while Ryan chose the Low preset.

      Either way, I don’t have those systems, so I can’t claim that the Best Playable settings I found for this card would carry over perfectly even to that Gateway rig with its ample Intel quad-core. It’s an interesting look, nonetheless.
