
NVIDIA GeForce GTX 750 Ti Review: 1080p Gaming without a Power Connector

NVIDIA GeForce GTX 750 Ti

Date: February 24, 2014
Author(s): Rob Williams

It’s often hard to get excited about a new $149 graphics card, but NVIDIA’s GeForce GTX 750 Ti is one of the rare exceptions. For starters, it doesn’t require a power connector, and it has half the TDP of its nearest competitor – all while promising improved performance. What more can be said? Read on!



Introduction

I’ve said it before, but it needs to be said again: AMD and NVIDIA are all about impeccable timing. A couple of weeks ago, I received an email from AMD telling me about its upcoming R7 265, which was set to replace the $150 Radeon R7 260X.

The reason for that model’s sudden existence became clear the following day when I received NVIDIA’s new $150 GeForce GTX 750 Ti in the post. Coincidence? Yep – sure.

Based on the introduction of these cards, it’s clear that the two are going to compete directly with each other – a point proven by our results later in this article, which show the 750 Ti to be a bit faster than the R7 260X.

NVIDIA GeForce GTX 750 Ti - Overview

I don’t have an R7 265 here to use for the sake of comparison, but at this point, I’m not sure it matters. As of the time of writing, I haven’t been able to find the model on sale anywhere, and we’re nearing the two-week mark from when the card’s embargo lifted. Meanwhile, NVIDIA’s 750 Ti appeared at e-tail not long after its announcement.

The situation surrounding the R7 265 stings a bit, because when I took a look at the R7 260 back in December, I wasn’t told that the card would be unavailable on these shores – but that happens to be the case. Why would I take a look at a graphics card that isn’t available to most of our audience? I wouldn’t – it’d be nonsensical. So now I’m wondering if we’ll see the same thing happen with the R7 265.

Nonetheless – let’s move on to less-aggravating matters, shall we?

NVIDIA’s GeForce GTX 750 Ti is being targeted at those who are running lower-end GeForces more than a generation old, such as the 550 Ti. In that comparison, NVIDIA touts its latest card as being 120% faster, and while that’s impressive in itself, more so to me is the fact that it’s more than twice as fast at half the power draw. The 750 Ti is spec’d at 60W, whereas the 550 Ti was spec’d at 116W. The Radeon R7 260X, which was just $149 as mentioned above, is spec’d at 115W.
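As a quick sanity check on those claims, the performance-per-watt math works out like this (a rough sketch in Python, using the TDPs and the “120% faster” figure quoted above):

```python
# Back-of-the-envelope check of NVIDIA's claims (figures from the text above).
gtx_550_ti_tdp = 116  # watts, per spec
gtx_750_ti_tdp = 60   # watts, per spec
relative_perf = 1.0 + 1.20  # "120% faster" means 2.2x the performance

power_ratio = gtx_750_ti_tdp / gtx_550_ti_tdp
perf_per_watt_gain = relative_perf / power_ratio
print(f"Relative performance: {relative_perf:.1f}x")
print(f"Power ratio: {power_ratio:.2f}x")
print(f"Performance-per-watt improvement: {perf_per_watt_gain:.1f}x")
```

In other words, even taking the marketing numbers at face value, the efficiency gain over the 550 Ti comes out to roughly 4x.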

As simple as the 750 Ti appears to be on paper, it’s become one of the most-impressive graphics cards I’ve taken a look at in some time. Thanks to its 60W power requirement, it doesn’t require a power connector. That means those who own restrictive OEM PCs don’t need to fuss about not having an available power connector, and those that do have one can simply enjoy the fact that the 750 Ti is laughing at it.

NVIDIA GeForce GTX 750 Ti - Reference Cooler

On the topic of laughing, that was the reaction I had when I first received the 750 Ti and took it out of its box. “Ti”, to me, suggests a beefier-than-normal graphics card, but just look at this. I had figured NVIDIA sent me over an unannounced GeForce GT 720 by accident. But no. We might be looking at a simple card here, but it’s not so simple when put into action.

The reason the 750 Ti’s cooler is so simple is that it’s all that’s needed. This strikes me as humorous because this cooler design cannot be purchased; instead, vendors like EVGA, ASUS, ZOTAC, and so forth all offer much beefier-looking coolers – which seems a little odd given that this simple reference cooler kept the card from exceeding 66°C during our stress-test.

NVIDIA GeForce GTX 750 Ti - Video Connectors

The reference 750 Ti includes dual DVI ports along with a mini-HDMI, but that doesn’t matter too much: As always, vendors can pick and choose what to include. One ASUS model, for example, offers dual DVI ports, a regular-sized HDMI port, and VGA.

NVIDIA GeForce Series Cores Core MHz Memory Mem MHz Mem Bus TDP
GeForce GTX Titan Black 2880 889 6144MB 7000 384-bit 250W
GeForce GTX Titan 2688 837 6144MB 6008 384-bit 250W
GeForce GTX 780 Ti 2880 875 3072MB 7000 384-bit 250W
GeForce GTX 780 2304 863 3072MB 6008 384-bit 250W
GeForce GTX 770 1536 1046 2048MB 7010 256-bit 230W
GeForce GTX 760 1152 980 2048MB 6008 256-bit 170W
GeForce GTX 750 Ti 640 1020 2048MB 5400 128-bit 60W
GeForce GTX 750 512 1020 2048MB 5000 128-bit 55W
GeForce GTX 660 960 980 2048MB 6000 192-bit 140W
GeForce GTX 650 384 1058 1024MB 5000 128-bit 64W

NVIDIA might be focusing on the GTX 750 Ti at the moment, but it’s also launched a non-Ti model. As seen in the chart above, that card has 20% of the cores cut and 100MHz taken off of the memory clock. These small changes decrease the card’s cost from $150 to about $120.

As mentioned above, I don’t have an R7 265 to compare the 750 Ti to, which is unfortunate since it’s a direct match-up. But as also mentioned, that card remains unavailable for purchase, so for our testing, I’m instead going to compare it to the previous $150 AMD card, the Radeon R7 260X (which is now available for $120).

Onward we go.

Test System & Methodology

At Techgage, we strive to make sure our results are as accurate as possible. Our testing is rigorous and time-consuming, but we feel the effort is worth it. In an attempt to leave no question unanswered, this page contains not only our test-bed specifications, but also a detailed look at how we conduct our testing.

Test Machine

The below table lists our testing machine’s hardware, which remains unchanged throughout all GPU testing, minus the GPU. Each card used for comparison is also listed here, along with the driver version used.

  Graphics Card Test System
Processors Intel Core i7-4960X – Six-Core @ 4.50GHz
Motherboard ASUS P9X79-E WS
Memory Kingston HyperX Beast 32GB (4x8GB) – DDR3-2133 11-12-11
Graphics AMD Radeon R7 260X 2GB – Catalyst 13.11 (GPU-Z)
NVIDIA GeForce GTX 750 Ti 2GB – GeForce 334.89 (GPU-Z)
Audio Onboard
Storage Kingston HyperX 240GB SSD
Power Supply Cooler Master Silent Pro Hybrid 1300W
Chassis Cooler Master Storm Trooper Full-Tower
Cooling Thermaltake WATER3.0 Extreme Liquid Cooler
Displays ASUS PB278Q 27″ 2560×1440
Dell P2210H 22″ 1920×1080 x 3
Et cetera Windows 7 Professional 64-bit

Notes About Our High-end System

The goal of our performance content is to show you as accurately as possible how one product compares to another – after all, you’re coming to us for advice, so we want to make sure we’re giving you the best possible information. Typically, one major step we take in ensuring that our performance results are accurate is to make sure that our test systems are free of all possible bottlenecks, so for that, high-end components must be used.

In the case of our graphics card test system, the processor chosen has six cores and is overclocked far beyond reference clocks. Most games nowadays are not heavily CPU-bound, but by using such a chip, we feel that we completely rule it out as a potential bottleneck. The same can be said for the use of an SSD (as opposed to latency-ridden mechanical storage), and even our memory, which is clocked at the comfortable speed of DDR3-2133.

Why this matters to you: Our test PC is high-end, and it’s very likely that you’d encounter a bottleneck sooner than we would. Our goal is to remove all possible bottlenecks, whereas yours is to build the PC you need. In our case, we need to go overboard to attain as accurate a representation of a graphics card’s performance as possible.

If your PC has at least a modern (~2-year-old) quad-core or better processor, and at least 8GB of fast memory (DDR3-1866+), the chances of you running into a bottleneck with today’s hottest games are admittedly low. If you’re using lower-end gear, you can absolutely expect that the rest of your system could be a bottleneck. It should be noted, though, that if you’re seeking out a lower-end graphics card, the importance of a bottleneck would of course be lessened.

Unfortunately, we’re not able to test a single card on multiple PC configurations; each single card we test takes at least 3 hours to test, with another 2 hours added on for each additional resolution, and at least another 1~2 hours for our Best Playable results (for up to 11 hours of mostly hands-on testing for a high-end model).

Please bear all of this in mind. If you’re unsure if your PC could prove to be a bottleneck, our comments section exists for such questions.

When preparing our test-beds for any type of performance testing, we follow these guidelines:

General Guidelines

To aid with the goal of keeping accurate and repeatable results, we alter certain services in Windows 7 from starting up at boot. This is due to the fact that these services have the tendency to start up in the background without notice, potentially causing inaccurate test results. For example, disabling “Windows Search” turns off the OS’ indexing which can at times utilize the hard drive and memory more than we’d like.

The services we disable are:

For further fine-tuning, we also use Windows’ “Classic” desktop theme, which gets rid of the transparency that can sometimes utilize a GPU in the background.

Vendor Favoritism

Sometimes, either AMD or NVIDIA will work with a game studio to help their development process along. As history has proven, this often results in a game that is tuned better for one vendor over the other, although sometimes the tides can change over time, resulting in the competing vendor offering the better experience.

One of our goals is to provide as neutral a benchmarking suite as possible, so while it’s impossible to avoid games sponsored by either of these companies, we can at least make an effort to achieve a blended list. As it stands, our current game list and their partners are:

(AMD) – Battlefield 4
(AMD) – Crysis 3
(AMD) – Sleeping Dogs
(NVIDIA) – Assassin’s Creed IV: Black Flag
(NVIDIA) – Metro: Last Light
(NVIDIA) – Splinter Cell Blacklist
(Neutral) – GRID 2
(Neutral) – Total War: SHOGUN 2

With that, let’s move on to a quick look at the game settings we use in our testing:

Assassin’s Creed IV: Black Flag

Assassin's Creed IV Black Flag Benchmark Settings

Battlefield 4

Battlefield 4 Benchmark Settings

Crysis 3

Crysis 3 Benchmark Settings

Crysis 3 Benchmark Settings

GRID 2

GRID 2 Benchmark Settings

GRID 2 Benchmark Settings

GRID 2 Benchmark Settings

Metro Last Light

Metro Last Light Benchmark Settings

Sleeping Dogs

Sleeping Dogs Benchmark Settings

Sleeping Dogs Benchmark Settings

Splinter Cell Blacklist

Splinter Cell Blacklist Benchmark Settings

Splinter Cell Blacklist Benchmark Settings

Total War: SHOGUN 2

Total War SHOGUN 2 Benchmark Settings

Unigine Heaven

Unigine Heaven 4 Benchmark Settings

Game Tests: Assassin's Creed IV: Black Flag, Battlefield 4

Given the sheer number of titles in the Assassin’s Creed series, it’s a little hard to believe that the first game came out a mere six years ago. You could definitely say that Ubisoft hit the ball out of the park with this one. To date, we’ve never considered an AC game for benchmarking, but given the number of graphical goodies featured in the PC version of Black Flag, that trend now ends.

Assassin's Creed IV Black Flag - 1920x1080

Manual Run-through: The saved game starts us not far from the beginning of the game under a small church which can be climbed to synchronize with the environment. To kick things off, I scale this church and rotate the camera around once, making sure to take in the beautiful landscape; then, I climb back down and run all the way to the water (the top of this small church and the water can be seen in the above screenshot).

Note: For some reason, Ubisoft decided to cap the framerate to 60 FPS in Black Flag even if Vsync is turned off. For most games, this would ruin the chance of it appearing in our benchmarking, but because the game is graphically intensive, I’ve chosen to stick with it, as at higher resolutions, reaching 60 FPS is a perk that will belong only to high-end graphics cards.

NVIDIA GeForce 750 Ti - Assassin's Creed IV: Black Flag (1920x1080)

To kick things off, AMD’s 260X performed a little bit better on the minimum side, while NVIDIA’s 750 Ti clocked in 3 extra FPS. Small differences, and what I’m sure will become the trend for the rest of the article.

Battlefield 4

Thanks to the fact that DICE cares more about PC gaming than a lot of developers, the Battlefield series tends to give us titles that are well-worth benchmarking. Battlefield 3 offered incredible graphics and became a de facto benchmark immediately, so it’s no surprise, then, that BF4 follows right in its footsteps.

Battlefield 4 - 1920x1080

Manual Run-through: The Singapore level is the target here, with the saved game starting us on an airboat that must be driven to shore, where a massive battle is set to take place. I stop recording the framerate once the tank makes its way to the end of this small patch of beach; in all, the run takes about 3 minutes.

NVIDIA GeForce 750 Ti - Battlefield 4 (1920x1080)

With BF 4, NVIDIA edges a bit further ahead than with AC IV, and while we’re dealing with low framerates, 5 FPS equals 18% – rather substantial. Let’s see if that lead continues.
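The percent-change arithmetic behind that figure is simple; the FPS values below are hypothetical stand-ins chosen only to be consistent with the 5 FPS / 18% numbers quoted above:

```python
# Hypothetical illustration of the percent-difference math quoted above:
# a 5 FPS gap reads as ~18% when the baseline average is around 28 FPS.
def percent_gain(baseline_fps, new_fps):
    """Relative gain of new_fps over baseline_fps, as a percentage."""
    return (new_fps - baseline_fps) / baseline_fps * 100

print(f"{percent_gain(28, 33):.0f}%")  # ~18%
```

The takeaway: At low framerates, even a few FPS can be a meaningful relative difference.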

Game Tests: Crysis 3, GRID 2

When the original Crysis dropped in late 2007, it took no time at all for pundits to coin the phrase, “Can it run Crysis?”, almost to the point of self-parody. At the time, the game couldn’t have its graphics detail maxed-out on even top-of-the-line PCs, and in reality, that’s a great thing. I’d imagine few are opposed to knowing that a game could actually look better down the road as our PCs grow into them. As the series continued, Crytek knew it had a legend to live up to, and fortunately, Crysis 3 (our review) lives up to the original’s legacy.

Crysis 3 - 1920x1080 Single Monitor

Manual Run-through: There’s no particular level in Crysis 3 that I could establish was “better” for benchmarking than another, but I settled on “Red Star Rising” based on the fact that I could perform a run-through with no chance of dying (a great thing in a challenging game like this one). The level starts us in a derelict building, where I traverse a broken pipe to make it over to one rooftop and then another. I eventually hit the ground after taking advantage of a zipline, and make my way down to a river, where I scurry past a number of enemies to the end spot beneath a building.

NVIDIA GeForce 750 Ti - Crysis 3 (1920x1080)

As much as NVIDIA would love a total domination here, it’s not getting one.

GRID 2

For those who appreciate racing games that are neither too realistic nor too arcade-like, there’s GRID. In GRID 2 (review), the ultimate goal is to build a racing empire, starting from square one. Unlike most racing titles that have some sort of career, the goal here isn’t to earn cash, but fans. Whether you’re racing around Abu Dhabi’s Yas Marina or tearing through a gorgeous Cote d’Azur coastline, your goal is simple: To impress.

GRID 2 - 1920x1080 Single Monitor

Manual Run-through: The track chosen for my benchmarking is Miami (Ocean Drive). It’s a simple track overall, which is one of the reasons I chose it, and also the reason I choose to do just a single lap (I crash, often, and that affects both the results and my patience). Unlike most games in the suite which I test twice over (save for an oddity in the results), I race this one lap three times over.

NVIDIA GeForce 750 Ti - GRID 2 (1920x1080)

Would you look at that? AMD’s R7 260X surpassed the 750 Ti in this particular game. And to think, if AMD’s $150 R7 265 ever launches at retail, this graph could look more like a blowout.

Game Tests: Metro Last Light, Sleeping Dogs

Crysis has become infamous for punishing even top-end systems, but let’s be fair: The Metro series matches, if not exceeds its requirement for graphical horsepower. That was proven by the fact that we used Metro 2033 in our testing for a staggering three years – only to be replaced by its sequel, Last Light. I’m not particularly a fan of this series, but I am in awe of its graphics even at modest settings.

Metro Last Light - 1920x1080 Single Monitor

Manual Run-through: Because this game is a real challenge to benchmark – both due to variability in the results and its raw difficulty – I choose to use the built-in benchmark here, but rely on Fraps to give me more accurate results.

Note: Metro Last Light‘s built-in benchmark is not representative of the entire game; some levels will punish a GPU much worse than this benchmark will (namely, “The Chase”, which has lots of smoke and explosions). What this means is that while these settings might suffice for much of the game, there might be instances where the performance degrades enough during a certain chapter or portion of a chapter to force a graphics setting tweak.

NVIDIA GeForce 750 Ti - Metro Last Light (1920x1080)

We might as well call the results identical at this point.

Sleeping Dogs

Many have called Sleeping Dogs (our review) the “Asian Grand Theft Auto“, but the game does a lot of things differently that help it stand out from the crowd. For example, in lieu of supplying the player with a gazillion guns, Sleeping Dogs focuses heavily on hand-to-hand combat. There are also many collectibles that can be found to help upgrade your character and unlock special fighting abilities – and if you happen to enjoy an Asian atmosphere, this game should fit the bill.

Sleeping Dogs - 1920x1080 Single Monitor

Manual Run-through: The run here takes place during the chapter “Amanda”, on a dark, dank night. The saved game begins us at the first apartment in the game (in North Point), though that’s not where I begin capturing the framerate. Instead, I first request our motorcycle from the garage. Once set, I begin recording the framerate and drive along a specific path all the way to Aberdeen, taking about two minutes.

NVIDIA GeForce 750 Ti - Sleeping Dogs (1920x1080)

Sleeping Dogs might be an AMD-targeted game, but NVIDIA still managed to keep its 750 Ti ahead here (which could owe its thanks to specific improvements added for the game in the latest GeForce driver).

Game Tests: Splinter Cell: Blacklist, Total War: SHOGUN 2

Tom Clancy is responsible for countless video games, but his Splinter Cell series has become something special, with each game released having been considered “great” overall. The latest in the series, Blacklist, is no exception, and thankfully for us, its graphics are fantastic, not to mention intensive. For those who love a stealth element in their games, this is one that shouldn’t be skipped.

RIP, Tom Clancy.

Splinter Cell Blacklist - 1920x1080 Single Monitor

Manual Run-through: From the start of the ‘Safehouse’ level in Benghazi, Libya, we progress through until we reach an apartment building that must be entered – this is where we end the FPS recording.

NVIDIA GeForce 750 Ti - Splinter Cell: Blacklist (1920x1080)

At this point, I’m close to turning on some loud music to keep me awake. These results are close, close, close.

Total War: SHOGUN 2

Strategy games are well-known for pushing the limits of any system, and few others do this as well as Total War: SHOGUN 2. It fully supports DX11, has huge battlefields to oversee with hundreds or thousands of units, and a ton of graphics options to adjust. It’s quite simply a beast of a game.

Total War: SHOGUN 2 - 1920x1080 Single Monitor

Manual Run-through: SHOGUN 2 is one of the few games in our suite where the built-in benchmark is opted for. Strategy games in particular are very difficult to benchmark, so this is where I become thankful to have the option of using a built-in benchmark.

NVIDIA GeForce 750 Ti - Total War: SHOGUN 2 (1920x1080)

Once again, sameness. Let’s see how the “Best Playable” results fare, on the next page.

Best Playable: 1080p Single Display

For about as long as GPU-accelerated games have existed, an ideal performance target has been 60 frames-per-second. This owes largely to the standard 60Hz monitor, which delivers its best result when the framerate matches its refresh rate. To keep the monitor’s refresh rate and the game’s framerate aligned – and so avoid visible tearing – VSync should be enabled.

While I believe our Best Playable results will appeal to any gamer, they could especially prove useful to those intrigued by living room gaming or console replacements. The goal here is simple: With each game, the graphics settings are tweaked to deliver the best possible detail while keeping us as close to 60 FPS on average as possible.

Because our Metro Last Light and Total War: SHOGUN 2 tests are timedemos, and because this kind of testing is time-consuming, I am sticking to six out of the eight games I test with for inclusion here.

Our regular benchmark tests showed that the R7 260X and 750 Ti are about the same, with NVIDIA getting the slight edge. The differences are so minor that the Best Playable settings for each game have been kept identical for both cards – giving us both Best Playable and apples-to-apples results.

  Assassin’s Creed IV: Black Flag
1920×1080 Minimum Average
NVIDIA GeForce GTX 750 Ti 46 56
Graphics Settings
& Screenshot
Environment: High Shadow: Normal
Texture: High Reflection: Normal
Anti-aliasing: FXAA God Rays: Off
Ambient Occlusion: Off Volumetric Fog: On
Motion Blur On  
Assassin's Creed IV Black Flag - Best Playable - NVIDIA GeForce GTX 750 Ti
AMD Radeon R7 260X 51 60
Graphics Settings
& Screenshot
Environment: High Shadow: Normal
Texture: High Reflection: Normal
Anti-aliasing: FXAA God Rays: Off
Ambient Occlusion: Off Volumetric Fog: On
Motion Blur On  
Assassin's Creed IV Black Flag - Best Playable - AMD Radeon R7 260X

With the standard settings I use for AC IV, NVIDIA’s card came out a little ahead of AMD’s. But something odd happened when I tested the cards with the Best Playable settings: The roles reversed. I can’t summon the logic to explain why this is the case, but multiple tests proved these results to be consistent.

  Battlefield 4
1920×1080 Minimum Average
NVIDIA GeForce GTX 750 Ti 51 68
Graphics Settings
& Screenshot
Texture Quality: High Texture Filtering: High
Lighting: High Effects: High
Post Processing: High Mesh: High
Terrain: High Terrain Decoration: High
Anti-aliasing Deferred: Off Anti-aliasing Post: Off
Ambient Occlusion: Off    
Battlefield 4 - Best Playable - NVIDIA GeForce GTX 750 Ti
AMD Radeon R7 260X 49 62
Graphics Settings
& Screenshot
Texture Quality: High Texture Filtering: High
Lighting: High Effects: High
Post Processing: High Mesh: High
Terrain: High Terrain Decoration: High
Anti-aliasing Deferred: Off Anti-aliasing Post: Off
Ambient Occlusion: Off    
Battlefield 4 - Best Playable - AMD Radeon R7 260X

NVIDIA’s 750 Ti somehow lagged behind the R7 260X in AC IV, but it manages the opposite with BF 4.

  Crysis 3
1920×1080 Minimum Average
NVIDIA GeForce GTX 750 Ti 36 55
Graphics Settings
& Screenshot
Anti-aliasing: Off Texture: Medium
Effects: Medium Object: Medium
Particles: Medium Post Processing: Medium
Shading: Medium Shadows: Low
Water: Low Anisotropic Filtering: x16
Motion Blur: Medium Lens Flares: Yes
Crysis 3 - Best Playable - NVIDIA GeForce GTX 750 Ti
AMD Radeon R7 260X 36 54
Graphics Settings
& Screenshot
Anti-aliasing: Off Texture: Medium
Effects: Medium Object: Medium
Particles: Medium Post Processing: Medium
Shading: Medium Shadows: Low
Water: Low Anisotropic Filtering: x16
Motion Blur: Medium Lens Flares: Yes
Crysis 3 - Best Playable - AMD Radeon R7 260X

AC IV and BF 4 saw the tested cards swap places, but with Crysis 3, the results are what I’d consider identical.

  GRID 2
1920×1080 Minimum Average
NVIDIA GeForce GTX 750 Ti 49 59
Graphics Settings
& Screenshot
Multisampling: 4x MSAA Night Lighting: High
Shadows: Ultra Advanced Fog: On
Particles: Ultra Crowd: Ultra
Cloth: High Ambient Occlusion: Low
Soft Ambient Occlusion: Off Ground Cover: High
Vehicle Details: High Trees: Ultra
Objects: Ultra Vehicle Reflections: Ultra
Water: High Post Process: High
Skidmarks: On Advanced Lighting: On
Global Illumination: Off Anisotropic Filtering: Ultra
GRID 2 - Best Playable - NVIDIA GeForce GTX 750 Ti
AMD Radeon R7 260X 50 58
Graphics Settings
& Screenshot
Multisampling: 4x MSAA Night Lighting: High
Shadows: Ultra Advanced Fog: On
Particles: Ultra Crowd: Ultra
Cloth: High Ambient Occlusion: Low
Soft Ambient Occlusion: Off Ground Cover: High
Vehicle Details: High Trees: Ultra
Objects: Ultra Vehicle Reflections: Ultra
Water: High Post Process: High
Skidmarks: On Advanced Lighting: On
Global Illumination: Off Anisotropic Filtering: Ultra
GRID 2 - Best Playable - AMD Radeon R7 260X

Once again, we see a difference of a mere 1 FPS. I don’t want to think of the caffeine kick required to notice that difference in the real world.

  Sleeping Dogs
1920×1080 Minimum Average
NVIDIA GeForce GTX 750 Ti 62 75
Graphics Settings
& Screenshot
Anti-aliasing: Normal High-res Textures: On
Shadow Resolution: High Shadow Filtering: High
Ambient Occlusion: High Motion Blur: High
World Density: Extreme  
Sleeping Dogs - Best Playable - NVIDIA GeForce GTX 750 Ti
AMD Radeon R7 260X 62 71
Graphics Settings
& Screenshot
Anti-aliasing: Normal High-res Textures: On
Shadow Resolution: High Shadow Filtering: High
Ambient Occlusion: High Motion Blur: High
World Density: Extreme  
Sleeping Dogs - Best Playable - AMD Radeon R7 260X

Sleeping Dogs is AMD’s game, but the latest optimizations in NVIDIA’s GeForce drivers helped give the 750 Ti a lead.

  Tom Clancy’s Splinter Cell: Blacklist
1920×1080 Minimum Average
NVIDIA GeForce GTX 750 Ti 60 72
Graphics Settings
& Screenshot
Texture Detail: Medium Shadow: Medium
Parallax: On Tessellation: Off
Texture Filtering: 16x Ambient Occlusion: Field AO
Anti-aliasing: Off  
Tom Clancy's Splinter Cell Blacklist - Best Playable - NVIDIA GeForce GTX 750 Ti
AMD Radeon R7 260X 51 77
Graphics Settings
& Screenshot
Texture Detail: Medium Shadow: Medium
Parallax: On Tessellation: Off
Texture Filtering: 16x Ambient Occlusion: Field AO
Anti-aliasing: Off  
Tom Clancy's Splinter Cell Blacklist - Best Playable - AMD Radeon R7 260X

It looks like we couldn’t wrap up the Best Playable results without some oddball findings. For whatever reason, AMD’s card performed better on average, while NVIDIA’s did better at the minimum. Regardless of how these results are interpreted, though, one thing’s for sure: Both play at above 60 FPS, which is important.

Synthetic Tests: Futuremark 3DMark, 3DMark 11, Unigine Heaven 4.0

We don’t make it a point to seek out automated gaming benchmarks, but we do like to get a couple in that anyone reading this can run themselves. Of these, Futuremark’s name leads the pack, as its benchmarks have become synonymous with the activity. Plus, it does help that the company’s benchmarks stress PCs to their limit – and beyond.

3DMark

While Futuremark’s latest GPU test suite is 3DMark, I’m also including results from 3DMark 11 as it’s still a common choice among benchmarkers.

NVIDIA GeForce 750 Ti - Futuremark 3DMark

NVIDIA GeForce 750 Ti - Futuremark 3DMark 11 - Performance

NVIDIA GeForce 750 Ti - Futuremark 3DMark 11 - Extreme

I’d love it if benchmarking oddities ceased to exist, but this is the real world, and that’s just not the case. NVIDIA’s 750 Ti bests AMD’s R7 260X in 3DMark (2013), and the roles are reversed in 3DMark 11’s Performance test, but not in its Extreme test. That almost suggests that AMD’s card is better at sub-1080p resolutions, whereas NVIDIA’s is best for 1080p.

Unigine Heaven 4.0

Unigine might not have as established a name as Futuremark, but its products are nothing short of “awesome”. The company’s main focus is its game engine, but a by-product of that is its benchmarks, which are used to both give benchmarkers another great tool to take advantage of, and also to show-off what its engine is capable of. It’s a win-win all-around.

Unigine Heaven 4.0

The biggest reason that the company’s “Heaven” benchmark is so relied-upon by benchmarkers is that both AMD and NVIDIA promote it for its heavy use of tessellation. Like 3DMark, the benchmark here is overkill by design, so results are not going to directly correlate with real gameplay. Rather, they showcase which card models can better handle both DX11 and its GPU-bogging features.

NVIDIA GeForce 750 Ti - Unigine Heaven 4.0 (1920x1080)

NVIDIA’s tessellation improvements show their face here.

Overclocking & Power, Final Thoughts

I admit that for the most part, I find GPU overclocking to be unimportant. As an end-user, I just couldn’t justify putting extra stress on an already-complex piece of hardware in order to gain, at best, 10% on the framerate. However, I will say one thing: Where real advantages of GPU overclocking can be seen is at the low-end of the spectrum, such as with the GTX 750 Ti.

I’m pleased to report that this holds true with this card. Its default clocks are 1020MHz for the GPU and 1350MHz for the memory, and using EVGA’s Precision tool, I was able to achieve a stable overclock of 1155MHz on the GPU, and 1550MHz on the memory.
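Expressed as percentages, that overclock works out as follows (a quick sketch using the clocks just mentioned):

```python
# Overclock headroom achieved with EVGA Precision (clocks from the text above).
stock_gpu, oc_gpu = 1020, 1155  # MHz
stock_mem, oc_mem = 1350, 1550  # MHz

gpu_gain = (oc_gpu - stock_gpu) / stock_gpu
mem_gain = (oc_mem - stock_mem) / stock_mem
print(f"GPU: +{gpu_gain:.1%}, memory: +{mem_gain:.1%}")
```

That’s roughly a 13% core and 15% memory bump – healthy headroom for a card at this price point.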

In what’s sure to be an NVIDIA-imposed limitation – one that undoubtedly has to do with the fact that the 750 Ti doesn’t require a power connector – every GPU overclocking tool I tried capped the GPU at around 1155MHz. This is despite EVGA offering a 750 Ti model clocked at 1176MHz; clearly, a custom BIOS has been implemented there.

NVIDIA GeForce GTX 750 Ti - Overclock

To give these overclocked settings a quick test, I reran all of the “Best Playable” settings using them, and imported the same results from the default clocks to produce this graph below:

NVIDIA GeForce 750 Ti - Overclocked Results

The results were kept just about identical for AC IV, but all of the other games saw some rather substantial gains – at least +6 FPS, and more often +7. In the case of Blacklist, the overclock managed an additional 10 FPS, equivalent to a 14% gain.

Power & Temperatures

To test graphics cards for both their power consumption and temperature at load, we utilize a couple of different tools. On the hardware side, we use a trusty Kill-a-Watt power monitor which our GPU test machine plugs into directly. For software, we use Futuremark’s 3DMark to stress-test the card, and AIDA64 to monitor and record the temperatures.

To test, the general area around the chassis is checked with a temperature gun, with the average temperature recorded. Once that’s established, the PC is turned on and left to sit idle for ten minutes. At this point, AIDA64 is opened along with 3DMark. We then kick-off a full suite run, and pay attention to the Kill-a-Watt when the test reaches its most intensive interval (GT 1) to get the load wattage.

NVIDIA GeForce 750 Ti - Temperatures

NVIDIA GeForce 750 Ti - Power Consumption

Given the lack of a need for a power connector, the power draw results above don’t strike me as too much of a surprise, but the temperature results sure do. AMD’s R7 260X has a substantial-looking cooler, while NVIDIA’s GTX 750 Ti is about as simple as it gets. Despite that, the 750 Ti didn’t edge past 66°C. With temps this low, the larger coolers seen on retail cards look a bit foolish – it’s as though no vendor wanted to stick with this simple design for fear it wouldn’t give the 750 Ti the allure of a powerful card.

It doesn’t need to be said, but I’ll say it anyway: Impressive results here.

Final Thoughts

We discovered above that the 750 Ti is much more power-efficient than AMD’s R7 260X (and no doubt the R7 265, as well), and it also runs a lot cooler. But power and temperatures are not all that matters with a GPU, so overall, how does the GTX 750 Ti fare?

If the R7 260X were still selling for $149, the GTX 750 Ti would be a no-brainer, given the just-mentioned benefits and the fact that it performs better than AMD’s card in most tests. However, with the release of the R7 265, AMD has dropped the price of the R7 260X to $120, a price that can be had right this moment at popular e-tailers. When looking at things from the cost angle, the 750 Ti’s performance does not match its price-premium (25%).
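That price-premium figure is straightforward to verify (prices as quoted above, alongside the rough 0~10% performance lead the 750 Ti shows through this article’s results):

```python
# The cost angle, using the prices quoted in the text above.
r7_260x_price, gtx_750_ti_price = 120, 150  # USD

premium = (gtx_750_ti_price - r7_260x_price) / r7_260x_price
print(f"750 Ti price premium: {premium:.0%}")  # 25%
# Even at the top of its ~0-10% performance lead, the price premium
# outpaces the performance gain by a wide margin.
```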

For the best bang for the buck in this match-up, it’s AMD’s R7 260X. Plus, given the 14% boost in core count the R7 265 sees, it should match the 750 Ti in most gaming tests and perhaps best it in others – and because it’s set to retail at the same $149 price-point, that’s important. However, that assumes the R7 265 will at some point become available for purchase, something I’ll have to see to believe.

NVIDIA GeForce GTX 750 Ti - Overview

So when looking at the facts here, what would cause someone to spend 25% ($30) more for NVIDIA’s GPU, which performs just 0~10% better? Well, for starters, there are the NVIDIA-specific perks, like ShadowPlay (a technology I wouldn’t want to live without at this point). There are also things like PhysX, but that’s about as useful on a low-end discrete card as DirectX 11 is on integrated graphics – it’s best left to the big guns.

Given the results we’ve seen, I do believe that the 750 Ti could use a bit of a price adjustment, but let’s not gloss over the fact that NVIDIA’s card is far more power-efficient than AMD’s, and will run a lot cooler. Remember, AMD’s R7 260X peaks at 81°C, whereas the 750 Ti with its simplistic cooler was capped at 66°C – a drop of 15°C. There’s also the fact that it manages to be faster overall, but consume 31W less at full load.

And while it won’t affect most 750 Ti owners, I’m sure, the fact that the card doesn’t require a power connector is major. It allows those with restrictive OEM PCs to upgrade to a card that can deliver 1080p gaming with ease, all without the likely need of a PSU upgrade. An OEM PC might include a PSU as small as 300W, but that’s quite substantial when dealing with such low-power parts: At 60W, this card requires a mere 1/5th of a 300W unit. And bear in mind, our test rig – which has 5 high-speed fans, an overclocked six-core Intel processor, and 32GB of RAM – peaked at 242W, nowhere near a 300W cap.
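The PSU arithmetic above can be laid out explicitly (wattages from the text; this is a rough sketch that ignores PSU efficiency losses, since the 242W figure was measured at the wall):

```python
# How the test rig's measured peak fits in a modest OEM power supply
# (wattages from the text above).
psu_capacity = 300  # watts, a small OEM unit
card_tdp = 60       # GTX 750 Ti spec
system_peak = 242   # whole-system draw measured at the wall

print(f"Card alone: {card_tdp / psu_capacity:.0%} of the PSU")  # 20%
print(f"System peak headroom: {psu_capacity - system_peak}W")
```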

And yet, all of the games we tested could be enjoyed at 1080p resolution and with good detail. That, to me, is damned impressive.

NVIDIA GeForce GTX 750 Ti - Techgage Editor's Choice
NVIDIA GeForce GTX 750 Ti

Copyright © 2005-2014 Techgage Networks Inc. - All Rights Reserved.