Need More Confirmation of Xbox One’s Lacking Performance? Look to Titanfall

Posted on March 10, 2014 3:00 PM by Rob Williams

Both Microsoft’s Xbox One and Sony’s PlayStation 4 are under-powered, a fact that only the most ardent console fans would scoff at. I call it a “fact” for one major reason: neither can handle games at 1080p well, or at all. Of the two, the PS4’s beefier graphics chip helps it attain that much-sought-after resolution more often.

In the summer of 2005, I purchased a new television, an LG 720p model. That November, I picked up the Xbox 360 at launch. The following year, after picking up the PlayStation 3, I had begun to regret not shelling out more cash for a 1080p television the year before – Ridge Racer 7, a game I loved, would have supported it.

It’s this kind of thing that makes the current crop of consoles’ inability to run reliably at 1080p all the more frustrating. Note that the experience I relayed above happened more than seven years ago. Yet, seven years later, the brand-new consoles can’t reliably handle that same (and admittedly modest) resolution.

Titanfall - Xbox One vs PC
Credit: mcmalloy / Reddit

While a PC is going to be more expensive than an Xbox One out of the gate (unless you really want to penny-pinch), the benefits are going to outweigh that initial extra cost. As GPUs like NVIDIA’s GeForce GTX 750 Ti (our review) have proven, cost-effective 1080p gaming is more than possible on the PC. The GTX 750 Ti costs $150, and not only do games look great on it, they run at 60 FPS (refer to the “Best Playable” page in that review). Today, the biggest challenge in building a capable PC for the same price as the Xbox One is the cost of the Windows license.

Meanwhile, console gamers have to wonder what exactly they’re going to be given when a new game comes out. Games released since both consoles launched prove there’s a major trade-off when developers aim to bring a compelling game to market: hit 1080p/30 by noticeably reducing the game’s graphical detail, or keep that detail and drop the resolution. Admittedly, in this case, I’d say 720p should be the goal, focusing entirely on graphics fidelity at 30 FPS (ugh – painful).

Still, and understandably so, gamers want to see that 1080p resolution, and because of it, developers have been scrambling to figure out how to make it happen. In the case of Titanfall, the developer has even gone so far as to use non-standard resolutions to inch its way towards 1080p. I’d like to think that if you have to ship your game at 792p, and possibly 900p for launch, it’s a sure sign that Microsoft and Sony should have done more to release their respective consoles with more graphics power.
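For perspective on what those in-between resolutions actually buy, the pixel math is simple. Here’s a minimal sketch (the resolutions are the standard 16:9 figures; the `pixels` helper is just for illustration):

```python
# Pixel counts for the 16:9 resolutions mentioned above, and how
# close each comes to full 1080p. Helper name is illustrative only.
def pixels(width, height):
    return width * height

full_hd = pixels(1920, 1080)  # 2,073,600 pixels

for label, w, h in [("720p", 1280, 720),
                    ("792p", 1408, 792),
                    ("900p", 1600, 900),
                    ("1080p", 1920, 1080)]:
    share = pixels(w, h) / full_hd
    print(f"{label}: {pixels(w, h):>9,} pixels ({share:.0%} of 1080p)")
```

Even though “792” sounds close to “1080”, 792p pushes only about 54% of the pixels of full 1080p – resolution names understate the real rendering gap.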

Titanfall - PC 1080p

Yes – increasing graphics power would have increased the price, but when developers face these kinds of hassles right out of the gate, things are only going to look sillier as time goes on. And while developers will eke more performance out of both consoles over their lifespans, I don’t think we’re going to see gains like we have in previous generations. I am confident in saying that 1080p/30 will not be a common target for FPS titles even as these consoles near their successors’ launch. Meanwhile, PC hardware is going to continue to progress, and if a graphics card like NVIDIA’s GeForce GTX 750 Ti is any gauge, the future is very bright for the platform.

Dare I even mention that the PC version of Titanfall supports DirectX 11, 3D, and 4K?

  • Kougar

Dunno. Titanfall devotes 35GB to uncompressed audio on the PC because, apparently, decompressing requires too much CPU power on dual-core systems. So Titanfall already has some giant questionable issues with it; makes me wonder how much is genuine graphics limitation versus just poor development (à la Crysis).

    • Rob Williams

      I’d be quicker to believe that it’s more optimized for the Xbox One, though. I assume the developers just cared less about the PC version because they knew the hardware would be able to better compensate for the crap coding.

      • Kougar

The Xbone uses hardware-accelerated decoding for the audio decompression, so they aren’t even using uncompressed audio for the Xbox version – there’d be no need for it.

  • Thespian Sassafraz

I agree with your point about Titanfall being pretty much a disaster. I would not go so far as to say that both consoles are underpowered, however. I’m a PC gamer 95% of the time, so normally I wouldn’t care about either console. Yet the Xbox One was a huge disappointment when I read its specs compared to the PS4. 2,133MHz DDR3?! I can get that for my PC right now. Next gen? I just hope the PS4 does better with whatever it chooses for its flagship release. Its stats look awesome (in theory) with 5,500MHz GDDR5, and (again, theoretically) a better GPU.

Now, let me say, Titanfall is a poorly coded and developed game. Double that when you consider EA bankrolling what I thought were experienced developers. This thing runs horribly on top-tier PCs. The final release was nearly identical to the beta, including issues like screen tearing and the lack of a solo mode – Google for endless lists of problems, on both PC and the box. Running Valve’s Source engine should easily let this game perform at 1080p/60 on any decent gaming rig, but uncompressed audio, screen tearing, framerate drops, and a lack of content all point to the developer in my book.

    I just hope the gaming industry survives this debacle.

    • Rob Williams

      Thanks for the comment!

      There’s a lot more that drives performance than just RAM speed. Keep in mind that even on the PC, most people can’t take full advantage of DDR3-2133 – it takes a special case to do that. It’s nice to have the headroom, but straight gaming and desktop work is not going to touch it. Also, it’s impossible to compare desktop DDR to graphics DDR – the PS4 wins the speed battle, but whether or not that makes a true difference in execution is another story.

Here’s another angle to look at things from: AMD’s current-gen desktop CPUs are not favorable versus Intel chips from a pure performance standpoint. It takes an “8-core” AMD to match a 4-core Intel if the frequency is similar. Now, realize that the CPUs in both of these consoles run at ~2GHz, whereas on the PC, most of AMD’s APUs sit above 2.5GHz – and a bunch settle in at around 4GHz. And even -then-, those CPUs are underpowered compared to mainstream Intel chips. This isn’t a jab at AMD; this is just me looking at it from a pure performance standpoint.

Then there’s the graphics: integrated. On the desktop, I balk at that kind of thing outside of HTPC use. I recently reviewed a couple of $150 graphics cards that can handle all of today’s games at very good detail settings at 1080p. The Xbox One can barely handle Titanfall at 900p with modest detail settings.

As I mentioned in the post, a PC is going to cost more, but it’s going to deliver a lot more (you obviously know this, however). And that’s not even looking at the possibility of upgrading, which the consoles can’t do. Right now, you can build a gaming PC for not much more than the Xbox One that delivers much better-looking games and offers open-ended flexibility. You could upgrade it two years from now if you wanted (admittedly, the CPU is rarely what needs upgrading nowadays; it’s the GPU). Just imagine how underpowered these consoles will look next summer, or the summer after that.

      This could almost be an article, but I can’t rail on the consoles TOO much.

      As far as Titanfall goes, you’re not the first person to complain about its horrendous performance on the PC. It’s the main reason I haven’t considered it for benchmarking. Things could get better, but for that to happen the developer needs to learn to care. And some just don’t when it comes to PC gaming. The fact that the game is on the PC at all surprises me.
