Both Microsoft’s Xbox One and Sony’s PlayStation 4 are under-powered, a fact that only the most ardent console fans would scoff at. I call it a “fact” for one major reason: neither can handle games at 1080p well, or at all. Of the two, the PS4’s beefier graphics chip helps it attain that much-sought-after resolution more often.
In the summer of 2005, I purchased a new television, an LG 720p model. That November, I picked up the Xbox 360 at launch. The following year, after buying the PlayStation 3, I began to regret not shelling out more cash for a 1080p television the year before – Ridge Racer 7, a game I loved, would have supported it.
It’s this kind of thing that makes the current crop of consoles’ inability to run at 1080p reliably all the more frustrating. Note that the experience I relayed above happened more than seven years ago. Yet, seven years later, the brand-new consoles still can’t reliably handle that same (and admittedly modest) resolution.
While a PC is going to be more expensive than an Xbox One out of the gate (unless you really want to penny-pinch), the benefits are going to outweigh that initial extra cost. As GPUs like NVIDIA’s GeForce GTX 750 Ti (our review) have proven, cost-effective 1080p gaming is more than possible on the PC. The GTX 750 Ti costs $150, and not only do games look great on it, they run at 60 FPS (refer to the “Best Playable” page in that review). Today, the biggest challenge in building a capable PC for the same price as the Xbox One is the cost of the Windows license.
Meanwhile, console gamers have to wonder what exactly they’re going to get when a new game comes out. Games released since both consoles launched prove that developers face a major trade-off when bringing a compelling game to market: opt for 1080p/30, or noticeably reduce the game’s graphical detail. Admittedly, if forced to choose, I’d say 720p should be the goal, with the focus put entirely on graphical fidelity at 30 FPS (ugh – painful).
Still, and understandably so, gamers want to see that 1080p resolution, and because of it, developers have been scrambling to figure out how to deliver it. In the case of Titanfall, the developer has even gone as far as to use non-standard resolutions to inch its way towards 1080p. I’d like to think that if you have to set your game to 792p, and possibly 900p for launch, it’s a sure sign that Microsoft and Sony should have done more to release their respective consoles with more graphics power.
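To put those in-between resolutions in perspective, here’s a quick back-of-the-envelope pixel-count comparison. This is just illustrative arithmetic assuming standard 16:9 frames – the exact render resolution any given game uses is the developer’s call:

```python
# Rough pixel-count comparison of common 16:9 render resolutions.
# The "p" figures are vertical lines; width is derived from a 16:9 aspect ratio.
TARGET = 1920 * 1080  # a full 1080p frame

for height in (720, 792, 900, 1080):
    width = height * 16 // 9
    pixels = width * height
    print(f"{height}p: {width}x{height} = {pixels:,} pixels "
          f"({pixels / TARGET:.0%} of 1080p)")
```

Run the numbers and the gap is obvious: 792p works out to 1408×792, only about 54% of the pixels in a full 1080p frame, while 900p gets you to roughly 69%. “Inching towards 1080p” still leaves a lot of pixels on the table.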
Yes – increasing graphics power would have increased the price, but when developers face these kinds of hassles right out of the gate, things are only going to look sillier as time goes on. And while developers will eke more performance out of both consoles over their lifespans, I don’t think we’re going to see gains like we have in previous generations. I am confident in saying that 1080p at 30 FPS will still not be a common target for FPS titles by the time these consoles’ successors near launch. Meanwhile, PC hardware is going to continue to progress, and if a graphics card like NVIDIA’s GeForce GTX 750 Ti is any gauge, the future is very bright for the platform.
Dare I even mention that the PC version of Titanfall supports DirectX 11, 3D, and 4K?