A Look at NVIDIA’s Most Ambitious SoC Yet: Tegra K1

by Rob Williams on January 6, 2014 in Mobile, Trade Shows

NVIDIA impressed at CES 2013 with the unveiling of its Tegra 4 processor, and as is now abundantly clear, it wanted to blow people’s socks off at this year’s event. With Tegra K1, NVIDIA caters to gamers, developers, and believe it or not, car manufacturers. Let’s investigate and find out just what it brings to the table.

Going into this CES, I had anticipated that NVIDIA would announce its Tegra 4 successor – and as this article suggests, it has. What I didn’t expect, however, was to be overwhelmed by the sheer amount of impressive tech that NVIDIA CEO Jen-Hsun Huang talked about over the course of his 90-minute presentation.

Just the other night, I mentioned to someone that mobile processor advancements are hitting us at a pace as rapid as what we saw on the desktop side a decade ago. Actually, the current pace of mobile development is undoubtedly faster.

NVIDIA introduced its dual-core Tegra 2 in 2010 and its quad-core Tegra 3 in 2011. At last year’s CES, the company showed off an impressive evolution, Tegra 4, which featured a then-staggering 72 GPU cores. Given the naming scheme we’ve seen up to this point, it was assumed that “Tegra 5” would be the chip announced at this CES – but not so. As Jen-Hsun stated, Tegra K1 is a major advancement; to call it “Tegra 5” would be inappropriate. “It’s simply not linear.”

Tegra K1 192-core SoC - CES 2014

NVIDIA is touting Tegra K1 as a 192-core “Super Chip”, although that’s a little misleading. Like Tegra 4, K1 has four main processing cores (+1 low-powered “companion” core); it’s the GPU core count that has jumped, from Tegra 4’s 72 to 192 Kepler-based CUDA cores. Like Tegra 4i, K1 will be clocked at “up to” 2.3GHz. Its “Dual Denver” variant, which we’ll talk about a bit later, will be clocked at up to 2.5GHz.

K1’s CUDA cores are based on the same architecture as NVIDIA’s current GeForce GPUs: Kepler. While the CPU architecture (ARM) is dissimilar to what we use in our full-fledged desktops and (most) notebooks (x86), NVIDIA’s point is that its mobile GPU is now directly comparable to its desktop ones.

Tegra K1 Kepler - CES 2014

To help prove that point, Jen-Hsun hauled an amazing rabbit out of his hat. With a quote from Epic Games’ founder and CEO Tim Sweeney, we learned that Unreal Engine 4 will support Tegra K1. This is notable because UE4’s graphical capabilities are bleeding-edge, but more importantly, it’s being announced before the engine’s release.

Now, for the most important part: because Tegra K1 is Kepler-based, it supports full desktop OpenGL – it’s not limited to OpenGL ES. For end-users, this means better-looking, unrestricted games. For developers, it means games will be much easier to port from PC to mobile. This could prove to be an absolute game-changer.
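
To give a rough sense of what full OpenGL buys a developer, here’s a minimal, hypothetical sketch (mine, not NVIDIA’s) of a compute-shader dispatch – a desktop OpenGL 4.3 feature that the OpenGL ES 3.0 baseline of the time didn’t offer. It assumes an OpenGL 4.3+ context and a function loader such as GLEW have already been set up elsewhere.

```cpp
// Hypothetical sketch (not from NVIDIA's materials): doubling a buffer of
// floats on the GPU with a compute shader -- a full-OpenGL (4.3+) feature
// that OpenGL ES 3.0, the mobile baseline at the time, didn't include.
// Assumes a valid OpenGL 4.3 context and a loader (e.g. GLEW) already exist.
#include <GL/glew.h>

static const char* kComputeSrc = R"(
#version 430
layout(local_size_x = 64) in;
layout(std430, binding = 0) buffer Data { float values[]; };
void main() {
    values[gl_GlobalInvocationID.x] *= 2.0;
}
)";

GLuint BuildAndDispatch(GLuint ssbo, GLuint numGroups) {
    GLuint shader = glCreateShader(GL_COMPUTE_SHADER);
    glShaderSource(shader, 1, &kComputeSrc, nullptr);
    glCompileShader(shader);

    GLuint program = glCreateProgram();
    glAttachShader(program, shader);
    glLinkProgram(program);

    // Bind the storage buffer, run the job, then wait for the writes.
    glBindBufferBase(GL_SHADER_STORAGE_BUFFER, 0, ssbo);
    glUseProgram(program);
    glDispatchCompute(numGroups, 1, 1);
    glMemoryBarrier(GL_SHADER_STORAGE_BARRIER_BIT);
    return program;
}
```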

Following the announcement, Jen-Hsun made some interesting comparisons. In 2005 and 2006, the Xbox 360 and PlayStation 3 were released – both of them DirectX 9-class products. To achieve the same level of graphics on mobile (using UE3), we had to wait until 2010, when Epic released a demo for iOS. By contrast, UE4 will support Tegra K1 out of the gate, because the power is here now.

Tegra K1 Unreal Engine Comparison - CES 2014

With UE4 and K1’s graphical capabilities in general, NVIDIA promises that some important GPU technologies will finally be put to proper use on mobile in the near future. These include features like HDR and compute shaders, plus one that NVIDIA talked about a lot at its press conference: global illumination.

The goal of global illumination is to create realistic lighting in a scene, a task that can eat a lot of GPU resources but greatly enhances the look and feel of an environment. In a homebrewed demo called Meteor, NVIDIA showed GI off with the help of a futuristic space station. At some points, GI was turned on and off, and the difference was stark – with it turned on, the scene looked far more realistic, and thus more believable.
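
For the curious, here’s a heavily simplified, hypothetical sketch of the idea behind one-bounce diffuse GI – it isn’t NVIDIA’s algorithm, just an illustration of why the effect is so expensive: every shaded point has to gather light that has already bounced off other surfaces, on top of its direct lighting (occlusion between surfaces is ignored to keep the sketch short).

```cpp
// Hypothetical illustration only -- not NVIDIA's GI implementation.
// One-bounce diffuse GI: a point's final colour is its direct lighting
// plus light gathered from patches that were themselves lit directly
// (the "bounce"). The extra gather loop per shaded point is what makes
// GI so much more expensive than direct lighting alone.
#include <algorithm>
#include <cmath>
#include <vector>

struct Vec3 { float x, y, z; };

static float Dot(const Vec3& a, const Vec3& b) {
    return a.x * b.x + a.y * b.y + a.z * b.z;
}

struct SurfacePatch {   // a small piece of geometry that re-emits light
    Vec3  position;
    Vec3  normal;
    Vec3  directLight;  // radiance it received straight from the light source
    float area;
};

// 'directAtP' is the direct lighting at the shaded point, computed by the
// engine's usual lighting path; 'patches' are the bounce sources.
Vec3 ShadeWithOneBounceGI(const Vec3& p, const Vec3& n, const Vec3& directAtP,
                          const std::vector<SurfacePatch>& patches) {
    Vec3 total = directAtP;

    for (const SurfacePatch& s : patches) {
        Vec3  toPatch = { s.position.x - p.x, s.position.y - p.y, s.position.z - p.z };
        float dist2   = std::max(Dot(toPatch, toPatch), 1e-4f);
        float invLen  = 1.0f / std::sqrt(dist2);
        Vec3  dir     = { toPatch.x * invLen, toPatch.y * invLen, toPatch.z * invLen };

        // Both surfaces must face each other for light to transfer.
        float cosReceiver = std::max(Dot(n, dir), 0.0f);
        float cosSender   = std::max(-Dot(s.normal, dir), 0.0f);
        float weight      = cosReceiver * cosSender * s.area / dist2;

        total.x += s.directLight.x * weight;
        total.y += s.directLight.y * weight;
        total.z += s.directLight.z * weight;
    }
    return total;
}
```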

Tegra K1 Graphics Demo - CES 2014

With another demo, this time involving a standard living room, we could better gauge just how important global illumination is. As the screenshots below show, the room has realistic lighting, and the texture detail of the couch is something we’d expect to see on the desktop, not mobile (note how the light bounces off the couch’s leather).

NVIDIA Tegra K1 - Global Illumination Demo 1

NVIDIA Tegra K1 - Global Illumination Demo 2

As it so often does, Epic also had a demo to show off. In it, reflective surfaces were the main focus, all built around the use of global illumination. After the demo, I couldn’t help but think one thing: those same graphical features would be amazing on the desktop. Yet here, we’re dealing with mobile. Fortunately, this is one demo you can see for yourself:

Fans of Frozenbyte’s Trine series might find one other demo to be rather impressive:

Tegra K1 Trine 2 Running on Android - CES 2014

That’s right – Trine 2, one of the prettiest platformers ever, can run on K1. Given that the game isn’t even available for Android at the moment, I think this demo proves it’s en route – no doubt as a flagship launch title for K1.

I can’t help but share a slide showing how NVIDIA compares its K1 to Apple’s top-end A7 chip. On GFXBench 3.0’s Manhattan test, the K1 proved about 2.6x faster. Oh – and in case it wasn’t obvious before, the A7 doesn’t support Unreal Engine 4 (nice little jab there, NVIDIA).

Tegra K1 Performance Versus iPhone A7 - CES 2014

With one last slide, NVIDIA says that its Tegra K1 is far more powerful than the Xbox 360 and PlayStation 3 on the GPU and CPU front, while it draws about 1/20th of their total power.

NVIDIA Tegra K1 Comparison to Consoles

I mentioned earlier that NVIDIA will be offering a “Dual Denver” variant of the K1, and while it was discussed for just a couple of minutes, it’s without question one of the most important aspects of this presentation.

Denver is, in effect, NVIDIA’s first custom CPU core. It implements ARM’s v8 instruction set, which means it sports a native 64-bit architecture. NVIDIA has tuned Denver for super-fast performance on both the single- and multi-threaded fronts, and it one-ups the quad-core A15-based K1 by adding another 200MHz to the clock. The two K1s’ CPU cores can’t be compared on clock speed alone, however; in certain workloads, NVIDIA’s Denver could prove faster clock-for-clock than the A15.

Tegra K1 Models - CES 2014

I assume that NVIDIA is keeping the Denver-based K1 to a dual-core design because of its high-performance focus – a quad-core might mean a serious drain on battery life. That being the case, the Denver-based chip seems best suited to those who want the highest performance in single-threaded or lightly multi-threaded scenarios, whereas the quad-core A15 variant might be considered more well-rounded.

That said, the Dual Denver chip features a 7-way superscalar design – up from the A15’s 3-way – which could help make up for its lack of extra cores in certain scenarios. Further, the Denver design also gets a massive boost to L1 cache: 128K instruction + 64K data, up from 32K+32K.

To prove that Dual Denver wasn’t just a paper launch, a basic demo of an Android desktop was shown. Jen-Hsun mentioned that the chip being used was hot off the press, so the demo was kept simple as a result. The company did, however, launch CPU-Z to prove to the world that it was in fact Denver under the hood (the architecture was listed as ‘aarch64’).
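
CPU-Z aside, any Linux-based OS (Android included) exposes the same information through the standard uname() call; the hypothetical snippet below simply shows where a string like ‘aarch64’ – the kernel’s name for 64-bit ARM – comes from.

```cpp
// Hypothetical check, unrelated to NVIDIA's demo: on any Linux-based OS
// (Android included), the kernel reports its machine architecture via
// uname(2). On a 64-bit ARM chip such as Denver it comes back "aarch64" --
// the same string CPU-Z surfaced on stage.
#include <cstdio>
#include <cstring>
#include <sys/utsname.h>

int main() {
    utsname info{};
    if (uname(&info) != 0) {
        std::perror("uname");
        return 1;
    }
    std::printf("machine: %s\n", info.machine);
    if (std::strcmp(info.machine, "aarch64") == 0)
        std::printf("Running on a 64-bit ARM (ARMv8) kernel.\n");
    return 0;
}
```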

Jen-Hsun is a man who loves his cars, so it’s of little surprise that his company’s Tegra chip is going to find its way into some future models. A special chip has been designed for the task, called Tegra K1 VCM, and NVIDIA touts it as “supercomputing in your car”. While visual computing possibilities no doubt come to mind here, the chip could play other, very important roles as well. Take, for example, the image Tegra would process on the fly while you’re driving down a highway:

NVIDIA Tegra K1 VCM

Thanks to its processing power, K1 would be able to detect potential hazards (even subtle ones, such as driving too fast for the number of car lengths between you and the vehicle ahead) and determine the current speed limit by reading signs at the side of the road. This all ties into the autonomous vehicle, an area NVIDIA is keen to make a serious splash in.
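
As a toy illustration of the kind of judgment such a system has to make on the fly, consider the common “two-second rule” for following distance; the numbers and helper function below are entirely my own invention, not anything NVIDIA showed.

```cpp
// Toy example (my own numbers, not NVIDIA's): once the vision pipeline has
// estimated your speed and the gap to the car ahead, deciding whether you're
// following too closely is simple arithmetic. The common "two-second rule"
// is used as the threshold here.
#include <cstdio>

bool FollowingTooClosely(double speedKmh, double gapMeters) {
    const double speedMs = speedKmh / 3.6;   // km/h -> m/s
    const double safeGap = speedMs * 2.0;    // two seconds' worth of travel
    return gapMeters < safeGap;
}

int main() {
    // 110 km/h is ~30.6 m/s, so a safe gap is ~61 m; a 35 m gap
    // (roughly seven car lengths) is still too close at that speed.
    std::printf("Too close? %s\n", FollowingTooClosely(110.0, 35.0) ? "yes" : "no");
    return 0;
}
```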

For the more “fun” side of what Tegra K1 could bring to vehicles, NVIDIA demoed an instrument dash that looks real, but isn’t. It’d be displayed on a high-res screen and would be entirely customizable (mostly up to the vehicle vendor, though I’m sure the user would have good control as well).

NVIDIA Project Mercury

The entire kit would consist of the Tegra K1 chip and a custom motherboard, and then the display itself. During the demo, it was shown that every aspect of the dash could be customized. Different materials could be chosen for various parts, and in all, vendors could go to town with different designs.

NVIDIA Project Mercury Hardware

NVIDIA talked about more than just Tegra at its press conference, but as this article was focused entirely on it (and you can probably see why), we’ll cover the other stuff in our news section soon.

If there’s one takeaway here, it’s that NVIDIA is damned serious about its mobile hardware, and if there’s a market that would be well-served by it, it’ll go after it.
