Things are just about ready to start up at the GPU Technology Conference (GTC) in San Jose, and NVIDIA is already starting the announcements. Virtual Reality (VR) is not just an upcoming feature explosion in the gaming community, but also among professional creatives; and VR isn't just about graphics, either.
NVIDIA has documented upcoming technologies over the past year relating to VR, but this is the first time that developers will have free access to them. VRWorks is an SDK in much the same way as GameWorks, providing tools and plugin support for a variety of processing techniques relating to VR. The two new features that are now documented in the SDK are VR audio and 360° video.
VR Audio
Ordinarily, when looking at VR, graphics is the first thing to come to mind, but in simulated environments, visuals are only half the equation for immersion; audio is just as important. Audio processing is largely forgotten about in games, for the simple reason that it works so well you forget there is any environment processing going on in the first place.
Pitch shifting, spatial 3D mapping, real-time effects and the like all happen in the background, and you don't even notice. Each sound comes across as a pre-recorded sample played on cue. Quite often, it's the same sample played with various filters, even adapting to the terrain with reverb, and this is all handled by the game engine in real time.
VRWorks Audio goes beyond this simple filtering layer and introduces material effects into the process. Much like ray tracing light, VRWorks Audio traces the path of the audio signal and distorts it based on the surfaces and materials it reflects off, including Doppler effects for moving objects. This can all be calculated within a ray-traced engine using NVIDIA's OptiX ray tracing for live renders of buildings, or inside a game with the help of a plugin for Epic's Unreal Engine 4 (available on GitHub).
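To get a feel for the idea, here is a toy sketch (not NVIDIA's actual SDK or API) of tracing a single audio path: energy falls off with distance, each bounce off a surface loses a fraction of energy according to a made-up material absorption table, and the path length sets the arrival delay. All names and coefficients are illustrative assumptions.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air

# Hypothetical absorption coefficients (fraction of energy lost per
# reflection) -- illustrative values only, not measured material data.
ABSORPTION = {"concrete": 0.02, "carpet": 0.60, "glass": 0.05}

def path_contribution(path_points, materials):
    """Energy and arrival delay for one traced audio path.

    path_points: list of (x, y) points from source to listener.
    materials: material name at each intermediate reflection point.
    """
    length = sum(
        math.dist(a, b) for a, b in zip(path_points, path_points[1:])
    )
    energy = 1.0 / max(length, 1e-6) ** 2   # inverse-square falloff
    for m in materials:
        energy *= 1.0 - ABSORPTION[m]       # lose energy at each bounce
    delay = length / SPEED_OF_SOUND         # seconds until arrival
    return energy, delay

# Direct path: source at (0, 0), listener at (3, 4) -> 5 m, no bounces.
direct = path_contribution([(0, 0), (3, 4)], [])

# First-order reflection off a concrete wall at (0, 8) -> 13 m total.
reflected = path_contribution([(0, 0), (0, 8), (3, 4)], ["concrete"])
```

A real implementation traces thousands of such paths against the scene geometry and sums them into an impulse response; the reflected path above arrives later and quieter than the direct one, which is exactly the echo character the material model is meant to capture.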
4K 360° Video
Games are not the only point of interest when it comes to VR content; video is, too. Not just stereoscopic display with 3D depth, but also immersive video which you can freely explore and look around in. The problem is that capturing video in an all-encompassing 360-degree feed is, well, very difficult.
On the hardware side of things, multiple cameras are required to capture the different angles; anything up to 32 of them for a full image with limited distortion. The second problem is aligning it all, and this is a lot more complex than you'd believe.
Ever taken a panoramic shot with your phone or camera? You'll often see small distortions where each photo is blended into the others: objects don't align, colors are off, and edges come out soft and blurry. Then there is the warping needed to form a sphere out of these flat photos. These tasks are computationally intensive as they are, but if you intend to do all that with eight or more 4K video cameras in stereo at 30-60 FPS, you will need a lot of silicon.
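One small piece of that stitching work is cross-fading the overlap between two adjacent camera images so the seam doesn't show. The sketch below (a deliberately simplified, hypothetical example, not code from NVIDIA's SDK) feathers two overlapping 1-D scanlines with a linear weight ramp; a real stitcher does this per pixel row, in 2-D, on the GPU, after lens correction and alignment.

```python
def feather_blend(left, right, overlap):
    """Stitch two horizontally overlapping 1-D scanlines.

    left, right: lists of pixel intensities; the last `overlap` pixels
    of `left` cover the same scene as the first `overlap` pixels of
    `right`. Returns the stitched scanline with a linear cross-fade
    across the overlap, hiding exposure mismatches between cameras.
    """
    out = left[:-overlap]
    for i in range(overlap):
        w = (i + 1) / (overlap + 1)  # weight ramps from left to right
        out.append((1 - w) * left[len(left) - overlap + i] + w * right[i])
    out.extend(right[overlap:])
    return out

# Two "camera" scanlines whose shared region disagrees in exposure:
a = [10, 10, 10, 12, 12]
b = [14, 14, 20, 20, 20]
stitched = feather_blend(a, b, overlap=2)
```

The blended pixels land between the two cameras' values instead of jumping from 12 to 14 at a hard seam, which is why feathered overlaps look soft rather than torn.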
NVIDIA is opening up its 360 Video SDK so that video professionals will have a fixed framework and standard to start playing with, saving a lot of time otherwise spent designing an in-house solution. There is currently a demonstration going on at GTC, if you are attending, showing two NVIDIA Quadro P6000 GPUs stitching together 8x 4K cameras in stereo using Z CAM's V1 PRO VR camera system.
The VRWorks 360 Video SDK for mono will be available as a public beta from the VRWorks website. 360 Video for stereo will come some time later (no date given just yet).
Jamie has been abusing computers since he was a little lad. What began as a curiosity quickly turned into an obsession. As senior editor for Techgage, Jamie handles content publishing, web development, news and product reviews, with a focus on peripherals, audio, networking, and full systems.