Rasterization or ray tracing? It’s a somewhat heated debate right now, because while most of the industry has no intention of going the ray tracing route, Intel is pushing for widespread adoption once its Larrabee platform launches. Although ray tracing arguably delivers better image quality in games and other 3D applications, it’s a real performance hog.
Because of Larrabee’s many-core nature, however, Intel believes moving game renderers over to ray tracing would be feasible thanks to the sheer number of cores available… cores that are comparatively slow at traditional DirectX and OpenGL rasterization. We know Intel’s stance, but what about key industry figures?
PC Perspective sat down with Cevat Yerli, CEO of Crytek, to get some of his opinions. After reading the interview, I think it’s quite clear… no one is gung-ho about moving to ray tracing, and Cevat doesn’t see it becoming a reality for at least another five years. The biggest question right now is whether we even need ray tracing at all, or whether it matters only to Intel, since it’s Intel’s product that will lag behind in driving current rendering methods.
So far I haven’t seen a compelling example for using pure classical ray tracing. Part of the problem is that the theoretical argument is derived from looking at the performance of static geometry under certain assumptions for what sort of lighting environment and material types you have in the scene. These examples often compare a ray tracer using a sophisticated acceleration structure for storing static polygon data against a trivial implementation of a rasterization engine which draws every polygon in the scene, which produces an unfair advantage to the ray tracer.
Source: PC Perspective
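
Yerli’s point about skewed benchmarks can be made concrete with a toy cost model. The C++ sketch below simply counts the work each setup performs; every number in it (scene size, BVH depth, culling ratio) is an illustrative assumption of mine, not a measurement from any real engine or from the interview.

    // Toy cost model, not a real renderer: it only counts primitives touched,
    // to show why "accelerated ray tracer vs. unaccelerated rasterizer" is a
    // skewed comparison. All numbers below are illustrative assumptions.
    #include <cmath>
    #include <cstdio>

    int main() {
        const long long numTriangles = 1000000;         // assumed scene size
        const long long numPixels    = 1920LL * 1080LL; // one 1080p frame

        // Strawman rasterizer: submits every triangle, visible or not.
        const long long trivialRasterWork = numTriangles;

        // Ray tracer with a BVH over static geometry: each primary ray
        // visits roughly log2(N) nodes rather than testing all N triangles.
        const long long bvhDepth =
            (long long)std::ceil(std::log2((double)numTriangles));
        const long long rayTracerWork = numPixels * bvhDepth;

        // Fairer rasterizer: given the same spatial hierarchy for frustum
        // and occlusion culling, only a fraction of the triangles is ever
        // submitted (10% is an assumed figure).
        const long long culledRasterWork = numTriangles / 10;

        std::printf("trivial rasterizer: %lld triangles submitted\n",
                    trivialRasterWork);
        std::printf("BVH ray tracer:     %lld node visits (primary rays)\n",
                    rayTracerWork);
        std::printf("culled rasterizer:  %lld triangles submitted\n",
                    culledRasterWork);
        return 0;
    }

The takeaway is the methodology rather than the numbers: once the rasterizer is granted the same acceleration structure the ray tracer enjoys, the head-to-head looks very different, which is exactly the asymmetry Yerli is calling out.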