Sunday, August 9, 2009

The race for real-time ray tracing (Siggraph 2009)

This year's Siggraph showed that many of the big 3D companies are heavily involved in realtime and interactive raytracing research.

Nvidia: OptiX, mental images (RealityServer, iray)

Intel: LRB (Larrabee)

AMD: nothing AMD-specific announced yet

Caustic Graphics: CausticRT, BrazilRT, integration with 3ds Max Design 2010, LightWork Design, Robert McNeel & Associates, Realtime Technology AG (RTT AG), Right Hemisphere and Splutterfish

Then there was also this extremely impressive demonstration of V-Ray RT for GPUs, which caught many by surprise:

video: https://www.youtube.com/watch?v=DJLCpS107jg

and http://www.spot3d.com/vrayrt/gpu20090725.mov

V-Ray RT was running on a GTX 285 using CUDA (a port to OpenCL is planned): a Cornell box with 5 bounces of physically correct global illumination rendered at 40 fps, and an 800k-polygon Colosseum with 5 GI bounces at around 4 fps (with progressive rendering). Once it is ported to OpenCL, GPUs from AMD and Intel will be able to run it as well.
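
To make "5 GI bounces" and "progressive rendering" a bit more concrete, here is a minimal CPU sketch in plain C++ of diffuse path tracing with a fixed bounce limit and a progressively refined running average. The toy scene, numbers and helper names are purely my own illustration, not V-Ray RT's actual implementation (which runs this kind of loop as a CUDA kernel, one path per pixel per frame):

```cpp
// Minimal CPU sketch (not V-Ray RT code): diffuse path tracing with a fixed
// bounce limit and progressive accumulation. Toy scene: floor, ball, area light.
#include <cmath>
#include <cstdio>
#include <random>

const double PI = 3.14159265358979323846;

struct Vec {
    double x, y, z;
    Vec(double a = 0, double b = 0, double c = 0) : x(a), y(b), z(c) {}
    Vec operator+(const Vec& b) const { return Vec(x + b.x, y + b.y, z + b.z); }
    Vec operator-(const Vec& b) const { return Vec(x - b.x, y - b.y, z - b.z); }
    Vec operator*(double s) const { return Vec(x * s, y * s, z * s); }
    Vec mult(const Vec& b) const { return Vec(x * b.x, y * b.y, z * b.z); }
    double dot(const Vec& b) const { return x * b.x + y * b.y + z * b.z; }
    Vec cross(const Vec& b) const {
        return Vec(y * b.z - z * b.y, z * b.x - x * b.z, x * b.y - y * b.x);
    }
    Vec norm() const { return *this * (1.0 / std::sqrt(dot(*this))); }
};

struct Sphere {
    double rad; Vec pos, emission, albedo;
    double intersect(const Vec& o, const Vec& d) const {  // 0 means "missed"
        Vec op = pos - o;
        double b = op.dot(d), det = b * b - op.dot(op) + rad * rad;
        if (det < 0) return 0;
        det = std::sqrt(det);
        if (b - det > 1e-4) return b - det;
        return (b + det > 1e-4) ? b + det : 0;
    }
};

// Grey floor (huge sphere), white diffuse ball, spherical area light overhead.
const Sphere scene[3] = {
    {1e5, Vec(0, -1e5, 0), Vec(),           Vec(0.75, 0.75, 0.75)},
    {1.0, Vec(0, 1, 0),    Vec(),           Vec(0.9, 0.9, 0.9)},
    {5.0, Vec(0, 20, 0),   Vec(12, 12, 12), Vec()},
};

// One light path with at most MAX_BOUNCES diffuse bounces (the demo used 5).
Vec radiance(Vec o, Vec d, std::mt19937& rng) {
    std::uniform_real_distribution<double> uni(0.0, 1.0);
    const int MAX_BOUNCES = 5;
    Vec throughput(1, 1, 1), result(0, 0, 0);
    for (int bounce = 0; bounce <= MAX_BOUNCES; ++bounce) {
        double t = 1e20; int id = -1;
        for (int i = 0; i < 3; ++i) {                     // brute-force intersection
            double s = scene[i].intersect(o, d);
            if (s > 0 && s < t) { t = s; id = i; }
        }
        if (id < 0) break;                                // ray escaped the scene
        const Sphere& s = scene[id];
        Vec p = o + d * t, n = (p - s.pos).norm();
        Vec nl = n.dot(d) < 0 ? n : n * -1;
        result = result + throughput.mult(s.emission);    // pick up emitted light
        throughput = throughput.mult(s.albedo);
        // Cosine-weighted hemisphere sample around the surface normal.
        double r1 = 2 * PI * uni(rng), r2 = uni(rng), r2s = std::sqrt(r2);
        Vec w = nl;
        Vec u = ((std::fabs(w.x) > 0.1 ? Vec(0, 1, 0) : Vec(1, 0, 0)).cross(w)).norm();
        Vec v = w.cross(u);
        d = (u * (std::cos(r1) * r2s) + v * (std::sin(r1) * r2s) + w * std::sqrt(1 - r2)).norm();
        o = p + nl * 1e-4;                                // offset to avoid self-hits
    }
    return result;
}

int main() {
    // Progressive refinement: every "frame" adds one sample and the display
    // shows the running average, so the noisy estimate converges while you watch.
    std::mt19937 rng(42);
    Vec avg;
    for (int frame = 1; frame <= 64; ++frame) {
        Vec sample = radiance(Vec(0, 1, 6), Vec(0, 0, -1), rng);
        avg = avg + (sample - avg) * (1.0 / frame);       // incremental mean
        if (frame % 16 == 0)
            std::printf("frame %2d: estimate = (%.3f, %.3f, %.3f)\n",
                        frame, avg.x, avg.y, avg.z);
    }
    return 0;
}
```

The point of the progressive part is simply that every frame adds another noisy sample and the displayed image is the running average, so the picture starts out grainy and converges while you move the camera or just wait.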

The Chaos Group and mental images presentations indicate that rendering is going to move from the CPU to the GPU very soon and will become increasingly realtime.

Caustic Graphics would like to see their cards end up in a next-gen console, since they target gaming as well. During their presentation at the Autodesk booth, they also mentioned cloud gaming as an option: put a lot of Caustic accelerators in a server and create a fully raytraced game rendered server-side. In fact, Chaos Group could do the same: use their V-Ray RT GPU tech in a cloud rendering environment and make a photorealistic game doing realtime raytracing on a bunch of GPUs (V-Ray RT GPU supports multi-GPU setups and distributed rendering), with fully accurate and dynamic GI rendered on the fly. And if they don't do it, I'm sure mental images will with iray and RealityServer.

There were also some interesting presentations about the future of realtime graphics and alternative rendering pipelines, all of which suggested a greater focus on ray tracing and REYES and indicated that pure rasterization will become less important in the not-so-distant future.

On the game development front, Tim Sweeney (and of course Carmack with his sparse voxel octree work) is exploring ray tracing/ray casting for his next-generation engines:
http://news.cnet.com/8301-13512_3-10306215-23.html
http://graphics.cs.williams.edu/archive/SweeneyHPG2009/TimHPG2009.pdf

According to Sweeney, next-generation rendering pipelines could include a mix of ray tracing, REYES, voxel raycasting and other volume rendering techniques, all implemented in a GPGPU language: REYES for characters and other dynamic objects, voxels for the static environment and foliage, and ray tracing for reflection, refraction and maybe some form of global illumination.
