
Cloud Rendering

Time: May - December 2010
Type: Rendering Engine, Cloud Service

This project was carried out under the supervision of Prof. Adrien Treuille at Carnegie Mellon University.

The long-term goal of this project is to build a powerful rendering server that can render high-quality models with millions of polygons and stream the rendered gameplay frames to clients on low-end devices such as an iPhone or a netbook. Clients do not even need to install the game to play it, as long as they can receive the video stream and decompress it in real time.

In summer 2010 I implemented an image-based rendering algorithm called light field rendering in GLSL, and then ported it to CUDA in fall 2010. We developed a client/server pipeline system to render the scene. Before I left the CMU Graphics Lab, it could render 300 frames per second on the CMU cluster network.
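For reference, below is a minimal sketch of two-plane light field resampling in CUDA. It is not the project's actual renderer: the 16x16 camera grid, the packed RGBA image layout, and the restriction of the virtual eye to the camera plane are all assumptions made purely for illustration.

// Minimal sketch of two-plane light field resampling in CUDA.
// Assumptions (not from the original project): the light field is stored as a
// grid of CAM_U x CAM_V reference images, each IMG_W x IMG_H RGBA8 pixels,
// packed into one device buffer. The virtual eye is constrained to the camera
// (u,v) plane, so every output pixel blends the four nearest reference images.

#include <cuda_runtime.h>

#define CAM_U 16      // cameras along u
#define CAM_V 16      // cameras along v
#define IMG_W 256     // reference image width
#define IMG_H 256     // reference image height

__device__ uchar4 sampleImage(const uchar4* lf, int cu, int cv, float s, float t)
{
    // Fetch the nearest texel (s,t) of reference camera (cu,cv).
    int x = min(max(int(s * IMG_W), 0), IMG_W - 1);
    int y = min(max(int(t * IMG_H), 0), IMG_H - 1);
    size_t base = ((size_t)cv * CAM_U + cu) * IMG_W * IMG_H;
    return lf[base + (size_t)y * IMG_W + x];
}

__global__ void renderLightField(const uchar4* lf, uchar4* out,
                                 int outW, int outH,
                                 float viewU, float viewV)  // eye position on the camera plane, in [0,1]
{
    int px = blockIdx.x * blockDim.x + threadIdx.x;
    int py = blockIdx.y * blockDim.y + threadIdx.y;
    if (px >= outW || py >= outH) return;

    // (s,t): where this pixel's ray hits the focal plane (normalized).
    float s = (px + 0.5f) / outW;
    float t = (py + 0.5f) / outH;

    // (u,v): continuous camera-plane coordinate; blend the 4 nearest cameras.
    float u = viewU * (CAM_U - 1);
    float v = viewV * (CAM_V - 1);
    int u0 = int(u), v0 = int(v);
    int u1 = min(u0 + 1, CAM_U - 1), v1 = min(v0 + 1, CAM_V - 1);
    float fu = u - u0, fv = v - v0;

    uchar4 c00 = sampleImage(lf, u0, v0, s, t);
    uchar4 c10 = sampleImage(lf, u1, v0, s, t);
    uchar4 c01 = sampleImage(lf, u0, v1, s, t);
    uchar4 c11 = sampleImage(lf, u1, v1, s, t);

    // Bilinear blend over the camera plane.
    float r = (c00.x * (1-fu) + c10.x * fu) * (1-fv) + (c01.x * (1-fu) + c11.x * fu) * fv;
    float g = (c00.y * (1-fu) + c10.y * fu) * (1-fv) + (c01.y * (1-fu) + c11.y * fu) * fv;
    float b = (c00.z * (1-fu) + c10.z * fu) * (1-fv) + (c01.z * (1-fu) + c11.z * fu) * fv;
    out[(size_t)py * outW + px] =
        make_uchar4((unsigned char)r, (unsigned char)g, (unsigned char)b, 255);
}

In a real system the reference images would live in texture memory for hardware filtering, and the (u,v,s,t) ray parameterization would be derived from the full camera matrix rather than a fixed focal window.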

Also, in order to save GPU memory, we implemented an image compression/decompression algorithm in CUDA. Compression is pre-computed offline; at runtime the system decompresses the images for rendering. This lets us load compressed image data directly onto the GPU, saving almost 90% of the original RGBA texture size in GPU memory.
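As an illustration of GPU-side decompression (the project's actual codec is not public), here is a hypothetical CUDA kernel for a DXT1-style block format: each 4x4 pixel block stores two RGB565 endpoint colors plus 2-bit per-texel indices, 8 bytes instead of 64 bytes of raw RGBA8, which is roughly in line with the ~90% saving mentioned above.

// Hypothetical runtime decompression kernel; the codec and layout are
// assumptions for illustration, not the project's actual format.
// Assumes width and height are multiples of 4.

#include <cuda_runtime.h>
#include <cstdint>

struct Block {            // one compressed 4x4 block (8 bytes)
    uint16_t c0, c1;      // RGB565 endpoint colors
    uint32_t indices;     // 2 bits per texel, row-major
};

__device__ uchar4 decode565(uint16_t c)
{
    // Expand RGB565 to 8-bit channels.
    unsigned char r = (c >> 11) & 0x1F, g = (c >> 5) & 0x3F, b = c & 0x1F;
    return make_uchar4((r << 3) | (r >> 2), (g << 2) | (g >> 4), (b << 3) | (b >> 2), 255);
}

__global__ void decompressBlocks(const Block* blocks, uchar4* out, int width, int height)
{
    // One thread per output pixel.
    int x = blockIdx.x * blockDim.x + threadIdx.x;
    int y = blockIdx.y * blockDim.y + threadIdx.y;
    if (x >= width || y >= height) return;

    int bx = x / 4, by = y / 4;
    const Block blk = blocks[by * (width / 4) + bx];

    uchar4 e0 = decode565(blk.c0);
    uchar4 e1 = decode565(blk.c1);

    // The 2-bit index selects a blend weight between the two endpoints:
    // 0 -> e0, 1 -> e1, 2 -> 2/3*e0 + 1/3*e1, 3 -> 1/3*e0 + 2/3*e1.
    int texel = (y % 4) * 4 + (x % 4);
    int idx = (blk.indices >> (2 * texel)) & 0x3;
    const float w0[4] = {1.0f, 0.0f, 2.0f / 3.0f, 1.0f / 3.0f};
    float w = w0[idx];

    out[(size_t)y * width + x] = make_uchar4(
        (unsigned char)(e0.x * w + e1.x * (1.0f - w)),
        (unsigned char)(e0.y * w + e1.y * (1.0f - w)),
        (unsigned char)(e0.z * w + e1.z * (1.0f - w)),
        255);
}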

Since the goal of this project is to launch a company, I cannot share too many details about it here. If you want to know more, please contact Prof. Treuille, or ask me during an interview :P

An early rendering result: an 800K-polygon tree seen through a triangular window

A rendering result of a chestnut tree with grass

Some resources about light field rendering: