Hanspeter Pfister, MERL - Mitsubishi Electric Research Laboratories
Matthias Zwicker, ETH Zürich, Switzerland
Jeroen van Baar, MERL - Mitsubishi Electric Research Laboratories
Markus Gross, ETH Zürich, Switzerland
Surfels: Surface Elements as Rendering Primitives
H. Pfister, M. Zwicker, J. van Baar, M. Gross, SIGGRAPH 2000
A Survey and Classification of Real Time Rendering Methods
M. Zwicker, M. Gross, H. Pfister
Technical Report No. 332, Computer Science Department, ETH Zürich, 1999.
The core objective of this project is to combine volume, image, and geometry rendering in one unified rendering pipeline. From a modeling point of view, the surfel representation is a discretization of the geometry that reduces the object representation to the essentials needed for rendering. In that sense, a surfel is a candidate for what has been called the lingua franca of rendering. Surfels will allow us to combine image-based rendering, adaptive curved-surface tessellation, constructive solid geometry, unstructured volume data, particle systems, and so on in one common rendering framework.
The surfel rendering pipeline complements the existing graphics pipeline and is not intended to replace it. It is positioned between conventional geometry-based approaches and image-based rendering and, like both, trades memory overhead for rendering performance and quality. The focus of this work has been interactive 3D applications. Surfels work well for models with rich, organic shapes or high surface detail and for applications where preprocessing is not an issue. These qualities make them ideal for interactive games.
In a preprocessing step, we sample the surfaces of complex geometric models along three orthographic views. An octree-based surfel representation of the geometric object is then computed. At the same time, we perform computation-intensive calculations such as texture, bump, or displacement mapping. Surfel attributes include depth, texture color, and normal, among others. During rendering, a hierarchical forward warping algorithm projects surfels to the screen. By moving rasterization and texturing from the core rendering pipeline to the preprocessing step, we dramatically reduce the rendering cost. Storing normals, prefiltered textures, and other per-surfel data enables us to build high-quality rendering algorithms. Shading and transformations are applied per surfel, enabling Phong illumination, bump and displacement mapping, and other advanced rendering features. For example, we implemented environment mapping and a painterly surfel rendering algorithm running at interactive frame rates. Our hierarchical forward projection algorithm also lets us estimate the surfel density per output pixel for speed-quality tradeoffs.
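The per-surfel projection step can be pictured as follows. This is a deliberately simplified stand-in for the hierarchical forward warping described above: it projects every surfel individually with a combined view-projection matrix and resolves visibility with a z-buffer, whereas the actual system warps blocks of the octree incrementally. All names and the array layout are illustrative, not taken from the original implementation.

```python
import numpy as np

def warp_surfels(positions, colors, view_proj, width, height):
    """Forward-warp surfels (world-space points with color) to the screen.

    Simplified sketch: direct per-surfel projection plus a z-buffer,
    standing in for the hierarchical block warping of the real pipeline.
    """
    n = positions.shape[0]
    # Homogeneous transform to clip space, then perspective divide.
    hom = np.hstack([positions, np.ones((n, 1))]) @ view_proj.T
    ndc = hom[:, :3] / hom[:, 3:4]

    # NDC in [-1, 1] -> integer pixel coordinates (y flipped).
    px = ((ndc[:, 0] + 1.0) * 0.5 * width).astype(int)
    py = ((1.0 - ndc[:, 1]) * 0.5 * height).astype(int)

    frame = np.zeros((height, width, 3))
    zbuf = np.full((height, width), np.inf)
    for i in range(n):
        x, y = px[i], py[i]
        if 0 <= x < width and 0 <= y < height and ndc[i, 2] < zbuf[y, x]:
            zbuf[y, x] = ndc[i, 2]     # keep the nearest surfel per pixel
            frame[y, x] = colors[i]
    return frame, zbuf
```

Because shading data (normal, prefiltered color) travels with each surfel, per-surfel lighting could be applied either before the loop or at the surviving z-buffer entries.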
Hole detection and image reconstruction. a) Surfel object with holes. b) Hole detection (hole pixels in green). c) Image reconstruction with a Gaussian filter.
Tilted checker plane. Reconstruction filter: a) Nearest neighbor. b) Gaussian filter. c) Supersampling.
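One way to picture the Gaussian reconstruction filter from the figures above is as hole filling: pixels flagged as holes receive a Gaussian-weighted average of their surfel-covered neighbors. The sketch below is a minimal version of that idea; the function name, parameters, and the simple boolean hole mask are illustrative, not the paper's actual reconstruction code.

```python
import numpy as np

def fill_holes_gaussian(frame, hole_mask, radius=2, sigma=1.0):
    """Fill hole pixels with a Gaussian-weighted average of nearby
    non-hole pixels. Illustrative stand-in for Gaussian image
    reconstruction; `radius` and `sigma` are assumed parameters."""
    h, w, _ = frame.shape
    ys, xs = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    weights = np.exp(-(xs**2 + ys**2) / (2.0 * sigma**2))

    out = frame.copy()
    for y, x in zip(*np.nonzero(hole_mask)):
        acc, wsum = np.zeros(3), 0.0
        for dy in range(-radius, radius + 1):
            for dx in range(-radius, radius + 1):
                ny, nx = y + dy, x + dx
                # Only surfel-covered (non-hole) pixels contribute.
                if 0 <= ny < h and 0 <= nx < w and not hole_mask[ny, nx]:
                    wgt = weights[dy + radius, dx + radius]
                    acc += wgt * frame[ny, nx]
                    wsum += wgt
        if wsum > 0.0:
            out[y, x] = acc / wsum
    return out
```

Nearest-neighbor reconstruction would instead copy the single closest covered pixel, and supersampling avoids most holes up front by rendering at higher resolution, which matches the quality ordering visible in the checker-plane figure.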
Animations of rigid objects with 2x2 supersampling. 3 LODs and 3 surfel mipmap levels per object; objects are 3-to-1 reduced. Number of surfels per object in parentheses: a) Wasp (204k) b) Chameleon (80k) c) Flak (2.8M) d) Cab (539k) e) Salamander (70k) f) Displacement mapped donut (2M).
Surfel object LODs. Surfel blocks from different levels of the LDC tree are colored differently. Red = level 0 (highest resolution), green = level 1, blue = level 2 (lowest resolution).
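The level-of-detail choice shown above can be sketched as a density heuristic: pick the coarsest LDC-tree level whose projected surfel spacing still stays near one pixel, assuming each coarser level doubles the spacing. This is a rough, hypothetical heuristic with illustrative names, not the paper's actual block-selection test.

```python
import math

def choose_lod_level(spacing0_world, distance, focal_px, max_level):
    """Pick an LDC-tree level (0 = highest resolution) so the projected
    surfel spacing stays near one pixel. Assumes each coarser level
    doubles the surfel spacing; all parameters are illustrative."""
    # Projected spacing of level-0 surfels, in pixels.
    p = focal_px * spacing0_world / distance
    if p >= 1.0:
        return 0   # surfels already span a pixel or more: full resolution
    # Coarsen while projected spacing stays under one pixel.
    return min(max_level, int(math.floor(math.log2(1.0 / p))))
```

Distant blocks thus land in the green and blue (coarser) levels of the figure, while nearby blocks stay red (level 0).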
Surfel mipmap levels. Surfels in different levels of the surfel mipmap are color coded. Red = mipmap level 0 (highest texture resolution), green = mipmap level 1, blue = mipmap level 2 (lowest texture resolution). Notice the linear interpolation between levels.
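The linear interpolation between mipmap levels noted in the caption can be sketched as a per-surfel blend of the two adjacent prefiltered texture colors, driven by a fractional level. Names and the list-of-colors representation are illustrative assumptions.

```python
import math

def blend_mipmap_levels(colors_per_level, level):
    """Linearly interpolate a surfel's texture color between two
    adjacent mipmap levels. `level` is fractional (e.g. 1.3);
    `colors_per_level` holds one RGB tuple per prefiltered level.
    Illustrative sketch, not the original implementation."""
    lo = int(math.floor(level))
    hi = min(lo + 1, len(colors_per_level) - 1)
    t = level - lo
    c0, c1 = colors_per_level[lo], colors_per_level[hi]
    return tuple((1.0 - t) * a + t * b for a, b in zip(c0, c1))
```

A fractional level near 0 keeps the sharpest prefiltered color (red-coded in the figure), while values between integers produce the smooth transitions visible between the color-coded bands.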