Real-World Test Sequences


Introduction

Here, we provide interested researchers with a real-world multi-view test data set captured in the blue-c portals. The data is meant for testing reconstruction/rendering algorithms on multi-view video sequences that have to cope with noise and other capturing-device errors. It can also be used for testing streaming and compression techniques, since we additionally provide depth maps computed by a shape-from-silhouettes method.
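As a hint at how the provided data fit together, the following is a minimal shape-from-silhouettes (visual hull) sketch in Python. It is only a generic illustration, not the implementation used to produce the provided depth maps; the masks, projections, bounding box, and resolution are hypothetical placeholders that have to be filled from the data set.

    # Minimal shape-from-silhouettes (visual hull) sketch. `masks` is assumed to
    # be a list of binary silhouette images (480 x 640) and `projections` a list
    # of matching 3x4 projection matrices in a common world frame -- both are
    # placeholders, not arrays shipped with the data set.
    import numpy as np

    def carve_visual_hull(masks, projections, bbox_min, bbox_max, resolution=64):
        """Return a boolean voxel grid: True where all silhouettes agree."""
        # Regular grid of voxel centres inside a bounding box around the scene.
        axes = [np.linspace(lo, hi, resolution) for lo, hi in zip(bbox_min, bbox_max)]
        X, Y, Z = np.meshgrid(*axes, indexing="ij")
        points = np.stack([X, Y, Z, np.ones_like(X)], axis=-1).reshape(-1, 4)

        inside = np.ones(points.shape[0], dtype=bool)
        for mask, P in zip(masks, projections):
            h, w = mask.shape
            proj = points @ P.T                    # homogeneous image coordinates
            uv = proj[:, :2] / proj[:, 2:3]        # perspective divide
            u = np.round(uv[:, 0]).astype(int)
            v = np.round(uv[:, 1]).astype(int)
            visible = (u >= 0) & (u < w) & (v >= 0) & (v < h)
            hit = np.zeros_like(inside)
            hit[visible] = mask[v[visible], u[visible]] > 0
            inside &= hit                          # carve away voxels outside any silhouette
        return inside.reshape(resolution, resolution, resolution)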

The set consists of frame sequences captured from 16 different camera views located in a hemisphere around the scene, background images, segmentation masks, depth images, and camera parameters. The recorded scenes contain different humans performing various motions, ranging from simple and slow movements to the kicks and punches of a Kung-Fu fighter. The sequences have different lengths and were recorded at different acquisition frame rates. Each frame is saved as a 640 x 480 image. Below are illustrations of the blue-c portal at ETH Hönggerberg where all sequences were captured; the numbers indicate the camera nodes (i.e., arctic1 - arctic16).

[Illustrations of the portal: bird's eye view, side views, QuickTime VR inside-out, and a half-transparent view of the inside of the cave.]

Download

Test sequences are available upon request. Please write to wwwgraph@inf.ethz.ch for more information.

Camera Calibration

Calibration data from our multi-camera self-calibration procedure can be downloaded below. You can also perform your own calibration by downloading the calibration sequence. The parameters are stored as follows:

Files (e.g., "arctic3.cal") and their contents:

xxx.cal: Projection matrix (3x3) & camera center (3x1) for camera xxx, including extrinsic & intrinsic parameters. Can be used directly to calculate rays from the camera center through pixels in the image plane.

xxx.mat: 3x4 Euclidean projection matrix for camera xxx, encoding all linear parameters. You can use the function ./MultiCamValidation/CoreFunctions/P2KRtC.m of the multi-camera self-calibration package to decompose it into the intrinsic parameters (3x3 K matrix), the rotation matrix R and translation vector t, and the position of the camera center C in the common world coordinate system (a rough Python equivalent is sketched below).

xxx.rad: Parameters needed for undoing the radial distortion of camera xxx: the intrinsic parameters (3x3 K matrix) and the image distortion coefficients (radial distortions only!) as a 4x1 vector kc. They should be directly usable with OpenCV's undistortion functions (see the sketch after the download link below).
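For users who do not want to run the MATLAB decomposition, here is a rough Python sketch of what P2KRtC.m does (an RQ decomposition of the 3x4 matrix into K, R, t, and C), together with the ray computation that the xxx.cal files are intended for. The parsing of the files is not shown, and the assumption that the 3x3 matrix in a .cal file equals K*R is ours.

    # Rough Python equivalent of the P2KRtC.m decomposition and of the ray
    # computation the xxx.cal files are meant for. `P` is assumed to be a 3x4
    # NumPy array parsed from an xxx.mat file, and `M`, `C` the 3x3 matrix and
    # camera centre from an xxx.cal file (the parsers are not shown).
    import numpy as np
    from scipy.linalg import rq

    def decompose_projection(P):
        """Split P = K [R | t] into intrinsics K, rotation R, translation t, centre C."""
        K, R = rq(P[:, :3])                  # RQ factorisation of the left 3x3 block
        S = np.diag(np.sign(np.diag(K)))     # RQ is unique only up to sign flips
        K, R = K @ S, S @ R
        scale = K[2, 2]
        K = K / scale                        # normalise so that K[2, 2] == 1
        t = np.linalg.inv(K) @ (P[:, 3] / scale)
        if np.linalg.det(R) < 0:             # make R a proper rotation (det = +1)
            R, t = -R, -t
        C = -R.T @ t                         # camera centre in world coordinates
        return K, R, t, C

    def ray_from_cal(M, C, u, v):
        """Ray from the camera centre C through pixel (u, v), assuming M = K * R."""
        d = np.linalg.inv(M) @ np.array([u, v, 1.0])
        return C, d / np.linalg.norm(d)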

We also provide graphical outputs that illustrate the setup and the calibration accuracy, which is mostly around 0.2 pixels of reprojection error.

     Download calibration data
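Once the calibration data has been downloaded, the K and kc values from the xxx.rad files can, as noted above, be fed to OpenCV. The following is a minimal sketch; the mapping of the 4x1 kc vector onto OpenCV's (k1, k2, p1, p2) ordering is an assumption (with only radial terms present, the tangential coefficients should be zero), and reading the files into NumPy arrays is left out.

    # Minimal undistortion sketch with OpenCV. `K` (3x3 intrinsics) and `kc`
    # (4x1 distortion vector) are assumed to come from an xxx.rad file; mapping
    # kc onto OpenCV's (k1, k2, p1, p2) ordering is an assumption -- the data
    # contains only radial terms, so p1 and p2 are expected to be zero.
    import cv2
    import numpy as np

    def undistort_frame(image, K, kc):
        dist = np.asarray(kc, dtype=np.float64).reshape(-1)   # (k1, k2, p1, p2)
        return cv2.undistort(image, np.asarray(K, dtype=np.float64), dist)

    # Illustrative usage (the file name is made up, not part of the data set):
    # frame = cv2.imread("arctic3_frame_0001.png")
    # frame_undist = undistort_frame(frame, K, kc)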

MPEG Test Sequences


Lara Sequence

Also known as the "Kung-Fu Girl" sequence. The MPEG synthetic data set provided by the GrOVis group of the Max-Planck-Institut für Informatik can be found at:

     http://www.grovis.de/kungfu

The data is meant for testing reconstruction/rendering algorithms on multi-view video sequences without the noise and other capturing-device errors found in real-world data. The set consists of frame sequences rendered from 25 different camera views located in a hemisphere around the scene, background images, and camera parameters. The recorded scene contains a humanoid figure, animated by motion-captured data, rendered in three flavours: the foreground with a textured, a shaded, and an empty background, respectively. In addition, a pure silhouette set is provided. The sequences are 200 frames long and each frame is saved as a 320 x 240 image.

© SW, 2004
Last Update: 05.01.2004