[From official texts]
Monday 12 December | 09:00-12:45 | Room S222
With the advancement of 3D display technologies, we anticipate that the next move in the display industry will be toward autostereoscopy for multiple users. This new paradigm will require novel approaches to acquiring, processing, and synthesizing a real 3D scene from arbitrary viewing directions. As one candidate for future 3D imaging technology, the Time-of-Flight (ToF) depth camera has received great attention from researchers and has been adopted in several application areas. In this tutorial, we introduce recent advances in 3D depth sensing technologies and discuss their potential use in 3D imaging for future 3D displays. Motivated by recent progress in depth image processing and inverse rendering algorithms, we have organized this timely and unique tutorial to introduce basic principles along with in-depth discussion of cutting-edge technical issues, potential ideas, and challenges. The tutorial provides well-organized presentations on: the principle of the ToF depth sensor and its sensing architecture; state-of-the-art depth processing algorithms; 3D reconstruction from color and depth images; and lighting and reflectance extraction from color and depth images.
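As background for the first topic, a minimal sketch of the continuous-wave ToF principle: the sensor correlates the received modulated light with four phase-stepped references (0°, 90°, 180°, 270°), recovers the round-trip phase shift, and converts it to depth. The function names and sample values below are illustrative, not part of the tutorial material.

```python
import math

C = 299_792_458.0  # speed of light (m/s)

def tof_depth(a0, a1, a2, a3, f_mod):
    """Estimate depth from four phase-stepped correlation samples
    of a continuous-wave ToF pixel at modulation frequency f_mod (Hz)."""
    # Phase shift between the emitted and received modulated signal,
    # wrapped into [0, 2*pi).
    phase = math.atan2(a1 - a3, a0 - a2) % (2 * math.pi)
    # Light travels to the scene and back, hence the factor of 2
    # (i.e. 4*pi in the denominator).
    return C * phase / (4 * math.pi * f_mod)

# Synthetic samples for a true depth of 1.5 m at 20 MHz modulation:
f = 20e6
true_phase = 4 * math.pi * f * 1.5 / C
samples = [math.cos(true_phase - k * math.pi / 2) for k in range(4)]
print(tof_depth(*samples, f))  # recovers ~1.5 m
```

Note that the recoverable phase wraps at 2*pi, so the unambiguous range at 20 MHz is about 7.5 m; this wrapping is one of the sensing issues the tutorial's depth processing algorithms must address.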