Light stage
A light stage or light cage is an instrumentation set-up used for shape, texture, reflectance and motion capture, often with structured light and a multi-camera setup.
Reflectance capture
The reflectance field over a human face was first captured in 2000 by Paul Debevec et al. The method they used to find the light that travels under the skin was based on the existing scientific knowledge that light reflecting off the air-to-oil interface retains its polarization, while light that travels under the skin loses its polarization.[1]
Using this principle, Debevec et al. built the simplest, yet most revolutionary, light stage to date, which consisted of:
- A movable digital camera
- A movable simple light source (full rotation, with adjustable radius and height)
- Two polarizers set at various angles in front of the light and the camera
- A computer running relatively simple programs doing relatively simple tasks.[1]
The setup enabled the team to isolate the subsurface scattering component of the BSDF over the human face, which is required for fully virtual cinematography with ultra-photorealistic digital look-alikes such as those seen in The Matrix Reloaded, The Matrix Revolutions and numerous other movies since the early 2000s.
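The polarization-based separation can be sketched in a few lines of code. The following Python snippet is only a minimal illustration of the idea, not the authors' pipeline: it assumes two registered photographs of the face under the same point light, one taken with the camera's polarizer crossed against the light's (blocking the still-polarized surface reflection) and one with the polarizers parallel; small synthetic arrays stand in for the real images, and analyzer attenuation of the depolarized light is ignored for simplicity.

```python
import numpy as np

# Synthetic stand-ins for two registered photographs of the same face under
# the same point light; in a real capture these would be the measured images.
rng = np.random.default_rng(0)
subsurface_truth = rng.uniform(0.2, 0.6, size=(4, 4, 3))  # light that scattered under the skin (depolarized)
specular_truth = rng.uniform(0.0, 0.3, size=(4, 4, 3))    # light reflected off the air-to-oil interface (still polarized)

# Cross-polarized photo: the analyzer blocks the polarized surface reflection,
# leaving only the subsurface component.
cross_polarized = subsurface_truth

# Parallel-polarized photo: the surface reflection passes the analyzer,
# on top of the subsurface component.
parallel_polarized = specular_truth + subsurface_truth

# Separation: the difference of the two photos isolates the specular (surface)
# reflection, while the cross-polarized photo gives the subsurface component directly.
specular_component = np.clip(parallel_polarized - cross_polarized, 0.0, None)
subsurface_component = cross_polarized

assert np.allclose(specular_component, specular_truth)
```

In an actual capture, exposure and polarizer transmission would have to be calibrated before the two images are differenced.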
Following this great scientific success, Debevec et al. constructed six further, more elaborate versions of the light stage at the University of Southern California (USC) Institute for Creative Technologies (ICT), and Ghosh et al. built the seventh version, the USC light stage X. In 2014 President Barack Obama had his image and reflectance captured with the USC mobile light stage.[2]
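The practical payoff of a captured reflectance field is image-based relighting: once the face has been photographed under many individual light directions, its appearance under a new lighting environment can be approximated as a weighted sum of those basis photographs. The sketch below is a simplified illustration under that assumption; the array shapes and the relight helper are hypothetical, not part of any published light-stage code.

```python
import numpy as np

def relight(olat_images: np.ndarray, light_weights: np.ndarray) -> np.ndarray:
    """Approximate the face under novel illumination by linear superposition.

    olat_images   : (num_lights, H, W, 3) photos, one per light-stage direction.
    light_weights : (num_lights, 3) RGB intensity of each direction, sampled
                    from the target lighting environment.
    Returns an (H, W, 3) image of the face under the new lighting.
    """
    return np.einsum('lc,lhwc->hwc', light_weights, olat_images)

# Tiny synthetic example: 8 light directions, a 4x4-pixel "face".
rng = np.random.default_rng(1)
olat = rng.uniform(0.0, 1.0, size=(8, 4, 4, 3))
weights = rng.uniform(0.0, 0.5, size=(8, 3))
relit = relight(olat, weights)
print(relit.shape)  # (4, 4, 3)
```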
See also
- Digital Emily, presented at the SIGGRAPH convention in 2008, for which the reflectance field of actress Emily O'Brien was captured using the USC light stage 5[3] and the pre-rendered digital look-alike was made in association with Image Metrics. The video includes the USC light stage 5 and the USC light stage 6.
- Digital Ira, presented at SIGGRAPH 2013 in association with Activision, is rendered fairly convincingly even in real time.[4] Whereas Digital Emily, shown in 2008, was a pre-computed simulation, Digital Ira runs in real time and looks fairly realistic even in real-time rendering of animation. The field is rapidly moving from movies to computer games and leisure applications. The video includes the USC light stage X.
- The Presidential Portrait, by USC ICT in conjunction with the Smithsonian Institution, was made using the latest mobile light stage. It included texture, feature and reflectance capture with a high-resolution multi-camera setup and additional handheld scanners. A 3D-printed bust of the President was also produced.
References
- 1. Debevec, Paul; Hawkins, Tim; Tchou, Chris; Duiker, Haarm-Pieter; Sarokin, Westley; Sagar, Mark (2000). "Acquiring the reflectance field of a human face". ACM. doi:10.1145/344779.344855. Retrieved 2013-07-21.
- 2. "Scanning and printing a 3D portrait of President Barack Obama". University of Southern California. 2013. Retrieved 2015-11-04.
- 3. Paul Debevec animates a photo-real digital face - Digital Emily 2008.
- 4. Debevec, Paul (2013). "Digital Ira - A real-time animatable face demonstration". Personal web site, University of Southern California. Retrieved 2013-08-10.