TY - GEN
T1 - Instant Mixed Reality Lighting from Casual Scanning
AU - Richter-Trummer, Thomas
AU - Kalkofen, Denis
AU - Park, Jinwoo
AU - Schmalstieg, Dieter
PY - 2016/12/12
Y1 - 2016/12/12
AB - We present a method for recovering both incident lighting and surface materials from casually scanned geometry. By casual, we mean a rapid and potentially noisy scanning procedure of unmodified and uninstrumented scenes with a commodity RGB-D sensor. In other words, unlike reconstruction procedures which require careful preparations in a laboratory environment, our method works with input that can be obtained by consumer users. To ensure a robust procedure, we segment the reconstructed geometry into surfaces with homogeneous material properties and compute the radiance transfer on these segments. With this input, we solve the inverse rendering problem of factorization into lighting and material properties using an iterative optimization in spherical harmonics form. This allows us to account for self-shadowing and recover specular properties. The resulting data can be used to generate a wide range of mixed reality applications, including the rendering of synthetic objects with matching lighting into a given scene, but also re-rendering the scene (or a part of it) with new lighting. We show the robustness of our approach with real and synthetic examples under a variety of lighting conditions and compare them with ground truth data.
KW - Information Interfaces and Presentation
KW - Artificial, augmented, and virtual realities
KW - Image Processing and Computer Vision
KW - Photometric registration
KW - 3D Reconstruction
UR - http://www.scopus.com/inward/record.url?scp=85010403043&partnerID=8YFLogxK
U2 - 10.1109/ISMAR.2016.18
DO - 10.1109/ISMAR.2016.18
M3 - Conference contribution
AN - SCOPUS:85010403043
T3 - Proceedings of the 2016 IEEE International Symposium on Mixed and Augmented Reality, ISMAR 2016
SP - 27
EP - 36
BT - Proceedings of the 2016 IEEE International Symposium on Mixed and Augmented Reality, ISMAR 2016
A2 - Broll, Wolfgang
A2 - Saito, Hideo
A2 - Swan, J. Edward
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 15th IEEE International Symposium on Mixed and Augmented Reality, ISMAR 2016
Y2 - 19 September 2016 through 23 September 2016
ER -