Abstract

Achieving photorealistic 3D view synthesis and relighting of human portraits is pivotal for advancing AR/VR applications. Existing portrait relighting methods show substantial limitations in generalization and 3D consistency, along with inaccuracies in physically realistic lighting and identity preservation. Furthermore, personalization from a single view is difficult to achieve: it often requires multiview images at test time or slow optimization processes. This paper introduces Lite2Relight, a novel technique that can predict 3D-consistent head poses of portraits while performing physically plausible light editing at interactive speed. Our method uniquely extends the generative capabilities and efficient volumetric representation of EG3D, leveraging a lightstage dataset to implicitly disentangle face reflectance and perform relighting under target HDRI environment maps. By utilizing a pre-trained geometry-aware encoder and a feature alignment module, we map input images into a relightable 3D space, enhancing them with a strong face geometry and reflectance prior. Through extensive quantitative and qualitative evaluations, we show that our method outperforms state-of-the-art methods in efficacy, photorealism, and practical applicability. This includes producing 3D-consistent results for the full head, with hair, eyes, and expressions. Lite2Relight paves the way for large-scale adoption of photorealistic portrait editing in various domains, offering a robust, interactive solution to a previously constrained problem.
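To make the data flow concrete, the sketch below traces the inference pipeline implied by the abstract: an encoder lifts a single portrait into an EG3D-style tri-plane representation, and the tri-plane features are then modulated by a code derived from the target HDRI before volume rendering. This is a minimal illustration only; the class and function names (GeometryAwareEncoder, relight_triplanes) and all internals are hypothetical stubs, not the released Lite2Relight code.

# Minimal sketch of the single-image relighting flow, assuming an
# EG3D-style tri-plane representation. All names are hypothetical.
import torch

class GeometryAwareEncoder(torch.nn.Module):
    # Hypothetical stand-in: maps a portrait image to tri-plane features.
    def __init__(self, feat_dim=32, plane_res=256):
        super().__init__()
        self.conv = torch.nn.Conv2d(3, 3 * feat_dim, kernel_size=3, padding=1)
        self.feat_dim = feat_dim
        self.plane_res = plane_res

    def forward(self, image):
        # Resize to the tri-plane resolution, then predict 3 feature planes.
        x = torch.nn.functional.interpolate(image, size=(self.plane_res, self.plane_res))
        x = self.conv(x)  # (B, 3*C, H, W)
        return x.view(-1, 3, self.feat_dim, self.plane_res, self.plane_res)

def relight_triplanes(planes, hdri):
    # Hypothetical relighting step: modulate tri-plane features by a global
    # lighting code pooled from the target HDRI environment map.
    light_code = hdri.mean(dim=(-1, -2))       # (B, 3) crude light summary
    scale = 1.0 + light_code.mean(dim=-1)      # (B,) scalar modulation
    return planes * scale.view(-1, 1, 1, 1, 1)

# Usage: one portrait + one target HDRI -> relit tri-planes, which an
# EG3D-style volume renderer would then turn into novel views.
encoder = GeometryAwareEncoder()
portrait = torch.rand(1, 3, 512, 512)    # single-view input portrait
hdri = torch.rand(1, 3, 64, 128)         # target HDRI environment map
planes = encoder(portrait)               # lift image into relightable 3D space
relit = relight_triplanes(planes, hdri)  # light edit under the target HDRI
print(relit.shape)                       # torch.Size([1, 3, 32, 256, 256])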

Teaser Video


Results

Downloads


Citation

BibTeX, 1 KB

@inproceedings{prao2024lite2relight,
  title     = {Lite2Relight: 3D-aware Single Image Portrait Relighting},
  author    = {Rao, Pramod and Fox, Gereon and Meka, Abhimitra and B R, Mallikarjun and Zhan, Fangneng and Weyrich, Tim and Bickel, Bernd and Seidel, Hans-Peter and Pfister, Hanspeter and Matusik, Wojciech and Elgharib, Mohamed and Theobalt, Christian},
  booktitle = {ACM SIGGRAPH 2024 Conference Proceedings},
  year      = {2024}
}

Acknowledgments

This work was supported by the ERC Consolidator Grant 4DReply (770784).

Contact

For questions or clarifications, please get in touch with:
Pramod Rao
prao@mpi-inf.mpg.de
