Differentiable Event Stream Simulator for Non-Rigid 3D Tracking
Abstract
This paper introduces the first differentiable simulator of event streams, i.e., streams of asynchronous brightness-change signals recorded by event cameras. Our differentiable simulator enables non-rigid 3D tracking of deformable objects (such as human hands, isometric surfaces, and general watertight meshes) from event streams by leveraging an analysis-by-synthesis principle. So far, event-based tracking and reconstruction of non-rigid objects in 3D, such as hands and bodies, has either relied on explicit event trajectories or on large-scale training datasets. In contrast, our method requires neither such pre-processing nor such data and can be readily applied to incoming event streams. We show the effectiveness of our approach for various types of non-rigid objects and compare it to existing methods for non-rigid 3D tracking. In our experiments, the proposed energy-based formulations outperform competing RGB-based methods in terms of 3D errors. The source code and the new data are publicly available.
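The abstract above summarizes the analysis-by-synthesis idea: render the tracked object, simulate the events an event camera would produce, and compare them to the recorded stream through a differentiable energy. The minimal sketch below, assuming PyTorch and an idealized contrast threshold C, illustrates one way such a differentiable event-generation surrogate and data term could look; the names simulate_events and event_energy, and the toy optimization loop, are hypothetical and are not taken from the paper or its released code.

# Minimal sketch, assuming PyTorch; the functions and parameters below
# are illustrative and not the authors' released implementation.
import torch

def simulate_events(I_prev, I_curr, C=0.2, eps=1e-6):
    # An ideal event camera fires an event at a pixel whenever the
    # log-brightness change exceeds the contrast threshold C. The integer
    # event count is non-differentiable, so this surrogate returns the
    # real-valued ratio instead, which keeps gradients intact.
    delta_log = torch.log(I_curr + eps) - torch.log(I_prev + eps)
    return delta_log / C

def event_energy(I_prev, I_curr, observed_counts, C=0.2):
    # Hypothetical L2 data term between simulated and observed per-pixel
    # event counts; the paper's actual energy terms may differ.
    simulated = simulate_events(I_prev, I_curr, C)
    return ((simulated - observed_counts) ** 2).mean()

# Toy usage: optimize a per-pixel log-brightness offset so that the
# simulated events match an observed event-count image. In the real
# setting, the rendered images would instead depend on the parameters
# of a deformable 3D model (hand pose, surface deformation, ...).
I_prev = torch.rand(1, 64, 64) + 0.1          # previous rendered brightness
observed = torch.randn(1, 64, 64)             # observed event counts (dummy)
offset = torch.zeros(1, 64, 64, requires_grad=True)

optimizer = torch.optim.Adam([offset], lr=1e-2)
for _ in range(200):
    optimizer.zero_grad()
    I_curr = I_prev * torch.exp(offset)       # keeps brightness positive
    loss = event_energy(I_prev, I_curr, observed)
    loss.backward()
    optimizer.step()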
Citation
@inproceedings{Nehvi2021,
  title     = {Differentiable Event Stream Simulator for Non-Rigid 3D Tracking},
  author    = {Jalees Nehvi and Vladislav Golyanik and Franziska Mueller and Hans-Peter Seidel and Mohamed Elgharib and Christian Theobalt},
  booktitle = {CVPR Workshop on Event-based Vision},
  year      = {2021}
}
Acknowledgments
This work was supported by the ERC Consolidator Grant 770784.
Contact
For questions or clarifications, please get in touch with: Vladislav Golyanik
golyanik@mpi-inf.mpg.de
Mohamed Elgharib
elgharib@mpi-inf.mpg.de