Abstract
In this paper, we propose a fully differentiable pipeline for estimating accurate dense correspondences between 3D point clouds. The proposed pipeline is an extension and generalization of the functional maps framework. However, instead of using the Laplace-Beltrami eigenfunctions, as done in virtually all previous works in this domain, we demonstrate that learning the basis from data can both improve robustness and lead to better accuracy in challenging settings. We interpret the basis as a learned embedding into a higher-dimensional space. Following the functional map paradigm, the optimal transformation in this embedding space must be linear, and we propose a separate architecture aimed at estimating the transformation by learning optimal descriptor functions. This leads to the first end-to-end trainable functional map-based correspondence approach in which both the basis and the descriptors are learned from data. Interestingly, we also observe that learning a canonical embedding leads to worse results, suggesting that leaving an extra linear degree of freedom to the embedding network gives it more robustness, thereby also shedding light on the success of previous methods. Finally, we demonstrate that our approach achieves state-of-the-art results in challenging non-rigid 3D point cloud correspondence applications.
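The core functional-map step the abstract describes — constraining the transformation between two learned embeddings to be linear and estimating it from descriptor functions — can be sketched as a least-squares solve. The following is a minimal illustration, not the authors' implementation; all names (`Phi_X`, `Phi_Y`, `F`, `G`) and the random data are hypothetical stand-ins for a learned basis and learned descriptors:

```python
import numpy as np

# Hypothetical sketch of the functional-map estimation step: given learned
# k-dimensional embeddings (bases) of two point clouds and corresponding
# descriptor functions, the map between embedding spaces is constrained to be
# a k x k linear transformation C, found by least squares.

rng = np.random.default_rng(0)
n, m, k, d = 200, 250, 20, 40  # points on X, points on Y, basis size, number of descriptors

Phi_X = rng.standard_normal((n, k))  # learned basis / embedding of shape X (stand-in)
Phi_Y = rng.standard_normal((m, k))  # learned basis / embedding of shape Y (stand-in)
F = rng.standard_normal((n, d))      # learned descriptors on X (stand-in)
G = rng.standard_normal((m, d))      # corresponding descriptors on Y (stand-in)

# Express the descriptors in each basis via least-squares projection.
A = np.linalg.pinv(Phi_X) @ F        # (k, d) descriptor coefficients on X
B = np.linalg.pinv(Phi_Y) @ G        # (k, d) descriptor coefficients on Y

# Solve C A ~= B in the least-squares sense for the k x k functional map C.
C = np.linalg.lstsq(A.T, B.T, rcond=None)[0].T

# Recover point-to-point correspondences by nearest-neighbor search between
# the aligned embedding of X and the embedding of Y.
aligned = Phi_X @ C.T                                         # (n, k)
d2 = ((aligned[:, None, :] - Phi_Y[None, :, :]) ** 2).sum(-1) # (n, m) squared distances
matches = d2.argmin(axis=1)                                   # for each point on X, its match on Y
```

In the paper's end-to-end setting both `Phi` and the descriptors come from trained networks and the whole pipeline is differentiable; this sketch only shows why leaving the linear factor `C` free (rather than forcing a canonical embedding with `C = I`) gives the embedding network an extra degree of freedom.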
Original language | English (US)
--- | ---
Title of host publication | 34th Conference on Neural Information Processing Systems, NeurIPS 2020
Publisher | Neural Information Processing Systems Foundation
State | Published - Jan 1 2020
Externally published | Yes
Bibliographical note
KAUST Repository Item: Exported on 2022-07-01. Acknowledged KAUST grant number(s): CRG-2017-3426
Acknowledgements: The authors would like to thank the anonymous reviewers for their detailed feedback and suggestions. Parts of this work were supported by the KAUST OSR Award No. CRG-2017-3426, the ERC Starting Grant No. 758800 (EXPROTEA) and the ANR AI Chair AIGRETTE.
This publication acknowledges KAUST support, but has no KAUST affiliated authors.