Abstract
This paper presents MONET, an end-to-end semi-supervised learning framework for keypoint detection from multiview image streams. In particular, we consider general subjects, such as non-human species, for which attaining a large-scale annotated dataset is challenging. While multiview geometry can be used to self-supervise the unlabeled data, integrating the geometry into learning a keypoint detector is challenging due to a representation mismatch. We address this mismatch by formulating a new differentiable representation of the epipolar constraint called epipolar divergence: a generalized distance from the epipolar lines to the corresponding keypoint distribution. Epipolar divergence characterizes when two-view keypoint distributions produce zero reprojection error. We design a twin network that minimizes the epipolar divergence through stereo rectification, which significantly alleviates computational complexity and sampling aliasing in training. We demonstrate that our framework can localize customized keypoints of diverse species, e.g., humans, dogs, and monkeys.
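To give intuition for epipolar divergence as a generalized point-to-line distance, below is a minimal sketch in NumPy: the expected distance from a keypoint probability map in one view to the epipolar line induced by a keypoint in the other view. This is an illustration only, not the paper's exact formulation or implementation; the function names, the dense per-pixel grid evaluation, and the normalized-heatmap representation are assumptions (the paper instead evaluates the constraint along rectified scanlines).

```python
import numpy as np

def epipolar_line(F, x1):
    """Epipolar line l2 = F @ x1 in view 2 for a homogeneous point x1 in view 1.

    The line is normalized so that |l2 . x| is the Euclidean
    point-to-line distance for a homogeneous pixel x = (u, v, 1).
    """
    l = F @ x1
    return l / np.linalg.norm(l[:2])

def soft_epipolar_distance(F, x1, heatmap2):
    """Expected distance from the view-2 keypoint distribution to the
    epipolar line induced by the view-1 keypoint x1 (3-vector, homogeneous).

    F        : (3, 3) fundamental matrix from view 1 to view 2.
    heatmap2 : (H, W) nonnegative keypoint scores in view 2.
    """
    H, W = heatmap2.shape
    p = heatmap2 / (heatmap2.sum() + 1e-12)      # normalize to a distribution
    l = epipolar_line(F, x1)
    ys, xs = np.mgrid[0:H, 0:W]                  # pixel coordinate grids
    dist = np.abs(l[0] * xs + l[1] * ys + l[2])  # per-pixel distance to the line
    return float((p * dist).sum())               # zero iff all mass lies on the line
```

Note that this quantity vanishes exactly when the view-2 distribution concentrates on the epipolar line, consistent with the abstract's characterization of zero reprojection error; per the abstract, the paper's stereo rectification avoids this dense 2D evaluation, which is what reduces computational complexity and sampling aliasing.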
Original language | English (US) |
---|---|
Title of host publication | Proceedings - 2019 International Conference on Computer Vision, ICCV 2019 |
Publisher | Institute of Electrical and Electronics Engineers Inc. |
Pages | 753-762 |
Number of pages | 10 |
ISBN (Electronic) | 9781728148038 |
State | Published - Oct 2019 |
Event | 17th IEEE/CVF International Conference on Computer Vision, ICCV 2019 - Seoul, Korea, Republic of (Oct 27 2019 → Nov 2 2019) |
Publication series
Name | Proceedings of the IEEE International Conference on Computer Vision |
---|---|
Volume | 2019-October |
ISSN (Print) | 1550-5499 |
Conference
Conference | 17th IEEE/CVF International Conference on Computer Vision, ICCV 2019 |
---|---|
Country/Territory | Korea, Republic of |
City | Seoul |
Period | 10/27/19 → 11/2/19 |
Bibliographical note
Funding Information: We thank David Crandall for his support and feedback. This work is supported by NSF IIS 1846031.