Abstract
We present a method for future localization: predicting plausible future trajectories of ego-motion in egocentric stereo images. The predicted paths avoid obstacles, move between objects, and even turn around corners into the space behind objects. As a byproduct of the predicted trajectories, we discover the empty space occluded by foreground objects. One key innovation is the creation of an EgoRetinal map, akin to an illustrated tourist map, that 'rearranges' pixels by taking into account depth information, the ground plane, and the direction of body motion, so that motion planning and object perception can be carried out in a single image space. We learn to plan trajectories directly on this EgoRetinal map using first-person experience of walking around in a variety of scenes. At test time, given a novel scene, we generate multiple hypotheses of future trajectories from the learned experience. We refine them by minimizing a cost function that measures the compatibility between the trajectories and the obstacles in the EgoRetinal map. We quantitatively evaluate our method to demonstrate its predictive validity and apply it to various real-world daily activities, including walking, shopping, and social interaction.
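To make the refinement step concrete, the following is a minimal, illustrative sketch, not the authors' implementation: it refines a candidate 2D trajectory on a toy obstacle cost map by gradient descent on a combined obstacle-compatibility and smoothness cost, with fixed start and goal points. The function names, parameters, weights, and the toy Gaussian obstacle map are all assumptions made for illustration only.

```python
import numpy as np

def refine_trajectory(traj, cost_map, n_iters=300, step=0.5,
                      w_obs=20.0, w_smooth=0.1):
    """Refine a 2D trajectory (N x 2 array of [row, col] points) by gradient
    descent on obstacle cost plus a simple smoothness term. Endpoints are
    kept fixed; interior points move toward lower-cost, smoother paths.
    (Illustrative sketch only; not the method described in the paper.)"""
    grad_r, grad_c = np.gradient(cost_map)       # spatial gradients of the map
    traj = traj.astype(float).copy()
    for _ in range(n_iters):
        # Sample the obstacle-cost gradient at (rounded) trajectory points.
        r = np.clip(traj[:, 0].round().astype(int), 0, cost_map.shape[0] - 1)
        c = np.clip(traj[:, 1].round().astype(int), 0, cost_map.shape[1] - 1)
        g_obs = np.stack([grad_r[r, c], grad_c[r, c]], axis=1)
        # Smoothness: a discrete Laplacian pulls each point toward its neighbours.
        g_smooth = np.zeros_like(traj)
        g_smooth[1:-1] = 2.0 * traj[1:-1] - traj[:-2] - traj[2:]
        grad = w_obs * g_obs + w_smooth * g_smooth
        grad[0] = grad[-1] = 0.0                 # start and goal stay fixed
        traj -= step * grad
    return traj

if __name__ == "__main__":
    # Toy cost map: a single Gaussian "obstacle" blob (placeholder data).
    h, w = 100, 100
    rows, cols = np.mgrid[0:h, 0:w]
    cost_map = np.exp(-((rows - 50) ** 2 + (cols - 50) ** 2) / (2 * 12.0 ** 2))
    # Straight-line hypothesis passing close to the obstacle.
    traj0 = np.stack([np.linspace(95, 5, 30), np.full(30, 47.0)], axis=1)
    traj = refine_trajectory(traj0, cost_map)

    def path_cost(t):
        r = np.clip(t[:, 0].round().astype(int), 0, h - 1)
        c = np.clip(t[:, 1].round().astype(int), 0, w - 1)
        return cost_map[r, c].max()

    print("max obstacle cost along path: before %.3f, after %.3f"
          % (path_cost(traj0), path_cost(traj)))
```

In the paper, the cost map, trajectory hypotheses, and cost function come from the EgoRetinal representation and learned first-person experience; this sketch only illustrates the general idea of refining a path against obstacles.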
Original language | English (US) |
---|---|
Title of host publication | Proceedings - 29th IEEE Conference on Computer Vision and Pattern Recognition, CVPR 2016 |
Publisher | IEEE Computer Society |
Pages | 4697-4705 |
Number of pages | 9 |
ISBN (Electronic) | 9781467388504 |
DOIs | |
State | Published - Dec 9 2016 |
Event | 29th IEEE Conference on Computer Vision and Pattern Recognition, CVPR 2016 - Las Vegas, United States |
Duration | Jun 26 2016 → Jul 1 2016 |
Publication series
Name | Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition |
---|---|
Volume | 2016-December |
ISSN (Print) | 1063-6919 |
Conference
Conference | 29th IEEE Conference on Computer Vision and Pattern Recognition, CVPR 2016 |
---|---|
Country/Territory | United States |
City | Las Vegas |
Period | 6/26/16 → 7/1/16 |
Bibliographical note
Publisher Copyright: © 2016 IEEE.