We present a mobile multi-touch interface for selecting, querying, and visually exploring data on large, high-resolution displays. Although emerging large (e.g., ∼10 m wide), high-resolution displays offer great potential for visualizing dense, complex datasets, their utility is often limited by a fundamental interaction problem: the need to interact with data from multiple positions around a large room. Our solution is a selection and querying interface that combines a hand-held multi-touch device with 6 degree-of-freedom tracking in the physical space surrounding the large display. The interface leverages context from both the user's physical position in the room and the data currently being visualized in order to interpret multi-touch gestures. It also uses progressive refinement, favoring several quick, approximate gestures over a single complex input, to map the small mobile multi-touch input space to the large display wall most effectively. The approach is evaluated through two interdisciplinary visualization applications: a multi-variate data visualization for social scientists and a visual database querying tool for biochemistry. The interface was effective in both scenarios, leading to new domain-specific insights and suggesting valuable guidance for future developers.
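To make the ray-casting and progressive-refinement ideas concrete, the sketch below (Python with NumPy; all function and variable names are our own illustrations, not from the paper) casts a ray from a 6-DOF-tracked device onto a display wall modeled as a plane, then repeatedly narrows the set of selected data points around the hit point, so that each quick, approximate gesture refines the previous selection:

```python
import numpy as np

def ray_wall_hit(device_pos, device_dir, wall_point, wall_normal):
    """Intersect the ray from the tracked device with the wall plane.

    device_pos and device_dir would come from the 6-DOF tracker; the
    wall is approximated as an infinite plane through wall_point with
    unit normal wall_normal. Returns the hit point on the wall, or
    None if the ray is parallel to or points away from the wall.
    """
    denom = float(np.dot(device_dir, wall_normal))
    if abs(denom) < 1e-9:
        return None                      # ray parallel to the wall
    t = float(np.dot(wall_point - device_pos, wall_normal)) / denom
    return device_pos + t * device_dir if t > 0 else None

def refine(candidates, hit, radius):
    """One progressive-refinement step: keep only the candidate data
    points within `radius` of the current ray hit. Calling this with
    successively smaller radii narrows an initial coarse selection."""
    return [p for p in candidates if np.linalg.norm(p - hit) <= radius]

# Usage: a coarse point-and-select gesture followed by two refinements.
points = [np.random.uniform([0, 0, 0], [10, 3, 0]) for _ in range(1000)]
selection = points
for radius in (2.0, 1.0, 0.4):           # each gesture shrinks the radius
    hit = ray_wall_hit(np.array([5.0, 1.5, 4.0]),   # tracked device pose
                       np.array([0.0, 0.0, -1.0]),  # pointing at the wall
                       np.array([0.0, 0.0, 0.0]),   # point on wall plane
                       np.array([0.0, 0.0, 1.0]))   # wall normal
    if hit is not None:
        selection = refine(selection, hit, radius)
```

This covers only the geometric core; in the system described above, the user's room position and the currently visualized data would additionally shape how each gesture is interpreted.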
Bibliographical note
Funding Information:
Thanks to Dane Coffey, David Schroeder, Mike Knox, Nancy Rowe, and Birali Runesha for their technical expertise; Julia Drew for valuable data analysis sessions; and Brown University's Visualization Research Lab for assistance with early versions of the map visualization. This work was supported in part by the Minnesota Supercomputing Institute, the Laboratory for Computational Science and Engineering, and the University of Minnesota Grant-in-Aid program.
- 3D tracking
- 3D user interface
- Mobile device
- Progressive refinement
- Ray casting