TY - JOUR
T1 - Adapting user interfaces for gestural interaction with the flexible action and articulated skeleton toolkit
AU - Suma, Evan A.
AU - Krum, David M.
AU - Lange, Belinda
AU - Koenig, Sebastian
AU - Rizzo, Albert
AU - Bolas, Mark
PY - 2013
Y1 - 2013
N2 - We present the Flexible Action and Articulated Skeleton Toolkit (FAAST), a middleware software framework for integrating full-body interaction with virtual environments, video games, and other user interfaces. This toolkit provides a complete end-to-end solution that includes a graphical user interface for custom gesture creation, sensor configuration, skeletal tracking, action recognition, and a variety of output mechanisms to control third-party applications, allowing virtually any PC application to be repurposed for gestural control even if it does not explicitly support input from motion sensors. To facilitate intuitive and transparent gesture design, we define a syntax for representing human gestures using rule sets that correspond to the basic spatial and temporal components of an action. These individual rules form primitives that, although conceptually simple on their own, can be combined both simultaneously and in sequence to form sophisticated gestural interactions. In addition to presenting the system architecture and our approach for representing and designing gestural interactions, we also describe two case studies that evaluated the use of FAAST for controlling first-person video games and improving the accessibility of computing interfaces for individuals with motor impairments. Thus, this work represents an important step toward making gestural interaction more accessible for practitioners, researchers, and hobbyists alike.
KW - Gesture
KW - Middleware
KW - Natural interaction
KW - User interfaces
KW - Video games
UR - http://www.scopus.com/inward/record.url?scp=84875477256&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84875477256&partnerID=8YFLogxK
U2 - 10.1016/j.cag.2012.11.004
DO - 10.1016/j.cag.2012.11.004
M3 - Article
AN - SCOPUS:84875477256
SN - 0097-8493
VL - 37
SP - 193
EP - 201
JO - Computers and Graphics (Pergamon)
JF - Computers and Graphics (Pergamon)
IS - 3
ER -