Isaac Wang, Pradyumna Narayana, Jesse Smith, Bruce Draper, Ross Beveridge, and Jaime Ruiz. 2018. EASEL: Easy Automatic Segmentation Event Labeler. In Proceedings of the 23rd International Conference on Intelligent User Interfaces (IUI '18). ACM, New York, NY, USA. DOI: https://doi.org/10.1145/3172944.3173003
Tools for Gestural Analysis
To support research on gesture-based interaction, a focus of our work is developing tools for gestural analysis. This includes tools that assist in the annotation, analysis, and visualization of gesture data, with an emphasis on automating part of the work so that machine and human capabilities are combined to complete tasks.
EASEL Annotation Tool
EASEL, the Easy Automatic Segmentation Event Labeler, is a video annotation tool that aims to streamline and automate the annotation process. Video annotation is a vital part of research on gestural and multimodal interaction, as well as computer vision, machine learning, and interface design. However, annotation is a difficult, time-consuming task that requires high cognitive effort, and existing labeling tools still require users to annotate most of the data by hand, limiting their helpfulness. EASEL addresses this by introducing assisted annotation, using automatic gesture segmentation and recognition to annotate gestures automatically.
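To illustrate the idea of assisted annotation, the sketch below shows one way per-frame recognizer output could be grouped into candidate gesture segments for an annotator to review. This is a minimal illustration only, not the actual EASEL implementation; the recognizer output format, the Segment fields, and the minimum-length threshold are assumptions made for the example.

# Minimal sketch of assisted annotation: turn per-frame gesture predictions
# into candidate segments that a human annotator can confirm or correct.
# Hypothetical data format; not the EASEL codebase.

from dataclasses import dataclass
from typing import List, Optional


@dataclass
class Segment:
    start_frame: int         # inclusive
    end_frame: int           # inclusive
    label: str               # gesture class proposed by the recognizer
    confirmed: bool = False  # set to True once a human reviews the segment


def propose_segments(frame_labels: List[Optional[str]],
                     min_length: int = 5) -> List[Segment]:
    """Group consecutive frames with the same predicted gesture into segments.

    frame_labels: one predicted gesture label per frame, or None for frames
    classified as non-gesture (rest).
    min_length: discard runs shorter than this many frames (likely noise).
    """
    segments: List[Segment] = []
    start = None
    current = None
    # Append a sentinel None so the final run is flushed.
    for i, label in enumerate(frame_labels + [None]):
        if label != current:
            if current is not None and i - start >= min_length:
                segments.append(Segment(start, i - 1, current))
            start, current = i, label
    return segments


# Example: per-frame output for a short clip with two gestures.
frames = [None, None] + ["point"] * 12 + [None] * 3 + ["wave"] * 8 + [None]
for seg in propose_segments(frames):
    print(f"frames {seg.start_frame}-{seg.end_frame}: {seg.label} (awaiting review)")

In a workflow like this, the automatically proposed segments give annotators a starting point to accept, adjust, or reject, rather than labeling every gesture boundary from scratch.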
People
Funding
Communication through Gestures, Expression and Shared Perception