Gesture-based Interactions

A main focus of our research has been exploring the use of gestures in various types of interactions. Below are some of our current projects in this area.

The Ruiz HCI Lab has led several projects examining novel gesturing paradigms for interacting with smartphones and smartwatches. These include motion gestures (i.e., physically moving the device), gestures on the back of the device, and gestures around the device.
Virtual assistants such as Siri have changed the way people interact with computers by enabling collaboration with humans through natural speech-based interfaces. However, relying on speech alone as the medium of communication is a limitation; non-verbal aspects of communication also play a vital role in natural human discourse. It is therefore necessary to identify how gesture and other non-verbal cues are used so they can be applied to the development of computer systems. The goal of this project is to understand how systems can model non-verbal communication in order to mimic human-human communication.
To support research on gesture-based interactions, part of our work focuses on developing tools for gestural analysis. These include tools that assist in the annotation, analysis, and visualization of gesture data, with an emphasis on automating part of the work and thus leveraging both machine and human capabilities to complete tasks.

Exploring User-Defined Back-Of-Device Gestures for Mobile Devices
Shaikh Shawon Arefin Shimon, Sarah Morrison-Smith, Noah John, Ghazal Fahimi, and Jaime Ruiz. 2015. In Proceedings of the 17th International Conference on Human-Computer Interaction with Mobile Devices and Services (MobileHCI '15). ACM, New York, NY, USA, 227-232. DOI: https://doi.org/10.1145/2785830.2785890

Exploring Non-touchscreen Gestures for Smartwatches
Shaikh Shawon Arefin Shimon, Courtney Lutton, Zichun Xu, Sarah Morrison-Smith, Christina Boucher, and Jaime Ruiz. 2016. In Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems (CHI '16). ACM, New York, NY, USA, 3822-3833. DOI: https://doi.org/10.1145/2858036.2858385

EASEL: Easy Automatic Segmentation Event Labeler
Isaac Wang, Pradyumna Narayana, Jesse Smith, Bruce Draper, Ross Beveridge, and Jaime Ruiz. 2018. In Proceedings of the 23rd International Conference on Intelligent User Interfaces (IUI '18). ACM, New York, NY, USA. DOI: https://doi.org/10.1145/3172944.3173003