The main research agenda of the Ruiz HCI Lab at UF is to establish an understanding of users in order to develop technology that supports user interaction. As such, we have a wide variety of current projects.

Embodied Virtual Agents

This research project investigates the role and effectiveness of nonverbal communication, specifically gaze, body posture, and gesture, in embodied virtual agents. The study extends beyond traditional virtual environments by exploring the application of these nonverbal cues in augmented reality (AR) settings.

Electromyography (EMG)-based Assistive Virtual Reality Human-Machine Interface

In collaboration with Drs. Maryam Zahabi (Texas A&M) and David Kaber (UF), we developed a virtual reality human-machine interface (HMI) to evaluate cognitive workload and compare performance metrics across three control configurations using virtual dexterity exercises that reflect activities of daily living (ADLs).

ENKIx: Enabling Knowledgeable Task Guidance In the Extremes

The goal of this work is to augment human cognition by providing task guidance through augmented reality (AR) headset technology in extreme environments, including high-hazard and high-risk operations.

KARGAMobile: Android app for portable, real-time, easily interpretable analysis of antibiotic resistance genes

The goal of this project is to support efforts to create mobile bioinformatics methods and systems for on-site, real-time detection of pathogens and antimicrobial resistance using nanopore technology.

Mobile Interfaces for Real-Time Surveillance of Antimicrobial Resistance

The goal of this project is to create mobile bioinformatics methods for on-site, real-time detection of antimicrobial resistance (AMR) using nanopore technology. The resulting methods will work on-device, meaning they will rely only on the hardware (RAM, cache, storage, and processors) available on the portable device.
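
As a rough illustration of the kind of on-device processing this involves (not the project's actual pipeline), the sketch below matches k-mers from a single sequencing read against a tiny in-memory set of resistance-gene sequences. The gene sequences, k-mer length, and hit threshold are all hypothetical placeholders.

```python
# Minimal sketch of on-device k-mer matching for AMR gene detection.
# The gene "database", read, and parameters are hypothetical placeholders.

K = 15  # k-mer length (illustrative)

# Toy resistance-gene database (placeholder sequences, not real genes).
ARG_DB = {
    "resistance_gene_A": "ATGAGTATTCAACATTTCCGTGTCGCCCTTATTCCCTTTTTTGCGGCATTTTGCCTTCCTGTTTTTGCT",
    "resistance_gene_B": "ATGAAACCCAACAGACCCCTGATCGTAATTCTGAGCACTGTCGCGCTCGACGCTGTCGGCATCGGCCTG",
}

def kmers(seq, k=K):
    """Yield all k-mers of a sequence."""
    for i in range(len(seq) - k + 1):
        yield seq[i:i + k]

# Pre-index the database: k-mer -> set of gene names (small enough to keep in RAM).
index = {}
for gene, seq in ARG_DB.items():
    for km in kmers(seq):
        index.setdefault(km, set()).add(gene)

def classify_read(read, min_hits=3):
    """Count k-mer hits per gene and report genes above a hit threshold."""
    hits = {}
    for km in kmers(read):
        for gene in index.get(km, ()):
            hits[gene] = hits.get(gene, 0) + 1
    return {gene: n for gene, n in hits.items() if n >= min_hits}

if __name__ == "__main__":
    # A toy "read" containing a fragment of the first placeholder gene.
    read = "CCCC" + ARG_DB["resistance_gene_A"][10:60] + "GGGG"
    print(classify_read(read))
```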

Multimodal Affective Recognition

Frustration can manifest through different channels, such as audiovisual cues, behavioral signals, and physiological biometrics. This research project aims to detect and recognize frustration by drawing upon existing theoretical frameworks and leveraging multiple modalities. Additionally, the project explores the relationships between physiological signals, cognitive workload, and frustration levels.
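
As a minimal sketch of how evidence from multiple modalities might be combined (a generic late-fusion example, not the lab's actual model), the snippet below averages hypothetical per-modality frustration probabilities with hypothetical weights.

```python
# Minimal late-fusion sketch for multimodal frustration recognition.
# The per-modality scores and weights are hypothetical; a real system would
# obtain them from trained classifiers over audiovisual, behavioral, and
# physiological features.

def fuse(scores, weights):
    """Weighted average of per-modality frustration probabilities."""
    total_w = sum(weights[m] for m in scores)
    return sum(scores[m] * weights[m] for m in scores) / total_w

if __name__ == "__main__":
    # Placeholder per-modality probabilities of "frustrated".
    scores = {"audiovisual": 0.72, "behavioral": 0.55, "physiological": 0.80}
    weights = {"audiovisual": 1.0, "behavioral": 0.5, "physiological": 1.5}
    p = fuse(scores, weights)
    label = "frustrated" if p > 0.5 else "not frustrated"
    print(f"fused frustration probability: {p:.2f} -> {label}")
```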

Natural Multimodal Authentication in Smart Environments

In collaboration with the INIT Lab, this project focuses on understanding user preferences, attitudes, expectations, and needs related to natural multimodal authentication in smart environments.

Persuasive Interfaces for Health

This project aims to understand how to design persuasive interfaces that encourage users to adopt positive health behaviors.

Situational Awareness through Augmented Reality (AR)

Situational awareness can be defined as the detection and comprehension of elements in the environment, and maintaining it is crucial in safety-critical domains; poor situational awareness has led to aircraft crashes, oil spills, and medical errors. Our work explores the design of information presented in AR headsets with the goal of increasing users’ situational awareness.

Bimanual Interaction for Tablet Computing

Prior to our work, interfaces that relied heavily on electronic stylus input often suffered from software state overload, leading to a cumbersome set of modes in the interface. Our research addressed this issue by introducing a technique called concurrent bimanual mode switching, which allows users to switch software states more efficiently by coordinating gestures of the non-preferred and preferred hands. This technique reduces the temporal cost of switching states and enables interface designers to offer a wider range of options to users.
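
The sketch below illustrates the general idea of mode switching driven by the non-preferred hand; the gesture names and mode mapping are hypothetical and do not reproduce the published technique's exact design.

```python
# Minimal sketch of concurrent bimanual mode switching (illustrative only).
# The pen acts in whatever mode the non-preferred hand currently holds.

# Mapping from non-preferred-hand gesture to pen mode (placeholder names).
MODE_MAP = {
    None: "ink",                      # no held gesture -> default inking
    "one_finger_hold": "select",
    "two_finger_hold": "pan",
    "thumb_hold": "erase",
}

class BimanualModeSwitcher:
    def __init__(self):
        self.nonpreferred_gesture = None  # gesture currently held, if any

    def on_nonpreferred_gesture(self, gesture):
        """Called when the non-preferred hand starts or releases a gesture."""
        self.nonpreferred_gesture = gesture

    def on_pen_event(self, x, y):
        """Interpret a preferred-hand pen event under the current mode."""
        mode = MODE_MAP.get(self.nonpreferred_gesture, "ink")
        return f"{mode} at ({x}, {y})"

if __name__ == "__main__":
    sw = BimanualModeSwitcher()
    print(sw.on_pen_event(10, 20))                # default: ink
    sw.on_nonpreferred_gesture("two_finger_hold")
    print(sw.on_pen_event(30, 40))                # pan while the gesture is held
    sw.on_nonpreferred_gesture(None)              # release -> back to ink
    print(sw.on_pen_event(50, 60))
```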

Communication Through Gestures, Expression and Shared Perception

The goal of this project is to understand how systems can model non-verbal communication as a means of mimicking human-human communication.

Designing Natural User Interfaces for Children

In our work, we have examined children’s touch and gesture interactions on smartphones, tablets, and tabletops.

Gestures for Mobile Interaction

The Ruiz HCI Lab has had several projects examining the use of novel gesturing paradigms to interact with smartphones and smartwatches. This includes our work on motion gestures (i.e., physically moving a device), gestures on the back of a device, and gestures around a device.

Large Publicly-Shared Interactive Displays

The goal of this research is to examine the use of large publicly shared interactive displays and how mobile devices can enrich interaction around these displays.

Motion Kinematics in Interfaces

This project focuses on developing techniques that model motion kinematics within interfaces to predict the endpoint of a motion.
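
As a simple illustration of the general idea (not the lab's published model), the sketch below assumes roughly constant deceleration near the end of a pointing movement and extrapolates the stopping point along the current direction of motion; the samples and timing are hypothetical.

```python
# Minimal sketch of kinematic endpoint prediction (illustrative only).

import math

def predict_endpoint(samples):
    """samples: list of (t, x, y) tuples; needs at least three recent samples."""
    (t0, x0, y0), (t1, x1, y1), (t2, x2, y2) = samples[-3:]
    # Finite-difference speed estimates over the last two intervals.
    v1 = math.hypot(x1 - x0, y1 - y0) / (t1 - t0)
    v2 = math.hypot(x2 - x1, y2 - y1) / (t2 - t1)
    a = (v1 - v2) / ((t2 - t0) / 2)   # deceleration magnitude (assumed constant)
    if a <= 0:                        # not decelerating yet: no prediction
        return None
    remaining = v2 * v2 / (2 * a)     # stopping distance under constant deceleration
    d = math.hypot(x2 - x1, y2 - y1)
    if d == 0:
        return None
    ux, uy = (x2 - x1) / d, (y2 - y1) / d  # unit vector of current motion direction
    return (x2 + remaining * ux, y2 + remaining * uy)

if __name__ == "__main__":
    # Hypothetical samples (seconds, pixels) from a pointing movement slowing down.
    samples = [(0.00, 0.0, 0.0), (0.05, 40.0, 0.0), (0.10, 70.0, 0.0)]
    print(predict_endpoint(samples))
```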

Supporting E-Science Through HCI

We have established a research program that aims to support life-science research requiring extensive use of computational tools (often referred to as e-Science in the literature).

Tools for Gestural Analysis

To support research on gesture-based interaction, a focus of our work is developing tools for gestural analysis. These include tools that assist in the annotation, analysis, and visualization of gesture data, with an emphasis on automating part of the work and thus leveraging both machine and human capabilities to complete tasks.
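
As a small example of the kind of automation such tools can provide (a generic sketch, not one of the lab's released tools), the snippet below segments a motion trace into candidate gesture strokes by thresholding instantaneous speed; the trace and parameters are hypothetical.

```python
# Minimal sketch of automated gesture segmentation by speed thresholding.

import math

def segment_gestures(trace, speed_threshold=50.0, min_samples=3):
    """trace: list of (t, x, y). Returns runs of consecutive 'moving' samples."""
    segments, current = [], []
    for (t0, x0, y0), (t1, x1, y1) in zip(trace, trace[1:]):
        speed = math.hypot(x1 - x0, y1 - y0) / (t1 - t0)
        if speed >= speed_threshold:
            if not current:
                current.append((t0, x0, y0))
            current.append((t1, x1, y1))
        elif current:
            if len(current) >= min_samples:
                segments.append(current)
            current = []
    if len(current) >= min_samples:
        segments.append(current)
    return segments

if __name__ == "__main__":
    # A toy trace: still, then a quick stroke, then still again.
    trace = [(0.0, 0, 0), (0.1, 0, 0), (0.2, 20, 0),
             (0.3, 45, 5), (0.4, 46, 5), (0.5, 46, 5)]
    for i, seg in enumerate(segment_gestures(trace)):
        print(f"segment {i}: {seg}")
```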