Designing Natural User Interfaces for Children

In collaboration with Dr. Lisa Anthony's INIT Lab, this work explores the challenges children face when interacting with Natural User Interfaces (NUIs), i.e., interfaces that use touch, speech, and gesture. Children increasingly encounter such technology, including tablets and smartphones, in classrooms, museums, and at home. However, children are still developing their cognitive and physical capabilities, and prior work has shown that their interaction behaviors and expectations differ from those of adults. For example, our prior work has shown that gesture recognition algorithms do not perform as well for children, mainly because children's still-developing motor skills affect their writing ability. Our goal is to build better technology and interactions for children by understanding how children's interactions differ from adults'.

In our work, we have examined children's touch and gesture interactions on smartphones, tablets, and tabletops. On the tabletop, children responded more accurately to changing target locations and touched targets around the screen more accurately. We also found that gesture recognition rates were consistent across devices, which implies that gesture data does not have to be collected on the same device on which recognition will take place. Throughout our work, we provide design guidelines for children's touchscreen interactions to inform the design of touchscreen applications for children. While our work has mainly focused on improving recognition and accuracy, we have also examined children's expectations of intelligent user interfaces that use modalities such as speech and writing. We developed a conceptual model of children's expectations, which can be used along with our prior work on improving accuracy to further develop technology tailored to children. We continue to build upon this work to further examine children's expectations and abilities with respect to Natural User Interfaces.
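To make the recognition setting concrete: gesture recognizers in this research area are often template matchers that compare a candidate stroke against stored examples. The sketch below is a minimal, hypothetical illustration of that idea only, not the recognizer used in this work; it resamples each stroke to a fixed number of points and picks the template with the smallest average point-to-point distance, omitting the rotation, scale, and translation normalization that full recognizers such as the $1 family perform.

```python
import math


def resample(points, n=64):
    """Resample a stroke (list of (x, y) tuples) to n evenly spaced points."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])

    total = sum(dist(points[i - 1], points[i]) for i in range(1, len(points)))
    interval = total / (n - 1)
    out = [points[0]]
    pts = list(points)
    accumulated = 0.0
    i = 1
    while i < len(pts):
        d = dist(pts[i - 1], pts[i])
        if accumulated + d >= interval and d > 0:
            # Interpolate a new point at the next interval boundary.
            t = (interval - accumulated) / d
            q = (pts[i - 1][0] + t * (pts[i][0] - pts[i - 1][0]),
                 pts[i - 1][1] + t * (pts[i][1] - pts[i - 1][1]))
            out.append(q)
            pts.insert(i, q)  # continue measuring from the new point
            accumulated = 0.0
        else:
            accumulated += d
        i += 1
    if len(out) < n:  # floating-point rounding can drop the final point
        out.append(pts[-1])
    return out[:n]


def classify(candidate, templates):
    """Return the label of the template closest to the candidate stroke,
    scored by average point-to-point Euclidean distance."""
    best_label, best_score = None, float("inf")
    for label, template in templates.items():
        score = sum(math.hypot(c[0] - t[0], c[1] - t[1])
                    for c, t in zip(candidate, template)) / len(candidate)
        if score < best_score:
            best_label, best_score = label, score
    return best_label
```

Because this scheme only compares resampled point sequences, it is indifferent to where the stroke was drawn in time or at what speed, which is one reason template data collected on one device can plausibly transfer to another, consistent with the cross-device recognition-rate finding above.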