Figure: 14 drawings of participant gestures arranged in 2 rows and 7 columns, one column per function. The top row shows gestures with high agreement across participants; the bottom row shows unique gestures that individual participants designed. Move/pan: move the hand from left to right in mid-air (high agreement); flex the fingers from left to right (unique). Select: point with the hand or a finger; squeeze the hand into a fist. Rotate: grab the object and rotate it about the wrist; rotate the shoulders. Delete: swipe away; look down. Open/close: swipe down; shrug the shoulders. Zoom in/out: move both hands toward and away from the body; flex the shoulders in and out. Duplicate: tap twice in mid-air at two separate locations with the hand; tap the chin twice.

How Do People with Limited Movement Personalize Upper-Body Gestures?

Static gesture sets that make assumptions about users' abilities (e.g., touch the thumb and index finger together in mid-air) may not be accessible to people with upper-body motor disabilities. Personalized upper-body gestures that accept input from diverse body parts (e.g., head, neck, shoulders, arms, hands, and fingers) and match each user's abilities could make gesture systems more accessible. In our work, we characterize the personalized gesture sets designed by 25 participants with upper-body motor disabilities. We found that participants designed gesture sets specific to their own abilities and needs; six participants said their inspiration for designing the gestures was “how I would do [the gesture] with the abilities that I have”. We suggest three considerations when designing accessible upper-body gesture interfaces:

1) Track the whole upper body. Our participants used their whole upper body to perform gestures, and some switched back and forth between the left and right hand to combat fatigue.

2) Use sensing mechanisms that are agnostic to the location and orientation of the body. About half of our participants kept their hand on the armrest, or barely lifted it off, to reduce arm movement and fatigue.

3) Use sensors that can detect muscle activation without movement. In 10% of the personalized gestures, our participants activated their muscles without visibly moving.

Our work highlights the need for personalized upper-body gesture interfaces supported by multimodal biosignal sensors (e.g., accelerometers and muscle-activity sensors such as electromyography (EMG)).
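To make the multimodal idea concrete, here is a minimal sketch of how a system might combine an EMG channel and an accelerometer to recognize a “movement-free” gesture (consideration 3). All signal values, window sizes, and thresholds below are illustrative assumptions for the sketch, not values or methods from the study.

```python
# Hedged sketch: flag muscle activation that occurs without visible movement
# by comparing a moving-RMS envelope of EMG against an accelerometer signal.
# Thresholds and window length are illustrative assumptions, not study values.

def rms(window):
    """Root-mean-square amplitude of a signal window."""
    return (sum(x * x for x in window) / len(window)) ** 0.5

def detect_activation(emg, accel, win=8, emg_thresh=0.5, accel_thresh=0.2):
    """Return True if any window shows high EMG energy while the
    accelerometer stays quiet, i.e., the user tensed a muscle
    without visibly moving."""
    for i in range(len(emg) - win + 1):
        if rms(emg[i:i + win]) > emg_thresh and rms(accel[i:i + win]) < accel_thresh:
            return True
    return False

# Simulated signals: a burst of muscle activity while the arm stays still.
quiet = [0.01] * 16
burst = [0.9, -0.8, 0.85, -0.9] * 4
emg_signal = quiet + burst + quiet
accel_signal = [0.02] * len(emg_signal)  # negligible motion throughout

print(detect_activation(emg_signal, accel_signal))  # True
```

A real system would replace the fixed thresholds with per-user calibration, which is consistent with the paper's emphasis on matching each user's abilities.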
