Personalized upper-body gestures, which enable input from diverse body parts (e.g., head, neck, shoulders, arms, hands, and fingers) and match each user's abilities, could make gesture systems more accessible for people with upper-body motor disabilities. Static gesture sets that make assumptions about users' abilities (e.g., touch thumb and index finger together in midair) may not be accessible. In our work, we characterize the personalized gesture sets designed by 25 participants with upper-body motor disabilities. We found that these gesture sets were specific to each participant's abilities and needs. Six participants said they designed their gestures based on "how I would do [the gesture] with the abilities that I have". We suggest three considerations when designing accessible upper-body gesture interfaces:
1) Track the whole upper body. Our participants used their whole upper body to perform the gestures, and some switched between their left and right hands to combat fatigue.
2) Use sensing mechanisms that are agnostic to the location and orientation of the body. About half of our participants kept their hand on, or barely lifted it off, the armrest to reduce arm movement and fatigue.
3) Use sensors that can sense muscle activations without movement. For 10% of the personalized gestures, participants activated their muscles without visibly moving.
Our work highlights the need for personalized upper-body gesture interfaces supported by multimodal biosignal sensors (e.g., accelerometers and muscle-activity sensors such as electromyography (EMG)).
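As an illustrative sketch only, not part of our system, the Python snippet below shows one way the third consideration could be operationalized: flagging sensing windows where an EMG channel indicates muscle activity while an accelerometer shows no visible movement. The function name, thresholds, and windowed inputs (emg_rms, accel_magnitude) are hypothetical placeholders; a deployed interface would calibrate them per user and per sensor placement.

```python
import numpy as np

def detect_static_activation(emg_rms, accel_magnitude,
                             emg_threshold=0.1, motion_threshold=0.05):
    """Flag windows with muscle activation but no visible movement.

    emg_rms: per-window RMS of a rectified EMG channel (arbitrary units).
    accel_magnitude: per-window deviation of accelerometer magnitude
        from gravity (g), aligned with emg_rms.
    Thresholds are illustrative placeholders, not calibrated values.
    """
    emg_rms = np.asarray(emg_rms)
    accel_magnitude = np.asarray(accel_magnitude)
    active_muscle = emg_rms > emg_threshold
    visible_motion = accel_magnitude > motion_threshold
    # True only where the muscle is active and the limb is not moving.
    return active_muscle & ~visible_motion

# Synthetic example: the second window shows muscle activation
# without visible movement.
emg = [0.02, 0.35, 0.30]
accel = [0.01, 0.02, 0.40]
print(detect_static_activation(emg, accel))  # [False  True False]
```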