Interaction via Wireless Earbuds

Xuhai Xu, Haitian Shi, Xin Yi, Wenjia Liu, Yukang Yan, Yuanchun Shi, Alex Mariakakis, Jennifer Mankoff, Anind K. Dey:
EarBuddy: Enabling On-Face Interaction via Wireless Earbuds. CHI 2020: 1-14

Past research on on-body interaction has typically required custom sensors, limiting its scalability and generalizability. We propose EarBuddy, a real-time system that leverages the microphone in commercial wireless earbuds to detect tapping and sliding gestures near the face and ears. We developed a design space to generate 27 valid gestures and conducted a user study (N=16) to select the eight gestures that were optimal for both human preference and microphone detectability. We collected a dataset on those eight gestures (N=20) and trained deep learning models for gesture detection and classification. Our optimized classifier achieved an accuracy of 95.3%. Finally, we conducted a user study (N=12) to evaluate EarBuddy’s usability. Our results show that EarBuddy can facilitate novel interaction and that users feel very positively about the system. EarBuddy provides a new eyes-free, socially acceptable input method that is compatible with commercial wireless earbuds and has the potential for scalability and generalizability.

Yuna Liu

Yuna Liu is a second-year undergraduate majoring in Mathematics and Applied Mathematics. She is interested in simulation and mathematical modelling, and hopes to go to graduate school to study related fields. Yuna is currently on a UW EXP project that focuses on a systematic review of the generalizability of passive sensing for health & well-being.

Brian Lee

My name is Brian Lee, and I am a junior at the University of Washington studying computer science.
I am passionate about human-computer interaction and accessibility in technology, and I am learning to build applications that can have an impact on everyone, not just a select few.
Currently, I am working with Kelly on the Sensing project, building a Samsung smartwatch and Android phone app that allows people with chronic illnesses to tag and track sensor data throughout their day.

Aadi Jain

I am an avid software enthusiast with keen interest and experience in a wide array of software domains, ranging from full-stack development to low-level embedded programming. I am currently a junior in the Paul G. Allen School at UW pursuing Computer Science. I am working on the Sensing App under the supervision of Kelly Mack in the Make4All lab.

Simona Liao

Simona is a sophomore at UW majoring in Computer Science and minoring in Gender, Women, and Sexuality Studies. As an interdisciplinary student, she is passionate about applying technical skills to create a more equitable society. Currently, Simona is working on the UW EXP Study, which aims to improve the well-being of engineering students, and she processes the EMA data collected from surveys. Simona is actively involved in leadership roles in the Society of Women Engineers at UW and Minorities in Tech in the Allen School.

HulaMove: Waist Interaction

Xuhai Xu, Jiahao Li, Tianyi Yuan, Liang He, Xin Liu, Yukang Yan, Yuntao Wang, Yuanchun Shi, Jennifer Mankoff, Anind K. Dey:
HulaMove: Using Commodity IMU for Waist Interaction. CHI 2021: 503:1-503:16

We present HulaMove, a novel interaction technique that leverages the movement of the waist as a new eyes-free and hands-free input method for both the physical world and the virtual world. We first conducted a user study (N=12) to understand users’ ability to control their waist. We found that users could easily discriminate eight shifting directions and two rotating orientations, and quickly confirm actions by returning to the original position (quick return). We developed a design space with eight gestures for waist interaction based on the results and implemented an IMU-based real-time system. Using a hierarchical machine learning model, our system could recognize waist gestures at an accuracy of 97.5%. Finally, we conducted a second user study (N=12) for usability testing in both real-world scenarios and virtual reality settings. Our usability study indicated that HulaMove significantly reduced interaction time by 41.8% compared to a touch screen method, and greatly improved users’ sense of presence in the virtual world. This novel technique provides an additional input method when users’ eyes or hands are busy, accelerates users’ daily operations, and augments their immersive experience in the virtual world.

Understanding Disabled Knitters

Taylor Gotfrid, Kelly Mack, Kathryn J. Lum, Evelyn Yang, Jessica K. Hodgins, Scott E. Hudson, Jennifer Mankoff:
Stitching Together the Experiences of Disabled Knitters. CHI 2021: 488:1-488:14

Knitting is a popular craft that can be used to create customized fabric objects such as household items, clothing, and toys. Additionally, many knitters find knitting to be a relaxing and calming exercise. Little is known about how disabled knitters use and benefit from knitting, and what accessibility solutions and challenges they create and encounter. We conducted interviews with 16 experienced, disabled knitters and analyzed 20 threads from six forums that discussed accessible knitting to identify how and why disabled knitters knit, and what accessibility concerns remain. We additionally conducted an iterative design case study developing knitting tools for a knitter who found existing solutions insufficient. Our innovations improved the range of stitches she could produce. We conclude by arguing for the importance of improving tools for both pattern generation and modification, as well as adaptations or modifications to existing tools such as looms, to make it easier to track progress.

Olivia Figueira

Olivia is a student at Santa Clara University pursuing a BS in Computer Science and Engineering with minors in Mathematics and Economics, and will be graduating in June 2021. In the summer of 2019, she participated in CRA-WP’s Distributed Research Experience for Undergraduates (DREU) in the Make4All group with Jennifer Mankoff. She worked closely with Yasaman Sefidgar and Han Zhang to investigate the contribution of correlated stressors to mental health in college students, leveraging actively reported data from surveys and passively sensed data from phones and wearables in the UW EXP study. She hopes to pursue a PhD in Computer Science and explore the field of human-computer interaction further.

Aashaka Desai

Aashaka is a PhD candidate in the UW Paul G. Allen School of Computer Science and Engineering. She is advised by Dr. Jennifer Mankoff and Dr. Richard Ladner. Her research focuses on d/Deaf and hard-of-hearing communication accessibility and explores how we can support all ways of communicating. She explores a range of modalities (speechreading, signing, captioning) as well as languages (multilingualism) in her work. She aims both to document the fluidity of language and communication and to build technologies that support minoritized communication practices.

You can read more about Aashaka’s research at https://aashakadesai.github.io/