Generative Artificial Intelligence’s Utility for Accessibility

With the recent rapid rise of Generative Artificial Intelligence (GAI) tools, it is imperative that we understand their impact on people with disabilities, both positive and negative. However, although we know that AI in general poses both risks and opportunities for people with disabilities, little is known about GAI in particular.

To address this, we conducted a three-month autoethnography of our use of GAI to meet personal and professional needs as a team of researchers with and without disabilities. Our findings demonstrate a wide variety of potential accessibility-related uses for GAI while also highlighting concerns around verifiability, training data, ableism, and false promises.

Glazko, K. S., Yamagami, M., Desai, A., Mack, K. A., Potluri, V., Xu, X., & Mankoff, J. An Autoethnographic Case Study of Generative Artificial Intelligence’s Utility for Accessibility. ASSETS 2023. https://dl.acm.org/doi/abs/10.1145/3597638.3614548

News: Can AI help boost accessibility? These researchers tested it for themselves

Presentation (starts at about 20 minutes)

https://youtube.com/watch?v=S40-jPBH820&si=Cm17oTaMaDnoQGvK#t=20m26s

How Do People with Limited Movement Personalize Upper-Body Gestures?

Personalized upper-body gestures that enable input from diverse body parts (e.g., head, neck, shoulders, arms, hands, and fingers) and match the abilities of each user might make gesture systems more accessible for people with upper-body motor disabilities. Static gesture sets that make ability assumptions about the user (e.g., touch thumb and index finger together in midair) may not be accessible. In our work, we characterize the personalized gesture sets designed by 25 participants with upper-body motor disabilities. We found that the personalized gesture sets participants designed were specific to their abilities and needs. Six participants mentioned that their inspiration for designing the gestures was based on “how I would do [the gesture] with the abilities that I have”. We suggest three considerations when designing accessible upper-body gesture interfaces:

1) Track the whole upper body. Our participants used their whole upper body to perform the gestures, and some switched back and forth between the left and right hand to combat fatigue.

2) Use sensing mechanisms that are agnostic to the location and orientation of the body. About half of our participants kept their hand on, or barely lifted it off, the armrest to decrease arm movement and fatigue.

3) Use sensors that can sense muscle activations without movement. Our participants activated their muscles but did not visibly move in 10% of the personalized gestures.   

Our work highlights the need for personalized upper-body gesture interfaces supported by multimodal biosignal sensors (e.g., accelerometers and muscle-activity sensors such as electromyography (EMG)).
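To make the multimodal-sensing idea concrete, here is a minimal sketch (a hypothetical illustration, not the study's actual pipeline) that fuses windowed accelerometer and EMG features so a gesture registers even when muscles activate without visible movement. The function name, thresholds, and data shapes are all assumptions.

```python
import numpy as np

def detect_gesture_events(accel_delta, emg_envelope,
                          accel_thresh=0.05, emg_thresh=0.3):
    """Flag analysis windows in which a gesture may have occurred.

    accel_delta:  (n_windows, 3) mean absolute change in acceleration per window
    emg_envelope: (n_windows,) normalized EMG activity per window (0..1)
    The thresholds are placeholders; a real system would calibrate them per user.
    """
    movement = np.linalg.norm(accel_delta, axis=1) > accel_thresh
    activation = emg_envelope > emg_thresh
    # A window counts as a gesture if EITHER channel fires, so muscle
    # activations without visible movement (about 10% of gestures in the
    # study) are still captured.
    return movement | activation

# Example: the second window shows EMG activity but almost no movement.
accel = np.array([[0.00, 0.01, 0.00], [0.01, 0.00, 0.01], [0.20, 0.05, 0.10]])
emg = np.array([0.05, 0.60, 0.70])
print(detect_gesture_events(accel, emg))  # [False  True  True]
```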

Race, Disability and Accessibility Technology

Working at the Intersection of Race, Disability, and Accessibility

Examinations of intersectionality and identity dimensions in accessibility research have primarily considered disability separately from a person’s race and ethnicity. Accessibility work often does not include considerations of race as a construct, or treats race as a shallow demographic variable, if race is mentioned at all. The lack of attention to race as a construct in accessibility research presents an oversight in our field, often systematically eliminating whole areas of need and vital perspectives from the work we do. Further, there has been little focus on the intersection of race and disability within accessibility research, and the relevance of their interplay. When research in race or disability does not mention the other, it overlooks the potential to better understand the full nuance of marginalized and “otherized” groups. To address this gap, we present a series of case studies exploring the potential for research that lies at the intersection of race and disability. Through positive examples found in these case studies, we show how to integrate racial equity perspectives into accessibility research, and we reflect on teaching at the intersection of race, disability, and technology. This paper highlights the value of considering how constructs of race and disability work alongside each other within accessibility research studies, designs of socio-technical systems, and education. Our analysis provides recommendations toward establishing this research direction.

Christina N. Harrington, Aashaka Desai, Aaleyah Lewis, Sanika Moharana, Anne Spencer Ross, Jennifer Mankoff: Working at the Intersection of Race, Disability and Accessibility. ASSETS 2023: 26:1-26:18 (pdf)

https://youtube.com/watch?v=qRMYjdSTnZs&si=0yhLkUyGKu-WO4Na

Azimuth: Designing Accessible Dashboards for Screen Reader Users

Dashboards are frequently used to monitor and share data across a breadth of domains, including business, finance, sports, public policy, and healthcare. The combination of different components (e.g., key performance indicators, charts, filtering widgets) and the interactivity between components makes dashboards powerful interfaces for data monitoring and analysis. However, these very characteristics also often make dashboards inaccessible to blind and low vision (BLV) users. Through a co-design study with two screen reader users, we investigate challenges faced by BLV users and identify design goals to support effective screen reader-based interactions with dashboards. Operationalizing the findings from the co-design process, we present a prototype system, Azimuth, that generates dashboards optimized for screen reader-based navigation, along with complementary descriptions to support dashboard comprehension and interaction. Based on a follow-up study with five BLV participants, we showcase how our generated dashboards support BLV users and enable them to perform both targeted and open-ended analysis. Reflecting on our design process and study feedback, we discuss opportunities for future work on supporting interactive data analysis, understanding dashboard accessibility at scale, and investigating alternative devices and modalities for designing accessible visualization dashboards.

Arjun Srinivasan, Tim Harshbarger, Darrell Hilliker, and Jennifer Mankoff: Azimuth: Designing Accessible Dashboards for Screen Reader Users. ASSETS 2023.
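To give a flavor of what screen-reader-optimized dashboard output can look like, here is a minimal, hypothetical sketch (not Azimuth's actual implementation or data model): each component carries a navigation rank and a generated text description, and the dashboard is linearized in rank order so a screen reader can step through it predictably. All names, fields, and example values below are assumptions for illustration.

```python
from dataclasses import dataclass

@dataclass
class Component:
    title: str      # e.g., "Revenue by region"
    kind: str       # "kpi", "chart", or "filter"
    summary: str    # generated description read by the screen reader
    rank: int       # navigation order for screen reader traversal

def linearize(components):
    """Return spoken text for each component in screen-reader navigation order."""
    ordered = sorted(components, key=lambda c: c.rank)
    return [f"{c.kind.upper()}: {c.title}. {c.summary}" for c in ordered]

dashboard = [
    Component("Total revenue", "kpi", "4.2 million dollars, up 3 percent.", 1),
    Component("Revenue by region", "chart", "Bar chart; West leads at 1.6 million.", 2),
    Component("Quarter filter", "filter", "Currently set to Q3.", 3),
]
for line in linearize(dashboard):
    print(line)
```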

The Role of Speechreading in Online d/DHH Communication Accessibility

Speechreading is the art of using visual and contextual cues in the environment to support listening. Often used by d/Deaf and Hard-of-Hearing (d/DHH) individuals, it highlights the nuances of rich communication. However, the lived experiences of speechreaders are underdocumented in the literature, and neither the impact of online environments nor the interaction of captioning with speechreading has been explored. To bridge these gaps, we conducted a three-part study consisting of formative interviews, design probes, and design sessions with 12 d/DHH individuals who speechread.

Making a Medical Maker’s Playbook: An Ethnographic Study of Safety-Critical Collective Design by Makers in Response to COVID-19

Megan Hofmann, Udaya Lakshmi, Kelly Mack, Rosa I. Arriaga, Scott E. Hudson, and Jennifer Mankoff. Making a Medical Maker’s Playbook: An Ethnographic Study of Safety-Critical Collective Design by Makers in Response to COVID-19. Proc. ACM Hum. Comput. Interact. 6(CSCW1): 101:1-101:26 (2022).

We present an ethnographic study of a maker community that conducted safety-driven medical making to deliver over 80,000 devices for use at medical facilities in response to the COVID-19 pandemic. To achieve this, the community had to balance their clinical value of safety with the maker value of broadened participation in design and production. We analyze their struggles and achievements through the artifacts they produced and the labor of key facilitators who bridged diverse community members. Based on this analysis, we provide insights into how medical maker communities, which are necessarily risk-averse and safety-oriented, can still support makers’ grassroots efforts to care for their communities. Based on these findings, we recommend that design tools enable adaptation to a wider set of domains, rather than exclusively presenting information relevant to manufacturing. Further, we call for future work on the portability of designs across different types of printers, which could enable broader participation in future maker efforts at this scale.

PSST: Enabling Blind or Visually Impaired Developers to Author Sonifications of Streaming Sensor Data

Venkatesh Potluri, John Thompson, James Devine, Bongshin Lee, Nora Morsi, Peli De Halleux, Steve Hodges, and Jennifer Mankoff. 2022. PSST: Enabling Blind or Visually Impaired Developers to Author Sonifications of Streaming Sensor Data. In Proceedings of the 35th Annual ACM Symposium on User Interface Software and Technology (UIST ’22). Association for Computing Machinery, New York, NY, USA, Article 46, 1–13. https://doi.org/10.1145/3526113.3545700

We present the first toolkit that equips blind and visually impaired (BVI) developers with the tools to create accessible data displays. Called PSST (Physical Computing Streaming Sensor data Toolkit), it enables BVI developers to understand the data generated by sensors, from a mouse to a micro:bit physical computing platform. Earlier efforts to make physical computing accessible assume visual abilities and thus fail to address the need for BVI developers to access sensor data. PSST enables BVI developers to understand real-time, real-world sensor data by providing control over what to display, as well as when and how to display it. PSST supports filtering based on raw or calculated values, highlighting, and transformation of data. Output formats include tonal sonification, non-speech audio files, speech, and SVGs for laser cutting. We validate PSST through a series of demonstrations and a user study with BVI developers.

The demo video can be found here: https://youtu.be/UDIl9krawxg.
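As a rough illustration of the kind of mapping such a toolkit supports, the sketch below filters a stream of raw sensor readings and maps each surviving value onto a tone frequency for sonification. This is a hypothetical example, not PSST's actual API; the function name, parameter names, and value ranges are assumptions.

```python
def sonify_stream(readings, lo=0.0, hi=1023.0, f_min=220.0, f_max=880.0,
                  keep=lambda v: True):
    """Map filtered sensor readings onto a pitch range.

    readings: iterable of raw sensor values (e.g., micro:bit light levels 0-1023)
    keep:     filter predicate, e.g., lambda v: v > 500 to highlight bright readings
    Returns a list of tone frequencies in Hz; a real toolkit would play or
    export these rather than just return them.
    """
    freqs = []
    for v in readings:
        if not keep(v):
            continue
        t = (min(max(v, lo), hi) - lo) / (hi - lo)  # normalize to 0..1
        freqs.append(f_min + t * (f_max - f_min))   # linear pitch mapping
    return freqs

# Example: sonify only the bright half of a light-sensor trace.
print(sonify_stream([120, 640, 980, 300, 770], keep=lambda v: v > 500))
```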

Chronically Under-Addressed: Considerations for HCI Accessibility Practice with Chronically Ill People

Accessible design and technology could support the large and growing group of people with chronic illnesses. However, human-computer interaction (HCI) has largely approached people with chronic illnesses through a lens of medical tracking or treatment rather than accessibility. We describe and demonstrate a framework for designing technology in ways that center the chronically ill experience. First, we identify guiding tenets: 1) treating chronically ill people not as patients but as people with access needs and expertise, 2) recognizing the way that variable ability shapes accessibility considerations, and 3) adopting a theoretical understanding of chronic illness that attends to the body. We then illustrate these tenets through autoethnographic case studies of two chronically ill authors using technology. Finally, we discuss implications for technology design, including designing for consequence-based accessibility, considering how to engage care communities, and how HCI researchers can engage chronically ill participants in research.

Kelly Mack*, Emma J. McDonnell*, Leah Findlater, and Heather D. Evans. In The 24th International ACM SIGACCESS Conference on Computers and Accessibility.

COVID-19 and Remote Learning for Students with Disabilities

Han Zhang, Margaret E. Morris, Paula S. Nurius, Kelly Mack, Jennifer Brown, Kevin S. Kuehn, Yasaman S. Sefidgar, Xuhai Xu, Eve A. Riskin, Anind K. Dey and Jennifer Mankoff. Impact of Online Learning in the Context of COVID-19 on Undergraduates with Disabilities and Mental Health Concerns. ACM Transactions on Accessible Computing (TACCESS).

The COVID-19 pandemic upended college education and the experiences of students due to the rapid and uneven shift to online learning. This study examined the experiences of students with disabilities with online learning, with a consideration of surrounding stressors such as financial pressures. In a mixed-methods approach, we compared 28 undergraduate students with disabilities (including mental health concerns) to their peers during 2020, to assess differences and similarities in their educational concerns, stress levels, and COVID-19-related adversities. We found that students with disabilities entered the Spring quarter of 2020 with significantly higher concerns about classes going online, and reported more recent negative life events than other students. These differences between the two groups diminished three months later, with the exception of recent negative life events. For a fuller understanding of students’ experiences, we conducted qualitative analysis of open-ended interviews. We examined both positive and negative experiences with online learning among students with disabilities and mental health concerns. Online learning enabled greater access in some ways (e.g., reducing the need to travel to campus) while impeding academic engagement in others (e.g., reducing interpersonal interaction). Learning systems need to continue to meet the diverse and dynamic needs of students with disabilities.

Maptimizer

Megan Hofmann, Kelly Mack, Jessica Birchfield, Jerry Cao, Autumn G. Hughes, Shriya Kurpad, Kathryn J. Lum, Emily Warnock, Anat Caspi, Scott E. Hudson, Jennifer Mankoff:
Maptimizer: Using Optimization to Tailor Tactile Maps to Users Needs. CHI 2022: 592:1-592:15 [pdf]

Tactile maps can help people who are blind or have low vision navigate and familiarize themselves with unfamiliar locations. Because tactile maps offer limited space for representation, they are ideally created with an individual’s unique needs and abilities in mind. However, existing tools for generating tactile maps do not support significant customization. We present Maptimizer, a system that generates tactile maps customized to a user’s preferences and requirements while keeping the maps simplified and easy to read. Maptimizer uses a two-stage optimization process to pair representations with geographic information and to tune those representations to present that information more clearly. In a user study with six blind/low-vision participants, Maptimizer helped participants more successfully and efficiently identify locations of interest in unknown areas. These results demonstrate the utility of optimization techniques and generative design in complex accessibility domains that require significant customization by the end user.

A system diagram showing the Maptimizer data flow. The inputs are geography sets, representation options, and user preferences. Geography types and representation options are paired and tuned using an optimizer. The output is a tactile map.
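To illustrate the flavor of a two-stage approach, the toy sketch below (hypothetical; not the paper's actual optimizer, cost model, or representation set) first assigns each geography type to a distinct tactile representation by minimizing a mismatch cost, then tunes a size scale per pairing to trade clarity against crowding.

```python
from itertools import permutations

GEOGRAPHIES = ["roads", "buildings", "water"]
REPRESENTATIONS = ["raised_line", "filled_area", "dot_texture"]

# Hypothetical mismatch costs (lower = better fit), standing in for the
# user-preference and legibility terms a real system would elicit or learn.
COST = {
    ("roads", "raised_line"): 0.1, ("roads", "filled_area"): 0.9, ("roads", "dot_texture"): 0.6,
    ("buildings", "raised_line"): 0.7, ("buildings", "filled_area"): 0.2, ("buildings", "dot_texture"): 0.5,
    ("water", "raised_line"): 0.8, ("water", "filled_area"): 0.4, ("water", "dot_texture"): 0.3,
}

def stage_one_pair():
    """Choose the one-to-one pairing of geographies to representations with minimum total cost."""
    best = min(permutations(REPRESENTATIONS),
               key=lambda perm: sum(COST[(g, r)] for g, r in zip(GEOGRAPHIES, perm)))
    return dict(zip(GEOGRAPHIES, best))

def stage_two_tune(pairing, scales=(0.5, 1.0, 1.5, 2.0)):
    """Pick a size scale per pairing, trading tactile clarity against map crowding."""
    tuned = {}
    for geo, rep in pairing.items():
        # Toy objective: larger features are easier to feel (clarity ~ scale)
        # but crowd the map (penalty ~ scale squared).
        tuned[geo] = (rep, max(scales, key=lambda s: s - 0.3 * s * s))
    return tuned

print(stage_two_tune(stage_one_pair()))
```

A real system would derive the costs from user preferences and search a much larger design space, but the pairing-then-tuning structure sketched here is the same.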