Interactiles

The absence of tactile cues such as keys and buttons makes touchscreens difficult to navigate for people with visual impairments. Increasing tactile feedback and tangible interaction on touchscreens can improve their accessibility. However, prior solutions have either required hardware customization or provided limited functionality with static overlays. In addition, prior investigations of tactile solutions for large touchscreens may not address the challenges of mobile devices. We therefore present Interactiles, a low-cost, portable, and unpowered system that enhances tactile interaction on Android touchscreen phones. Interactiles consists of 3D-printed hardware interfaces and software that maps interaction with that hardware to manipulation of a mobile app. The system is compatible with the built-in screen reader and requires no modification of existing mobile apps. We describe the design and implementation of Interactiles, and we evaluate the improvements in task performance and user experience it enables for people who are blind or have low vision.

Xiaoyi Zhang, Tracy Tran, Yuqian Sun, Ian Culhane, Shobhit Jain, James Fogarty, Jennifer Mankoff: Interactiles: 3D Printed Tactile Interfaces to Enhance Mobile Touchscreen Accessibility. ASSETS 2018: To Appear [PDF]

Figure 2. Floating windows created for the number pad (left), scrollbar (right), and control button (bottom right). The windows can be transparent; we use colors for demonstration.
Figure 4. Average task completion times of all tasks in the study.
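The core of the software side is the set of transparent floating windows shown in Figure 2: each sits at a fixed screen position beneath the unpowered 3D-printed hardware and remaps touches in its region to app-level actions. Below is a minimal Python sketch of that dispatch logic; the region coordinates, key layout, and action strings are illustrative assumptions, not the actual implementation (which runs as an Android service alongside the built-in screen reader).

```python
# Hypothetical sketch of Interactiles-style touch dispatch: each floating
# window owns a fixed screen region where the unpowered hardware passes
# touches through to the digitizer, and touches there become app actions.

from dataclasses import dataclass

@dataclass(frozen=True)
class Region:
    name: str
    x0: float; y0: float; x1: float; y1: float  # screen-pixel bounds

    def contains(self, x: float, y: float) -> bool:
        return self.x0 <= x <= self.x1 and self.y0 <= y <= self.y1

# Illustrative layout: a 3x3 number pad lower-left, a scrollbar strip on the right.
NUMBER_PAD = [Region(str(d),
                     40 + (d - 1) % 3 * 90, 900 + (d - 1) // 3 * 90,
                     120 + (d - 1) % 3 * 90, 980 + (d - 1) // 3 * 90)
              for d in range(1, 10)]
SCROLLBAR = Region("scrollbar", 1000, 200, 1060, 1700)

def dispatch(x: float, y: float) -> str:
    """Translate a raw touch into an app action (returned as a string here)."""
    for key in NUMBER_PAD:
        if key.contains(x, y):
            return f"type digit {key.name}"  # forwarded to the focused text field
    if SCROLLBAR.contains(x, y):
        # Map the touch's vertical position to a proportional scroll offset.
        fraction = (y - SCROLLBAR.y0) / (SCROLLBAR.y1 - SCROLLBAR.y0)
        return f"scroll to {fraction:.0%} of content"
    return "pass through to app"

print(dispatch(45, 905))    # -> type digit 1
print(dispatch(1030, 950))  # -> scroll to 50% of content
```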

Nonvisual Interaction Techniques at the Keyboard Surface

Rushil Khurana, Duncan McIsaac, Elliot Lockerman, Jennifer Mankoff: Nonvisual Interaction Techniques at the Keyboard Surface. CHI 2018: To Appear

A table (shown on screen). Columns are mapped to the number row of the keyboard and rows to the leftmost column of keys. (1) By default, the top-left cell is selected. (2) The right hand presses the ‘2’ key, selecting the second column. (3) The left hand selects the next row. (4) The left hand selects the third row. In each case, the position of the cell and its content are read aloud.

Web user interfaces today leverage many common GUI design patterns, including navigation bars and menus (hierarchical structure), tabular content presentation, and scrolling. These visual-spatial cues enhance the interaction experience of sighted users. However, the linear nature of screen translation tools currently available to blind users makes it difficult to understand or navigate these structures. We introduce Spatial Region Interaction Techniques (SPRITEs) for nonvisual access: a novel method for navigating two-dimensional structures using the keyboard surface. SPRITEs 1) preserve spatial layout, 2) enable bimanual interaction, and 3) improve the end user experience. We used a series of design probes to explore different methods for keyboard surface interaction. Our evaluation of SPRITEs shows that three times as many participants were able to complete spatial tasks with SPRITEs as with their preferred current technology.

Talk [Slides]:

Sample Press:

KOMO Radio | New screen reader method helps blind, low-vision users browse complex web pages

Device helps blind, low-vision users better browse web pages, by Allen Cone

Graph showing task completion rates for different kinds of tasks in our user study
A user is searching a table (shown on screen) for the word ‘Jill’. Columns are mapped to the number row of the keyboard and rows to the leftmost column of keys. (1) By default, the top-left cell is selected. (2) The right hand presses the ‘2’ key, selecting the second column. (3) The left hand selects the next row. (4) The left hand selects the third row. In each case, the number of occurrences of the search query in the respective column or row is read aloud. When the query is found, the position and content of the cell are read aloud.
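To make the mapping in these captions concrete, here is a minimal Python sketch of the table interaction, with print() standing in for text-to-speech. The table contents, the keys assigned to rows, and the speak() helper are hypothetical; the real system hooks physical keyboard events and the screen reader’s speech output.

```python
# Sketch of SPRITEs-style table navigation: the number row selects columns,
# the leftmost column of keys selects rows, and every move is spoken.

TABLE = [["Name", "Age", "City"],
         ["Jill", "34", "Derry"],
         ["Omar", "29", "Kent"]]

NUMBER_ROW = "1234567890"   # maps to columns, left to right
LEFT_COLUMN = "qazx"        # illustrative: leftmost physical keys map to rows

def speak(text: str) -> None:
    print(text)  # stand-in for the screen reader's TTS output

def select(key: str, row: int, col: int) -> tuple[int, int]:
    """Update the (row, col) selection from one keypress and announce it."""
    if key in NUMBER_ROW:
        col = NUMBER_ROW.index(key)
    elif key in LEFT_COLUMN:
        row = LEFT_COLUMN.index(key)
    speak(f"row {row + 1}, column {col + 1}: {TABLE[row][col]}")
    return row, col

def announce_search_counts(query: str) -> None:
    """For a search, read how many hits each column holds (as in the figure)."""
    for c in range(len(TABLE[0])):
        hits = sum(query in TABLE[r][c] for r in range(len(TABLE)))
        speak(f"column {c + 1}: {hits} match(es) for '{query}'")

row, col = 0, 0                    # (1) top-left cell selected by default
row, col = select("2", row, col)   # (2) second column -> "row 1, column 2: Age"
row, col = select("a", row, col)   # (3) next row      -> "row 2, column 2: 34"
row, col = select("z", row, col)   # (4) third row     -> "row 3, column 2: 29"
announce_search_counts("Jill")     # counts per column, as in the search figure
```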

The Tangible Desktop

Mark S. Baldwin, Gillian R. Hayes, Oliver L. Haimson, Jennifer Mankoff, Scott E. Hudson: The Tangible Desktop: A Multimodal Approach to Nonvisual Computing. TACCESS 10(3): 9:1-9:28 (2017)

Audio-only interfaces, facilitated through text-to-speech screen reading software, have been the primary mode of computer interaction for blind and low-vision computer users for more than four decades. During this time, the advances that have made visual interfaces faster and easier to use, from direct manipulation to skeuomorphic design, have not been paralleled in nonvisual computing environments. The screen reader–dependent community is left with no alternatives to engage with our rapidly advancing technological infrastructure. In this article, we describe our efforts to understand the problems that exist with audio-only interfaces. Based on observing screen reader use for 4 months at a computer training school for blind and low-vision adults, we identified three problem areas within audio-only interfaces: ephemerality, linear interaction, and unidirectional communication. We then evaluated a multimodal approach to computer interaction called the Tangible Desktop that addresses these problems by moving semantic information from the auditory to the tactile channel. Our evaluation demonstrated that, among novice screen reader users, the Tangible Desktop improved task completion times by an average of 6 minutes compared to traditional audio-only computer systems.
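As a conceptual sketch only, not the authors’ implementation, the channel-routing idea can be expressed as a small dispatcher that sends persistent semantic cues to a tactile channel and leaves ephemeral content on speech. The event kinds and channel names below are hypothetical.

```python
# Conceptual sketch of auditory-vs-tactile routing in the spirit of the
# Tangible Desktop: persistent structure goes to touch so it can be
# re-checked at any time; ephemeral content stays on linear speech.

from enum import Enum

class Channel(Enum):
    SPEECH = "text-to-speech"
    TACTILE = "tactile icon"

def route(event_kind: str, payload: str) -> tuple[Channel, str]:
    """Pick an output channel for a UI event by its semantic role."""
    # Persistent structure ("what am I in?") is well suited to touch,
    # avoiding the ephemerality and linearity of speech output.
    persistent = {"focused_app", "window_role", "progress_state"}
    if event_kind in persistent:
        return Channel.TACTILE, f"raise icon for {payload}"
    # Ephemeral content ("what was just said?") stays on speech.
    return Channel.SPEECH, payload

print(route("window_role", "email inbox"))       # -> tactile icon
print(route("text_line", "Meeting moved to 3pm"))  # -> speech
```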

Also see: Mark S. Baldwin, Jennifer Mankoff, Bonnie A. Nardi, Gillian R. Hayes: An Activity Centered Approach to Nonvisual Computer Interaction. ACM Trans. Comput. Hum. Interact. 27(2): 12:1-12:27 (2020)

Making the field of computing more inclusive for people with disabilities

Lazar, J., Churchill, E. F., Grossman, T., Van der Veer, G., Palanque, P., Morris, J. S., & Mankoff, J. (2017). Making the field of computing more inclusive. Communications of the ACM, 60(3), 50-59.

More accessible conferences, digital resources, and ACM SIGs will lead to greater participation by more people with disabilities. Improving the accessibility of conferences and online materials has been an ongoing project that I’ve been lucky enough to help with. This effort, led by a wide set of people, is currently spearheaded by the SIGCHI Accessibility Community (also on Facebook) and is summarized in a recent Interactions blog post.

A Beam robot Jen used to attend a conference

Volunteer AT Fabricators

Parry-Hill, J., Shih, P. C., Mankoff, J., & Ashbrook, D. Understanding Volunteer AT Fabricators: Opportunities and Challenges in DIY-AT for Others in e-NABLE. Accepted to CHI 2017

We present the results of a study of e-NABLE, a distributed, collaborative volunteer effort to design and fabricate upper-limb assistive technology devices for limb-different users. Informed by interviews with 14 stakeholders in e-NABLE, including volunteers and clinicians, we discuss differences and synergies among each group with respect to motivations, skills, and perceptions of risks inherent in the project. We found that both groups are motivated to be involved in e-NABLE by the ability to use their skills to help others, and that their skill sets are complementary, but that their different perceptions of risk may result in uneven outcomes or missed expectations for end users. We offer four opportunities for design and technology to enhance the stakeholders’ abilities to work together.

A variety of 3D-printed upper-limb assistive technology devices designed and produced by volunteers in the e-NABLE community. Photos were taken by the fourth author in the e-NABLE lab on RIT’s campus.

Tactile Interfaces to Appliances

Anhong Guo, Jeeeun Kim, Xiang ‘Anthony’ Chen, Tom Yeh, Scott E. Hudson, Jennifer Mankoff, & Jeffrey P. Bigham, Facade: Auto-generating Tactile Interfaces to Appliances, In Proceedings of the 35th Annual ACM Conference on Human Factors in Computing Systems (CHI’17), Denver, CO (To appear)

Common appliances have shifted toward flat interface panels, making them inaccessible to blind people. Although blind people can label appliances with Braille stickers, doing so generally requires sighted assistance to identify the original functions and apply the labels. We introduce Facade – a crowdsourced fabrication pipeline to help blind people independently make physical interfaces accessible by adding a 3D printed augmentation of tactile buttons overlaying the original panel. Facade users capture a photo of the appliance with a readily available fiducial marker (a dollar bill) for recovering size information. This image is sent to multiple crowd workers, who work in parallel to quickly label and describe elements of the interface. Facade then generates a 3D model for a layer of tactile and pressable buttons that fits over the original controls. Finally, a home 3D printer or commercial service fabricates the layer, which is then aligned and attached to the interface by the blind person. We demonstrate the viability of Facade in a study with 11 blind participants.
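One step of this pipeline is easy to illustrate: recovering physical scale from the fiducial. A US dollar bill has a known size (roughly 156 x 66 mm), so the bill’s extent in pixels yields a millimeters-per-pixel factor for sizing the printed layer. The sketch below uses made-up pixel measurements; the real system works from photos and crowd-provided labels.

```python
# Minimal sketch of fiducial-based scale recovery, as in Facade's use of a
# dollar bill of known physical size. Pixel values below are illustrative.

BILL_WIDTH_MM = 156.0  # long edge of a US dollar bill, approximately

def mm_per_pixel(bill_width_px: float) -> float:
    """Scale factor: physical millimeters represented by one image pixel."""
    return BILL_WIDTH_MM / bill_width_px

def button_size_mm(w_px: float, h_px: float, scale: float) -> tuple[float, float]:
    """Convert a labeled button's pixel bounding box to print dimensions."""
    return (w_px * scale, h_px * scale)

scale = mm_per_pixel(bill_width_px=624.0)  # bill spans 624 px -> 0.25 mm/px
print(button_size_mm(48.0, 48.0, scale))   # -> (12.0, 12.0): a 12 mm button
```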

Printable Adaptations

Reprise: A Design Tool for Specifying, Generating, and Customizing 3D Printable Adaptations on Everyday Objects

Reprise is a tool for creating custom adaptive 3D-printable designs that make it easier to manipulate everything from tools to zipper pulls. Reprise’s library is based on a survey of about 3,000 assistive technology adaptations and life hacks drawn from textbooks on the topic as well as Thingiverse. Using Reprise, it is possible to specify a type of action (such as grasp or pull), indicate the direction of action on a 3D model of the object being adapted, parameterize the action in a simple GUI, specify an attachment method, and produce a 3D model that is ready to print.

Xiang ‘Anthony’ Chen, Jeeeun Kim, Jennifer Mankoff, Tovi Grossman, Stelian Coros, Scott Hudson (2016). Reprise: A Design Tool for Specifying, Generating, and Customizing 3D Printable Adaptations on Everyday Objects. Proceedings of the 29th Annual ACM Symposium on User Interface Software and Technology (UIST 2016) (pdf)
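To give a flavor of what such a specification might look like, the sketch below compiles a hypothetical action/direction/parameters/attachment record into a trivial OpenSCAD snippet for a lever-style pull adapter. The field names and geometry are illustrative assumptions, not Reprise’s actual representation.

```python
# Hypothetical sketch of a Reprise-style adaptation spec: an action type,
# a direction on the target object, parameters, and an attachment method,
# compiled into a minimal OpenSCAD model.

from dataclasses import dataclass

@dataclass
class Adaptation:
    action: str            # e.g. "grasp", "pull", "rotate"
    direction: str         # axis of action on the object, e.g. "+z"
    lever_length_mm: float
    grip_diameter_mm: float
    attachment: str        # e.g. "strap", "clamp", "press-fit"

def to_openscad(a: Adaptation) -> str:
    """Emit a minimal OpenSCAD model: a lever arm ending in a grip ball."""
    return "\n".join([
        f"// {a.action} adapter along {a.direction}, attach by {a.attachment}",
        f"cylinder(h={a.lever_length_mm}, d=8);",
        f"translate([0, 0, {a.lever_length_mm}])",
        f"  sphere(d={a.grip_diameter_mm});",
    ])

zipper_pull = Adaptation("pull", "+z", lever_length_mm=40,
                         grip_diameter_mm=18, attachment="press-fit")
print(to_openscad(zipper_pull))
```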

Helping Hands

Prosthetic limbs and assistive technology (AT) require customization and modification over time to effectively meet the needs of end users. Yet this process is typically costly and, as a result, abandonment rates are very high. Rapid prototyping technologies such as 3D printing have begun to alleviate this issue by making it possible to inexpensively and iteratively create general AT designs and prosthetics. However, for effective use, technology must be applied using design methods that support physical rapid prototyping and can accommodate the unique needs of a specific user. While most research has focused on tools for creating fitted assistive devices, we focus on the requirements of a design process that engages the user and designer in the rapid iterative prototyping of prosthetic devices.

We present a case study of three participants with upper-limb amputations working with researchers to design prosthetic devices for specific tasks. Kevin wanted to play the cello, Ellen wanted to ride a hand-cycle (a bicycle for people with lower limb mobility impairments), and Bret wanted to use a table knife. Our goal was to identify requirements for a design process that can engage the assistive technology user in rapidly prototyping assistive devices that fill needs not easily met by traditional assistive technology. Our study made use of 3D printing and other playful and practical prototyping materials. We discuss materials that support on-the-spot design and iteration, dimensions along which in-person iteration is most important (such as length and angle), and the value of a supportive social network for users who prototype their own assistive technology. From these findings, we argue for the importance of supporting modularity, community engagement, and relatable prototyping materials in the iterative design of prosthetics.
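As a minimal sketch of what rapid in-person iteration could look like in code, the snippet below sweeps the two dimensions the study highlights, length and angle, over a hypothetical attachment arm, writing one small OpenSCAD variant per combination so several candidates can be printed and compared in a single session. The part names and geometry are made up for illustration.

```python
# Hypothetical parameter sweep for in-person prosthetic attachment fitting:
# print a coarse grid of variants, then refine around whichever fits best.

def attachment_arm(length_mm: float, angle_deg: float) -> str:
    """One print-ready variant: an angled arm joined to a socket mount."""
    return "\n".join([
        f"// arm variant: length={length_mm} mm, angle={angle_deg} deg",
        "cylinder(h=10, d=30);              // socket mount (fixed)",
        f"rotate([{angle_deg}, 0, 0])",
        f"  cylinder(h={length_mm}, d=12);  // task-specific arm",
    ])

for length in (60, 80, 100):
    for angle in (0, 15, 30):
        with open(f"arm_L{length}_A{angle}.scad", "w") as f:
            f.write(attachment_arm(length, angle))
```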

Photos

Project Files

https://www.thingiverse.com/thing:2365703

Project Publications

Helping Hands: Requirements for a Prototyping Methodology for Upper-limb Prosthetics Users

Reference:

Megan Kelly Hofmann, Jeffery Harris, Scott E. Hudson, Jennifer Mankoff. 2016. Helping Hands: Requirements for a Prototyping Methodology for Upper-limb Prosthetics Users. In Proceedings of the 34th Annual ACM Conference on Human Factors in Computing Systems (CHI ’16). ACM, New York, NY, USA, 525-534.

Making Connections: Modular 3D Printing for Designing Assistive Attachments to Prosthetic Devices

Reference:

Megan Kelly Hofmann. 2015. Making Connections: Modular 3D Printing for Designing Assistive Attachments to Prosthetic Devices. In Proceedings of the 17th International ACM SIGACCESS Conference on Computers & Accessibility (ASSETS ’15). ACM, New York, NY, USA, 353-354. DOI=http://dx.doi.org/10.1145/2700648.2811323

Supporting Navigation in the Wild for the Blind

Sighted individuals often develop significant knowledge about their environment through what they can visually observe. In contrast, individuals who are visually impaired mostly acquire such knowledge about their environment through information that is explicitly related to them. Our work examines the practices that visually impaired individuals use to learn about their environments and the associated challenges. In the first of our two studies, we uncover four types of information needed to master and navigate the environment. We detail how individuals’ context impacts their ability to learn this information, and outline requirements for independent spatial learning. In a second study, we explore how individuals learn about places and activities in their environment. Our findings show that users not only learn information to satisfy their immediate needs, but also to enable future opportunities – something existing technologies do not fully support. From these findings, we discuss future research and design opportunities to assist the visually impaired in independent spatial learning.

Uncovering information needs for independent spatial learning for users who are visually impaired. Nikola Banovic, Rachel L. Franz, Khai N. Truong, Jennifer Mankoff, and Anind K. Dey. In Proceedings of the 15th International ACM SIGACCESS Conference on Computers and Accessibility (ASSETS ’13). ACM, New York, NY, USA, Article 24, 8 pages. (pdf)

3D Printed Prosthetics: Case Study

Readings

  • Megan Hofmann, Julie Burke, Jon Pearlman, Goeran Fiedler, Andrea Hess, Jon Schull, Scott E. Hudson, Jennifer Mankoff: Clinical and Maker Perspectives on the Design of Assistive Technology with Rapid Prototyping Technologies. ASSETS 2016: 251-256
  • Cynthia L. Bennett, Keting Cen, Katherine Muterspaugh Steele, Daniela K. Rosner: An Intimate Laboratory?: Prostheses as a Tool for Experimenting with Identity and Normalcy. CHI 2016: 1745-1756

Optional: 

  • Jeremiah Parry-Hill, Patrick C. Shih, Jennifer Mankoff, Daniel Ashbrook: Understanding Volunteer AT Fabricators: Opportunities and Challenges in DIY-AT for Others in e-NABLE. CHI 2017: 6184-6194