Arboretum interface showing three accessible binary tree representations: a tabular node summary, a tactile braille-style tree, and a navigable visual tree, along with an automatically generated text description.

Nonvisual Support for Understanding and Reasoning about Data Structures

Wimer, B.*, Kanchi, R.*, Frierson, K., Potluri, V., Metoyer, R., Mankoff, J., Natsuhara, M., & Wang, M. (2026). Nonvisual Support for Understanding and Reasoning about Data Structures. In Proceedings of the 2026 CHI Conference on Human Factors in Computing Systems (CHI ’26), April 13–17.

Blind and visually impaired (BVI) computer science students face systematic barriers when learning data structures: current accessibility approaches typically translate diagrams into alternative text, focusing on visual appearance rather than preserving the underlying structure essential for conceptual understanding. More accessible alternatives often do not scale in complexity, cost to produce, or both. Motivated by a recent shift toward tools that create visual diagrams from code, we propose a solution that automatically creates accessible representations from structural information about diagrams. Based on a Wizard-of-Oz study, we derive design requirements for an automated system, Arboretum, that compiles text-based diagram specifications into three synchronized nonvisual formats—tabular, navigable, and tactile. Our evaluation with BVI users highlights the strength of tactile graphics for complex tasks such as binary search; the benefits of offering multiple, complementary nonvisual representations; and limitations of existing digital navigation patterns for structural reasoning. This work reframes access to data structures by preserving their structural properties and contributes a practical system to advance accessible CS education.
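To make the compilation idea concrete, the sketch below shows how a text-based binary tree specification might be turned into a tabular node summary, one of the three output formats described above. The specification syntax, function names, and table columns here are illustrative assumptions for exposition, not Arboretum's actual input language or implementation.

```python
# Hypothetical sketch: compiling a text-based binary tree specification
# into a tabular node summary. The 'parent -> left right' spec format
# (with '-' marking an absent child) is an assumed example syntax.

def parse_spec(spec):
    """Parse lines like 'A -> B C' into a {node: (left, right)} map."""
    children = {}
    for line in spec.strip().splitlines():
        parent, _, kids = line.partition("->")
        left, right = kids.split()
        children[parent.strip()] = (
            None if left == "-" else left,
            None if right == "-" else right,
        )
    return children

def tabular_summary(children, root):
    """Return rows (node, depth, left, right) via preorder traversal,
    so a screen-reader user can scan structure row by row."""
    rows = []
    def visit(node, depth):
        if node is None:
            return
        left, right = children.get(node, (None, None))
        rows.append((node, depth, left or "-", right or "-"))
        visit(left, depth + 1)
        visit(right, depth + 1)
    visit(root, 0)
    return rows

spec = """
A -> B C
B -> D -
C -> - E
"""
for row in tabular_summary(parse_spec(spec), "A"):
    print(row)  # e.g. ('A', 0, 'B', 'C')
```

Because the table is derived from the same structural specification as the navigable and tactile views, all three representations stay synchronized by construction.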
