Nonvisual Support for Understanding and Reasoning about Data Structures
Published in Proceedings of the CHI Conference on Human Factors in Computing Systems, 2026
Blind and visually impaired (BVI) computer science students face systematic barriers when learning data structures: current accessibility approaches typically translate diagrams into alternative text, focusing on visual appearance rather than preserving the underlying structure essential for conceptual understanding. More accessible alternatives often do not scale in complexity, cost to produce, or both. Motivated by a recent shift to tools for creating visual diagrams from code, we propose a solution that automatically creates accessible representations from structural information about diagrams. Based on a Wizard-of-Oz study, we derive design requirements for an automated system, Arboretum, that compiles text-based diagram specifications into three synchronized nonvisual formats—tabular, navigable, and tactile. Our evaluation with BVI users highlights the strength of tactile graphics for complex tasks such as binary search; the benefits of offering multiple, complementary nonvisual representations; and the limitations of existing digital navigation patterns for structural reasoning. This work reframes access to data structures around preserving their structural properties and contributes a practical system to advance accessible CS education.
Recommended Citation: Wimer, B., Kanchi, R., Frierson, K., Potluri, V., Metoyer, R., Mankoff, J., Natsuhara, M., & Wang, M. (2026). Nonvisual Support for Understanding and Reasoning about Data Structures. In Proceedings of the CHI Conference on Human Factors in Computing Systems.