Accessible Point-and-Tap Interaction for Acquiring Detailed Information about Tactile Graphics and 3D Models

Andrea Narcisi, Research Scholar

Event Date

Wednesday, March 6th, 2024 – 12:00pm to 1:00pm

I will present a system based on an iPhone app developed in the Coughlan Lab. The system is a novel “Point-and-Tap” interface that enables people who are blind or visually impaired (BVI) to easily acquire multiple levels of information about tactile graphics and 3D models. The interface uses the iPhone’s depth and color cameras to track the user’s hands as they interact with a model. To hear basic information about a feature of interest read aloud, the user points to the feature with their index finger. For additional information, the user lifts the finger and taps the same feature again; this process can be repeated to access further levels of information. No audio labels are triggered unless the user makes a pointing gesture, so the user can explore the model freely with one or both hands. In addition, multiple taps can be issued in rapid succession to skip ahead to the desired information (an utterance in progress is halted whenever the fingertip is lifted off the feature), which is much faster than listening to every level of information played aloud in turn. Experiments with BVI participants demonstrate that the approach is practical, easy to learn, and effective.
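The multi-level tap behavior described above (point to hear a basic label, lift and tap again for deeper levels, speech halted whenever the finger lifts) can be sketched as a small state machine. The sketch below is an illustration of that logic only; the class name, label data, and structure are assumptions for this example, not the Coughlan Lab's actual implementation, which runs on-device using the iPhone's cameras and speech synthesis.

```python
class PointAndTap:
    """Minimal sketch of the tap-level logic: each repeated tap on the
    same feature advances one level of information; pointing at a new
    feature restarts at the basic level."""

    def __init__(self, labels):
        # labels: {feature_name: [level-0 text, level-1 text, ...]}
        # (hypothetical data format for this sketch)
        self.labels = labels
        self.current_feature = None
        self.level = 0
        self.speaking = False

    def tap(self, feature):
        """Fingertip lands on a feature: return the utterance to speak."""
        if feature != self.current_feature:
            # New feature of interest: start with basic information.
            self.current_feature = feature
            self.level = 0
        else:
            # Repeated tap on the same feature: go one level deeper,
            # stopping at the last available level.
            self.level = min(self.level + 1, len(self.labels[feature]) - 1)
        self.speaking = True
        return self.labels[feature][self.level]

    def lift(self):
        """Fingertip lifted: halt any utterance in progress, so rapid
        taps can skip ahead without waiting for speech to finish."""
        self.speaking = False
```

A rapid double tap would then call `tap`, `lift`, `tap` in quick succession, cutting off the level-0 utterance and jumping straight to level 1.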