Evaluating Author and User Experience for an Audio-Haptic System for Annotation of Physical Models

Publication Type: Conference Paper
Year of Publication: 2017
Authors: Coughlan, J., Miele, J.
Conference Name: 19th Int’l ACM SIGACCESS Conference on Computers and Accessibility (ASSETS 2017)
Date Published: 10/2017
Publisher: ACM
Conference Location: Baltimore, MD
Other Numbers: PMCID: PMC5714613
Abstract

We describe three usability studies involving a prototype system for creation and haptic exploration of labeled locations on 3D objects. The system uses a computer, webcam, and fiducial markers to associate a physical 3D object in the camera’s view with a pre-defined digital map of labeled locations (“hotspots”), and to do real-time finger tracking, allowing a blind or visually impaired user to explore the object and hear individual labels spoken as each hotspot is touched. This paper describes: (a) a formative study with blind users exploring pre-annotated objects to assess system usability and accuracy; (b) a focus group of blind participants who used the system and, through structured and unstructured discussion, provided feedback on its practicality, possible applications, and real-world potential; and (c) a formative study in which a sighted adult used the system to add labels to on-screen images of objects, demonstrating the practicality of remote annotation of 3D models. These studies and related literature suggest potential for future iterations of the system to benefit blind and visually impaired users in educational, professional, and recreational contexts.
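The touch-to-label loop the abstract describes (registering the object's pose from the fiducial markers, tracking the user's fingertip, and speaking a hotspot's label when it is touched) can be sketched in a few lines. The Python below is a minimal illustration, not the authors' implementation: the Hotspot record, the 10 mm touch threshold, and the tracking and speech functions named in the comments are all assumptions for the sake of the sketch.

    import math
    from dataclasses import dataclass

    # Hypothetical hotspot record: a labeled 3D point expressed in the
    # object's coordinate frame (registered via the fiducial markers).
    @dataclass
    class Hotspot:
        label: str
        position: tuple  # (x, y, z), e.g. in millimeters

    TOUCH_RADIUS_MM = 10.0  # assumed touch threshold; not specified in the paper

    def find_touched_hotspot(fingertip, hotspots):
        """Return the nearest hotspot within TOUCH_RADIUS_MM of the tracked
        fingertip, or None. The fingertip position is assumed to have been
        transformed into the object frame using the fiducial pose estimate."""
        best, best_dist = None, TOUCH_RADIUS_MM
        for spot in hotspots:
            dist = math.dist(fingertip, spot.position)
            if dist < best_dist:
                best, best_dist = spot, dist
        return best

    # Per-frame loop (the stage names here are placeholders, not the authors' API):
    #   pose = estimate_object_pose(frame)          # from the fiducial markers
    #   tip  = track_fingertip(frame, pose)         # fingertip in object frame
    #   hit  = find_touched_hotspot(tip, hotspots)
    #   if hit: speak(hit.label)                    # text-to-speech announcement

    hotspots = [Hotspot("handle", (0.0, 12.5, 3.0)),
                Hotspot("spout", (40.0, 5.0, 8.0))]
    print(find_touched_hotspot((1.0, 13.0, 3.5), hotspots).label)  # -> handle

Choosing the nearest hotspot within the threshold, rather than the first match, keeps the spoken announcement unambiguous when labeled locations sit close together on the object.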

Refereed Designation: Refereed
