Supporting Spatial Literacy for Blind Learners using Haptic Technology
Room 204 - Main Conference Room
Rapid advances in data-visualization techniques are driving an increasing requirement for spatial literacy: the ability to read graphs and charts and to interact with dynamic models. This shift is changing how students are expected to learn, and educational curricula are being redesigned to address it. Without a concerted effort on the part of educators and designers of adaptive technologies, blind people could be ruled out of many jobs and professions simply because of the assumption that spatial representations must be 'visual representations.'
As a first step toward addressing this challenge, we need to better understand both the technologies potentially available for rendering images through touch and how to design tactile images that convey information most effectively.
In this talk, I will address both of these issues. I will first provide an overview of haptic technologies and tactile materials currently being developed for blind users and assess the potential of each. I will then review the process by which visual images are converted into hard-copy tactile images with the help of a tactile image designer.