Hybrid Colloquium: Developing Disability-First Datasets for Non-Visual Information Access

Speaker: 

Abigale Stangl is a research scientist working at the intersection of human-computer interaction, accessibility, creativity, privacy, and computer vision.

Host: 

James Coughlan, Senior Scientist and Director of the Coughlan Lab

Meeting room: 

Room 204 - Main Conference Room

Abstract:
Image descriptions, audio descriptions, and tactile media provide non-visual access to the information contained in visual media. As intelligent systems are increasingly developed to provide non-visual access, questions arise about the accuracy of these systems. In this talk, I will present my efforts to involve people who are blind in the development of information taxonomies and annotated datasets that support more accurate and context-aware visual assistance technologies and tactile media interfaces. https://abigalestangl.com/
Improving Zoom accessibility for people with hearing impairments
People with hearing impairments often use lipreading and speechreading to improve speech comprehension. This approach is helpful, but it only works if the speaker’s face and mouth are clearly visible. For the benefit of people with hearing impairments on Zoom calls, please enable your device’s camera whenever you are speaking, and face the camera while you speak. (Feel free to disable your camera when you aren’t speaking.)