Hybrid Colloquium: Harnessing the Computer Science-Vision Science Symbiosis
Upcoming Event Date:
Speaker: Yuhao Zhu is an Assistant Professor of Computer Science at the University of Rochester
Host: Dr. Christopher Tyler
Meeting room: Room 204 - Main Conference Room
Emerging platforms such as Augmented Reality (AR), Virtual Reality (VR), and autonomous machines all interact intimately with humans. They must therefore be built from the ground up with principled consideration of human perception. This talk will discuss some of our recent work on exploiting the symbiosis between Computer Science and Vision Science.
We will discuss how to jointly optimize imaging, computing, and human perception to obtain unprecedented efficiency. The overarching theme is that a computing problem that seems challenging may become significantly easier when one considers how computing interacts with imaging and human perception in an end-to-end system. In particular, we will discuss two specific projects: real-time eye tracking for AR/VR and power optimization for VR displays. If time permits, I will also briefly discuss our ongoing work on using computational techniques to help dichromats regain trichromatic vision.
Improving Zoom accessibility for people with hearing impairments
People with hearing impairments often use lipreading and speechreading to improve speech comprehension. This approach is helpful, but only when the speaker’s face and mouth are clearly visible. For the benefit of people with hearing impairments on Zoom calls, please enable your device’s camera whenever you are speaking, and face the camera while you speak. (Feel free to disable your camera when you aren’t speaking.)