Scientific

Zoom Colloquium: New developments in measuring stereoscopic vision

Abstract – Recent data have demonstrated limits in the way clinical and psychophysical stereo tests measure stereopsis. Among other findings, we discovered a new family of non-stereoscopic cues and identified failures in standard psychophysical methods. I will present these data and show how to circumvent the issues with new methods and sampling algorithms. We developed three new stereo tests and tested their psychometric properties. Finally, with these tests we revisited the question of the link between stereoacuity and vergence abilities. http://www.aging-vision-action.fr/people/adrien-chopin/

Zoom Colloquium: Scaling Accessible Systems

Abstract – In this talk, I will discuss ongoing work in developing scalable tools for seamlessly interactive accessibility systems. The proposed data-driven tools enable more broadly applicable, ability-based, and needs-aware interaction with diverse people across various platforms and novel scenarios. For example, consider an interactive system (e.g., a wearable system or mobile robot) for assisting individuals with visual impairments. Effective operation of such a system generally relies on careful design, iteration, and validation tailored to the specific ability profile of an intended user. Yet, after real-world deployment, the system may encounter broad and highly diverse interaction scenarios, including novel users and environmental contexts. In such concrete and potentially safety-critical scenarios, each user’s reactions can differ significantly based on various personal factors (type of visual impairment, preferences, background). Thus, manually designing interactive systems at scale can be quite challenging. Instead, I will present a more robust and effective approach for automatically adapting a user model on the fly by continually observing the user. I will demonstrate how the proposed approach can quickly adapt to a new user from only a handful of interactions, flexibly meeting real-time needs (e.g., various skills, mobility aids, unseen conditions) when providing step-by-step indoor navigation assistance. To further facilitate the scalability of accessible autonomous systems, I will present an extensive, realistic, accessibility-centered simulation environment. The environment aims to address inherent data scarcity (e.g., the rarity of pedestrians with disabilities in current datasets for autonomous driving applications) as well as the prohibitive costs of accessibility studies. In particular, the introduced interactive environment enables the training of more robust, adaptive, and inclusive intelligent systems.
Ultimately, the introduced fine-grained personalized interaction tools provide a shared framework for the development and deployment of accessible systems at scale. https://www.bu.edu/cs/profiles/eohnbar/

Zoom Colloquium: Extreme nature – Ocular motor control in Chameleons

Abstract – Chameleons are considered a potentially important model for vision in non-mammalian vertebrates. They provide exceptional behavioral tools for studying eye movements as well as information gathering and analysis. They perform large-amplitude eye movements that are frequently referred to as independent, or disconjugate. Moreover, they have fully decussated optic nerves, with intertectal connections that are not as developed as in mammals. Optical adaptations for arboreal life and insectivory result in retinal image enlargement and the unique capacity to determine target distance from accommodation cues. However, the extent of the eyes’ independence is unclear. For example, can a chameleon visually track two targets simultaneously and monocularly, i.e., one with each eye? And what would be the ocular motor control pattern of the eyes? With their extreme visual capacities, chameleons open up the study of lateralization, decision making, and context-dependent visual behavior. They allow a deeper examination of the relationships between these unique visuo-motor capacities and the central nervous system of reptiles and ectotherms in general, as compared with mammals.

Zoom Brown Bag: Noise in the Machine: Challenges and Potential Solutions for Wearable Eye Tracking, particularly in Individuals with Eccentric Fixation

Abstract – Developments in wearable eye-tracking devices make them an attractive solution for studies of eye movements during naturalistic head/body motion. However, before these systems’ potential can be fully realized, a thorough assessment of potential sources of error is needed. First, I will discuss three possible sources of error for the Pupil Core eye-tracking goggles: camera motion during head/body motion, choice of calibration marker configuration, and eye movement estimation. Turning to a clinical population: loss of the central retina, including the fovea, can lead to a loss of visual acuity and to oculomotor deficits, and thus has profound effects on day-to-day tasks. Recent advances in head-mounted, 3D eye tracking have allowed researchers to extend studies in this population to a broader set of daily tasks and to more naturalistic behaviors and settings. However, decreases in fixational stability, multiple fixational loci (and their uncertain role as oculomotor references), and eccentric fixation all pose additional challenges for calibration and collection of eye movement data. I will show decreases in calibration accuracy relative to fixation eccentricity and discuss a robotic calibration and validation tool we are developing in the lab that may allow for future development of calibration and tracking algorithms designed with this population in mind. https://ski.org/users/natela-shanidze

Zoom Brown Bag: Neural representation of physical and perceived environmental acoustics

Abstract – Most real-world hearing occurs in acoustically cluttered, reverberant environments, making perceptual segregation of sound sources critical. The reverberation signal also carries environmental spatial information of potential use to blind and visually impaired persons. Understanding the neural mechanisms of auditory scene analysis can help identify points of failure in high-level hearing loss and guide behavioral or technological therapeutic interventions. During this talk, I will present preliminary results of an EEG experiment describing the neural representation of statistical regularities of real-world reverberant environments, using Multivariate Pattern Analysis (MVPA). https://ski.org/users/haydee-garcia-lazaro

Zoom Brown Bag: A Friendly Introduction to Git and GitHub (for non-programmers… who have to program)

Abstract – Although programming may once have been a niche skill, the abundance of coding languages, environments, and libraries geared toward practically any scientific pursuit has made programming an indispensable tool in any field. Being able to code one’s own software tools, whether to run or automate experiments, perform analyses, or visualize results, is an empowering and invaluable skill. Those who pick up coding outside of a CS department, or later in their careers, may not be aware of some of the common “best practices” in software development. I will present an introduction to source control with Git, and to GitHub, a web-based repository host for sharing and tracking code. In addition to describing just what Git is, I’ll cover why it might be useful for you and/or your lab, and how to create and use a repository on GitHub.
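As a small taste of the workflow such a talk typically covers, a minimal sketch of putting a project under Git version control might look like the following (the directory name, file name, and identity strings here are placeholders, not anything from the talk itself):

```shell
# Create a project directory and put it under Git version control
mkdir demo-analysis && cd demo-analysis
git init                                   # start tracking this directory
git config user.name  "Your Name"          # tell git who you are (normally
git config user.email "you@example.com"    # done once, with --global)
echo "print('hello')" > analysis.py        # a stand-in for your real code
git add analysis.py                        # stage the new file
git commit -m "Add first analysis script"  # record a snapshot of the staged work
git log --oneline                          # view the history so far
```

Publishing to GitHub then amounts to creating an empty repository on github.com and running `git remote add origin <repository-url>` followed by `git push -u origin main` (or `master`, depending on your Git version's default branch name).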

Image of the first session of the meeting program

Society for Human Brain Mapping and Therapeutics Annual Conference

The 2021 Annual World Congress of SBMT brings together physicians, scientists, policy makers, funding agencies, and industry to further advances and applications in brain and spinal cord mapping and image-guided therapies (operative and non-operative). A trade exhibition within the framework of the SBMT World Congress was held at the Convention Center in Los Angeles, California, from Thursday, July 8th, to Sunday, July 11th, 2021. The conference is designed to create a critical mass by introducing synergy among interdisciplinary researchers to further the understanding of brain function and the nervous system. It serves as a platform from which to develop interactions between the many stakeholders who also have extensive collaborations at national and international levels. The conference provides the opportunity to be at the forefront of brain sciences, of therapeutics in general, and of neural stem cell interventions in particular. It also provides a strong platform for industry and biotech companies to interact with academia at the frontiers of science in this field, in translational initiatives involving diverse patient interest groups.

Zoom Brown Bag: A New Model of Binocular Eye Movement Control

Abstract – Despite the fact that we have two eyes, dominant models of voluntary eye movement control accept a single input and issue a single conjugate command, in accordance with Hering’s Law. When midline gaze shifts are required, Hering postulated that another single command generates vergence. However, many natural gaze shifts are restricted neither to the frontoparallel plane nor to the midline; Hering resolved this by postulating that the conjugate and vergence commands simply sum. These “classic” theories have guided basic oculomotor research and strabismus intervention for over a century, but they have recently been challenged by new data from an occluded eye (Chandna et al., 2021). I will present a new model of binocular control that can explain the occluded-eye data and can generate other human oculomotor behavior that challenges existing models. The model consists of a fast, binocular conjugate component that is modulated by slow, independent controllers on each eye. The independent controllers can use interpreted sensory information to generate eye movements even when sensory evidence is incomplete. https://ski.org/users/steve-heinen

Zoom Brown Bag: Understanding Smooth Pursuit Deficits in Macular Degeneration

Abstract – Age-related macular degeneration (AMD) is the most prevalent cause of central visual field loss. Since the high-acuity fovea (the oculomotor locus) is often impaired in AMD, people afflicted with the condition typically have difficulty with smooth pursuit eye movements. In this brown bag, I will detail two possible causes of smooth pursuit deficits in AMD: 1) noisy sensory input due to objects disappearing into the scotoma, and 2) oculomotor instability due to the use of an eccentric preferred retinal locus (PRL). To address the first cause, we designed an experiment to characterize the noise introduced when the target of pursuit is hidden by the scotoma in AMD. We presented control participants with a Brownian motion display containing an increased-dot-density region into which the target could disappear, and compared pursuit during this disappearance with that of AMD participants. In addition to showing preliminary results, I will present a plan to characterize the effects of oculomotor instability on pursuit in AMD, as well as a candidate model, based on Bayesian frameworks recently used to describe predictive smooth pursuit in individuals with healthy vision, that integrates the two proposed causes of pursuit deficits in AMD. https://ski.org/users/jason-rubinstein
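As a generic illustration of the kind of Bayesian framework the abstract alludes to (not the specific model presented in the talk), reliability-weighted cue combination merges a sensory velocity estimate v_sens and an internal prediction v_pred in inverse proportion to their variances:

```latex
% Generic reliability-weighted (Bayesian) cue combination:
% the combined estimate leans on whichever cue is less noisy.
\hat{v} = \frac{\sigma_{\mathrm{pred}}^{2}\, v_{\mathrm{sens}}
               + \sigma_{\mathrm{sens}}^{2}\, v_{\mathrm{pred}}}
              {\sigma_{\mathrm{sens}}^{2} + \sigma_{\mathrm{pred}}^{2}},
\qquad
\sigma_{\hat{v}}^{2} = \frac{\sigma_{\mathrm{sens}}^{2}\,\sigma_{\mathrm{pred}}^{2}}
                            {\sigma_{\mathrm{sens}}^{2} + \sigma_{\mathrm{pred}}^{2}}
```

Under such a scheme, when the pursuit target enters the scotoma and sensory noise grows, the combined estimate shifts toward the internal prediction, which qualitatively matches how noisy sensory input could degrade pursuit.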

Zoom Colloquium: Eye movements as readout of predictive and decision processes

Abstract – Visually guided orienting eye movements (saccades and smooth pursuit) are primarily determined by the physical properties of the target stimuli. However, cognitive factors such as expectancy and motivation can significantly modulate these eye movements. For instance, past work in our group has highlighted a robust parametric relation between the expected probability of visual motion direction and anticipatory smooth eye movements. I will present recent extensions of the analysis of predictive eye movements across development and across contexts of increasing complexity. In addition, I will illustrate the dynamic relation between visually guided eye movements and visual perceptual decisions. In particular, I will describe a recently identified dissociation between the oculomotor and perceptual biases induced by experience-based motion expectancy. Finally, I will discuss how these findings can be reconciled with the theoretical framework of Bayesian inference for sensorimotor integration and perception.