• Event Date: 2024-11-12
  • Event Start Time: 2:00 PM
  • Event End Time: 3:30 PM
  • Event Location: 152 Frelinghuysen Rd, Psych Bldg, Busch Campus, Room 105
  • Event Type: Talks: RuCCS Colloquia
  • Event Semester: Fall 2024

Abstract: Sounds and visual stimuli are localized and represented differently by the brain. In this talk, I'll cover two topics: how eye movement signals are incorporated into auditory processing, and how more than one stimulus can be encoded at a time. Eye movements are relevant to visual-auditory integration because they shift the orientation of the retina with respect to the head/ears. We recently discovered that the brain sends signals regarding eye movements to the ears, causing oscillations of the eardrum and producing self-generated sounds that can be detected via earbud microphones (Gruters et al., PNAS 2018). These eye movement-related eardrum oscillations (EMREOs) likely constitute the first step of a coordinate transformation of auditory signals into common coordinates with the visual system (Lovich et al., PNAS 2023). Coding of multiple stimuli is also essential, as the natural world is replete with multiple visual and auditory stimuli at any given moment. We recently proposed that multiple stimuli may be encoded via multiplexing, in which neural activity fluctuates across time to allow representations to encode more than one simultaneous visual or auditory stimulus (Caruso et al., Nat Comm 2018; Jun et al., eLife 2022; Schmehl et al., eLife 2024; Groh et al., TiCS 2024). These findings all emerged from experimentally testing computational models regarding spatial representations and their transformations within and across sensory pathways, highlighting the importance of theory in guiding experimental science.
Bio: Dr. Jennifer Groh