Interactions of bottom-up and top-down processes in visual perception
Dr. Thomas Papathomas
Thursday, December 04, 2014, 12:00pm - 07:00pm
Rutgers University, Department of Biomedical Engineering and Laboratory of Vision Research
One of the oldest hypotheses in vision science is that what we perceive depends not only on the signals that come in through our eyes, but also on the state of the observer, on experience-based stored knowledge (in the form of rules or of specific knowledge of objects), on suggestions from others, and so on. Perception is not a passive process, akin to a video camera; it is the product of active processing of the data arriving through our eyes. The visual brain appears to form hypotheses, based on the input signals, about what is out in the world, and then tests these hypotheses against the incoming data. Only when these bottom-up data-processing mechanisms and top-down analyses converge do we achieve a stable percept. I will present evidence supporting this hypothesis from studies of perception under altered states, in pathologies, with bistable illusions and impoverished ambiguous stimuli, and from brain imaging.