Video footage from RuCCS Colloquium Talks can be found on the RuCCS YouTube Channel. For all other events, please check the sponsor's website for more details.
To filter by event category, click on the event category link in the table below or use the menu on the right.
List of Past Events
A Model of Top-Down Control of Attention during Visual Search in Real-World Scenes
Monday, March 31, 2008, 01:00pm - 02:00pm
University of Massachusetts at Boston, Department of Computer Science
Recently, there has been great interest among vision researchers in devising computational models that predict the distribution of saccadic endpoints in naturalistic scenes. In these studies, subjects are instructed to view the scenes without any particular task in mind, so that stimulus-driven (bottom-up) processes are assumed to guide visual attention. However, whenever there is a task, additional goal-driven (top-down) processes play an important - and most often dominant - role. We demonstrated that during visual search in real-world scenes, attention is systematically biased towards low-level image features that resemble those of the search target. Based on these findings, we devised a computational model of top-down attentional control following three basic principles: First, visual similarity between the search target and local image portions for several stimulus dimensions is defined using a histogram-matching technique. Second, the informativeness of these dimensions for a given search display is computed as an entropy-related function of the target-similarity "landscape". Third, as suggested by previous studies, more informative dimensions are assumed to have a greater influence on attentional selection in visual search. We evaluated the model by having it predict the distribution of saccadic endpoints in another experiment using real-world search displays. The predicted distributions revealed a strong similarity to the empirically observed ones, suggesting that the model identifies the most important factors contributing to top-down attentional control in visual search.
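The three principles in the abstract - histogram-based target similarity, entropy-related informativeness, and informativeness-weighted combination - can be sketched roughly as follows. This is an illustrative reconstruction, not the authors' actual implementation; all function names, the choice of histogram intersection as the matching technique, and the particular normalized-entropy weighting are assumptions.

```python
import numpy as np

def histogram_similarity(target_hist, patch_hist):
    # Histogram intersection: one common histogram-matching technique
    # (the paper's exact similarity measure may differ).
    return np.minimum(target_hist, patch_hist).sum()

def informativeness(similarity_map, eps=1e-12):
    # Entropy-related informativeness of one stimulus dimension:
    # a flat target-similarity "landscape" (high entropy) is
    # uninformative; a peaked one (low entropy) is informative.
    p = similarity_map / (similarity_map.sum() + eps)
    entropy = -(p * np.log(p + eps)).sum()
    max_entropy = np.log(p.size)        # entropy of a uniform landscape
    return 1.0 - entropy / max_entropy  # 0 = flat, 1 = fully peaked

def topdown_attention_map(target_hists, patch_hists):
    # Combine per-dimension similarity maps, giving more informative
    # dimensions a greater influence on attentional selection.
    sim_maps, weights = [], []
    for dim in target_hists:
        sim = np.array([histogram_similarity(target_hists[dim], h)
                        for h in patch_hists[dim]])
        sim_maps.append(sim)
        weights.append(informativeness(sim))
    weights = np.array(weights)
    weights /= weights.sum()
    return sum(w * s for w, s in zip(weights, sim_maps))

# Toy example: three image patches, two stimulus dimensions.
target = {"color": np.array([0.5, 0.5]),
          "orientation": np.array([1.0, 0.0])}
patches = {"color": [np.array([0.5, 0.5]), np.array([1.0, 0.0]),
                     np.array([0.0, 1.0])],
           "orientation": [np.array([1.0, 0.0]), np.array([0.0, 1.0]),
                           np.array([0.5, 0.5])]}
attention = topdown_attention_map(target, patches)
```

In this toy case the orientation dimension has the more peaked similarity landscape, so it receives the larger weight, and the attention map peaks at the patch whose features best match the target - the qualitative behavior the model uses to predict saccadic endpoints.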
Background Reading:
Pomplun, M. (2006). Saccadic selectivity in complex visual search displays. Vision Research, 46, 1886-1900. http://www.cs.umb.edu/~marc/pubs/pomplun_vr2006.pdf
Itti, L. & Koch, C. (2000). A saliency-based search mechanism for overt and covert shifts of visual attention. Vision Research, 40, 1489-1506. http://ilab.usc.edu/publications/doc/Itti_Koch00vr.pdf
Navalpakkam, V. & Itti, L. (2007). Search goal tunes visual features optimally. Neuron, 53, 605-617. http://ilab.usc.edu/publications/doc/Navalpakkam_Itti07n.pdf