A central goal in the study of object and scene perception is to understand how visual information is integrated across views to provide a stable, continuous experience of our environment. Research on topics ranging from visual masking to priming across saccades to the representation of spatial layout across views has addressed the question of what information is preserved from one view to the next. Recently, research on visual memory for objects and scenes has led to striking claims about the nature of the information that is and is not preserved from one instant to the next. For example, studies of change blindness have shown that large changes to objects and scenes can go undetected when they coincide with an eye movement, a flashed blank screen, a blink, or an occlusion event. These findings suggest that relatively little visual information about objects and scenes is combined across views. Yet despite these failures of change detection, observers somehow experience a stable, continuous visual environment. This special issue seeks to unite recent studies of change blindness with studies of visual integration to better understand the nature of our representations and the richness of our visual memory.
Dan Simons is a professor in the Department of Psychology and the Beckman Institute at the University of Illinois. He earned his BA in psychology and cognitive science from Carleton College and his PhD in experimental psychology from Cornell University. He then spent five years on the faculty at Harvard University before moving to Illinois in 2002.
Simons's scholarly research focuses on the limits of human perception, memory, and awareness, and he is best known for his research showing that people are far less aware of their visual surroundings than they think. His studies and demonstrations have been exhibited in more than a dozen science museums worldwide.
In his spare time, he enjoys juggling, bridge, and chess.