Notice that, even as you fixate on the screen in front of you, you can still shift your attention to different regions in your peripheries. For decades, cognitive scientists have conceptualized attention as akin to a shifting spotlight that “illuminates” the regions it shines upon, or as a zoom lens that brings things into finer detail. These metaphors are commonplace because they capture the intuition that attention illuminates or sharpens things and thus enhances our perception of them.

Some of the important early studies to directly confirm this intuition were conducted by NYU psychologist Marisa Carrasco and her colleagues, who showed that attention enhances the perceived sharpness of attended patterns. In their experiment, participants saw two textured patterns presented side by side on a computer screen and judged which of the two looked sharper. Crucially, just before the patterns appeared, an attention-attracting cue was flashed at the upcoming location of one of the patterns. They found that attended patterns were perceived as sharper than physically identical unattended patterns. In other words, attention may make physically blurry (or otherwise degraded) images appear sharper, much like a zoom lens on a camera.

Subsequent studies by Carrasco’s group and others found that attention also enhances perception of other features – for example, color saturation, orientation, and speed. This research suggests that attention causes incoming sensory information from attended locations to be processed more fully, without changing the information itself.

However, some recent work at the cutting edge of vision science has begun to chip away at this intuitive notion, suggesting that attention can sometimes distort perception, changing the very character of our visual experience. In one study, Satoru Suzuki of Northwestern and Patrick Cavanagh of Harvard presented observers with a display in which two vertical lines were drawn one above the other, and asked them to judge whether the top line lay to the left or right of the bottom line. Just before the lines were presented, an attention-attracting cue was briefly flashed. When the cue appeared near the lines, attention caused a curious distortion of perceived space, shifting the perceived locations of the lines away from the location of the cue. For example, on trials where the lines were perfectly aligned, the attentional cue caused them to appear misaligned. The authors named this the attentional repulsion effect, because attention seems to repel the perceived locations of neighboring objects away from the focus of attention.

In explaining these results, Suzuki & Cavanagh cite neuroscience research showing that shifts of attention move the receptive fields of neurons in the peripheries towards the focus of attention. For example, if you pay attention to a bottle of water in front of you, then neurons whose receptive fields would normally fall outside the edges of the bottle will be recruited to gather information about the bottle as well. With so many additional neurons encoding the space within and immediately surrounding the bottle, this region may be “overrepresented,” possibly explaining why a newly appearing object in this vicinity is perceived as farther from the bottle than it truly is.

A recent study by Fuminori Ono and Katsumi Watanabe has shown that these attentional effects are even more complicated than this theory suggests. Their experimental design closely mirrored that of Suzuki & Cavanagh, except that the attention-attracting cue could sometimes appear after the vertical lines were presented. Surprisingly, this late-arriving cue still distorted the perceived location of the upper line, shifting it toward the region where attention had been cued: an attentional attraction effect. In other words, attention can distort the perceived location of objects even after they are no longer physically present.

In fact, it turns out that attention seems to dramatically reorganize the spatial relationships between objects throughout our visual field. One of us (Brandon Liverence) conducted a study with Brian Scholl at Yale University in which participants had to pay attention to two particular circles (the targets) on a computer screen while selectively ignoring two other, identical-looking circles (the distractors). In this multiple-object tracking (MOT) task, participants had to keep track of the targets as they moved haphazardly among the distractors and then re-identify them many seconds later.

At the end of each trial, the objects all disappeared and participants used the mouse to indicate where they had last seen each one. Their responses revealed two striking effects: they misperceived the targets as being closer together than they really were, and the distractors as farther apart than they really were. That is, attention simultaneously compresses perceived space between targets (as if they attract each other) and expands perceived space between distractors (as if they repel each other). Given that the visual world ultimately consists of objects and the space between them, these findings have far-reaching implications for how we perceive just about anything we can see: because we are almost always attending to something, space and objects are almost constantly being perceptually distorted.

Another (perhaps disturbing) implication of this research is that we can’t completely trust our eyes to give us a true reflection of the world. This realization creates quite a philosophical conundrum, if not a practical one. It’s a bit like finding out that all along you’ve been seeing the world through a pair of “distorting glasses.”

These studies suggest a fundamental revision to our naïve concept of attention: even though we experience attention as a spotlight that enables perception, it may be more accurately described as a distorting zoom lens, one that both sharpens and warps our perception of the visual world.
