Imagine you’re driving home from work. The streets are busy but you’re cruising along smoothly. Now, visualize two scenarios: In the first, your phone starts ringing, vibrating and lighting up as you receive an incoming call. In the alternative scenario, you notice a brief blink on your screen as a text message arrives. Which would be more likely to affect your driving?

You might think the phone call would be the bigger diversion. A study published today in Current Biology suggests just the opposite. Working with 95 volunteers, psychologists Jeff Moher and Joo-Hyun Song at Brown University, along with Brian Anderson at Johns Hopkins University, found that subtle distractors divert what we are doing more than obvious ones do, yet the pattern reverses for what we see.

Distractions come in all shapes and sizes—bright colors, blinking lights and loud sounds—and the more these “pop out” from their surroundings, the more likely they are to capture our attention. People also experience distractions differently. When someone finds something particularly rewarding—alcohol, tobacco or money—it can capture their attention in a way that is unique to them. Previously, researchers have found that alcoholics are more likely to notice objects changing in a scene if those objects are related to liquor.

In the current study the researchers wanted to see how different degrees of distraction influence what we see as opposed to what we do. After all, distractions that appear while we passively observe a TV screen may affect us differently than those that appear while we steer a car.

To do this, they conducted four experiments. In the first two the researchers created a small and a large distraction using rewards, giving people two cents for pointing to a red circle and 10 cents for indicating a green one. This exercise taught participants that the green circle was worth more. After learning the association, one group of participants completed a task that involved pointing to a unique shape on a screen filled with gray objects. A second group carried out a task in which they pressed a key to indicate whether a line that appeared amid various shapes was oriented horizontally or vertically. In both tasks the more valued green circle or the less valuable red circle appeared on half the trials; the other half, in which no distractor appeared, served as a control comparison.

In the reaching task there was a greater deviation in the hand movement—toward the location of the distractor—when the distractor was linked with the smaller reward rather than the larger one. In the button-pressing task the opposite occurred: participants performed worse with the stronger distractor present than with the weaker one. In other words, during an action the smaller, less important distractor was more disruptive, whereas the opposite held true during the task that required just watching what was happening.

The same action and perception tasks were repeated with two more groups, except rather than using rewards as distractions, the researchers used color. In these trials all objects that appeared on the screen were red and distractors were either blue or pink. Blue, which differs more from red than pink does, was a larger distraction. The less distracting pink objects diverted the reaching action more than the more distracting blue ones. The blue objects, in turn, reduced perceptual accuracy more than the pink ones did. Seeing the same results with both reward and color suggests that the effects can be generalized across different types of distractions.

According to Tim Welsh, an experimental psychologist at the University of Toronto not involved in the research, the findings could change the way researchers understand what influences action and attention. “It doesn’t necessarily mean that we need to go back to old models where perception, attention and action are separable processes, but it leads to this idea that the way the action system expresses salience is different than the perceptual system,” he says.

Why is there a difference in how we react to distractions? One possible answer is that the brain has a suppression mechanism that prevents distractors from interrupting tasks—for example, reaching for the phone when driving. But this mechanism only kicks in above a certain threshold, allowing subtle distractors to slip through the cracks. “We haven’t pinned down what the mechanism is, but finding out what the neural correlates of these are is our next step,” Song says. If this suppression mechanism exists, it appears to be present in the action system but not the perceptual system. Our brains may have evolved this way because actions usually have bigger consequences for survival. “Things that are adaptive usually have to do with finding food or avoiding being food,” Moher says. “So if you see a distraction in the environment that is not relevant, it’s far more important that your action doesn’t go toward the distractor.”

Moher suggests that the distinction between action and perception becomes more relevant as we transition from laptops to tablets, where we go from indirectly to directly interacting with a screen. This may be particularly important in jobs where attention to detail is crucial, such as airport security or radiology. Consider the case of scanning a bag for explosives or looking for a tumor in a medical scan: perhaps having someone draw outlines around suspicious objects would create a vulnerability to smaller distractions, whereas passively scanning through an image might heighten susceptibility to large ones. Taking these subtleties into account might help improve the accuracy of such searches.

Although noticing immediate threats in our environment, such as an oncoming car, is important, most stimuli around us are not relevant to the task at hand. Our brains have developed a way to block out distractions—but it seems that it is the ones we do not notice that cause us to deviate most from our objectives.