Have you ever told a friend a made-up story to entertain that person or spare his or her feelings? Do you know anyone who confessed to overreporting the number of hours worked to pad a paycheck? Some may think of these “white lies,” or small instances of dishonest behavior, as relatively harmless slight ethical lapses when compared with full-scale corporate fraud. We may consider a white lie especially harmless if it serves to protect an important relationship. Researchers have studied the potential financial and legal consequences of small acts of dishonesty such as padding expense reports and pilfering pens. But are these consequences all we should be concerned about? We examined the possibility that small instances of dishonest behavior have an unintended consequence for our emotional intelligence: dishonesty seeps into our ability to read others’ emotions. Our research indicates the harm is real, and lasting.
In a series of studies, we concluded that an act of deceit can undermine a person’s ability to interact with peers, even those removed from the original lie. Specifically, we found that when people engage in dishonest behavior, they are less likely to see themselves as relational (for example, as a sister, friend, colleague or father) and are subsequently less accurate in judging the emotions of others. This investigation is a critical step in understanding interpersonal dynamics in organizations in particular, because work relationships can be generative (a source of enrichment and vitality) or corrosive (a source of pain and dysfunction). The ability to accurately read and respond to others’ emotional states enables supportive, prosocial and compassionate behaviors, so it is particularly important for building strong networks in professional settings. Because dishonesty increases relational distance and decreases empathetic accuracy, those who are dishonest at work may fall into a vicious cycle of mutual misunderstandings and missed opportunities for building supportive relationships, which could be detrimental to individuals, as well as to the organizations in which they work.
We began to explore these dynamics in a study of 250 pairs of individuals, each composed of a participant in an experimental condition (asked to lie or tell the truth) and a partner, with each person tasked to assess the emotions of the other. We found that subjects who lied, as compared with the truth tellers, were less accurate in judging the emotions of their partner. Those in the dishonest group were not instructed to tell large lies; instead they were to make up a story about looking for a job, something that would amuse others or make them feel better about their own experiences with recruiting. The other half of the experimental participants were asked to tell a story based on their real experiences as job seekers.
After sharing these stories, each participant listened to his or her partner tell a real story. Participants and partners then reported their own emotions, as well as the emotions they sensed their counterpart was feeling. We used the reported emotions to calculate an accuracy score for each participant’s view of the partner’s emotions: the difference between the partner’s self-reported emotions and the participant’s rating of that partner’s emotions. We found that subjects who were asked to be dishonest were significantly worse at detecting the emotional state of their partner than those who told a true story. Surprisingly, even these small, malice-free moments of dishonesty significantly clouded an individual’s ability to read emotions in subsequent interactions.
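The scoring idea can be sketched in a few lines of code. The article describes the accuracy score only as the difference between the partner’s self-reported emotions and the participant’s ratings of them; the specific formula below (mean absolute difference over a set of emotion ratings) and the 1–7 scale are illustrative assumptions, not the researchers’ actual measure.

```python
def accuracy_score(partner_reported, participant_judged):
    """Return the mean absolute difference between a partner's self-reported
    emotion ratings and a participant's judgments of those same emotions.
    Lower values mean the participant read the partner more accurately."""
    if len(partner_reported) != len(participant_judged):
        raise ValueError("ratings must cover the same set of emotions")
    diffs = [abs(a - b) for a, b in zip(partner_reported, participant_judged)]
    return sum(diffs) / len(diffs)

# Hypothetical ratings for three emotions (say, happy, anxious, proud) on a 1-7 scale.
partner = [6, 2, 5]   # what the partner actually reported feeling
judged = [4, 3, 5]    # what the participant thought the partner felt
print(accuracy_score(partner, judged))  # 1.0
```

A perfectly accurate participant would score 0.0; larger scores mean a wider gap between what the partner felt and what the participant perceived.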
In conjunction with this investigation, we ran four additional experimental studies with two conditions: in one, we created circumstances in which participants would be tempted to cheat; in the other, we removed any possibility of cheating. All subjects took part in a die-throwing game that allowed them to earn a bonus based on the number rolled: the higher the number, the more money earned. All participants were asked to choose whether their bonus would be based on the top or bottom side of each die before rolling it, but only those in the honest group recorded their choice at that time. Those in the dishonest group recorded their selection after the roll, which allowed them to change it to the side corresponding to the maximum amount of money they could earn. Members of this group reported earning significantly more over the course of the game, suggesting they did indeed inflate their bonus payments dishonestly. After the die-rolling activity, the subjects watched 42 short video clips to assess their ability to read the emotions of others. In these clips, actors expressed a wide range of emotions in their face, voice and body language, and participants were asked to identify the affective state of the actors.
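The incentive in this design can be made concrete with a short simulation. On a standard die, opposite faces sum to 7, so a player who commits to a side before the roll earns 3.5 pips on average, whereas a player who picks the better side after seeing the roll earns max(roll, 7 − roll), or 5 pips on average. The payoff details (one unit of bonus per pip) are illustrative assumptions; the sketch only demonstrates why the dishonest condition pays more.

```python
import random

def honest_bonus(roll, chose_top):
    """Side is committed before the roll; the bottom face is 7 minus the top."""
    return roll if chose_top else 7 - roll

def dishonest_bonus(roll):
    """Side is recorded after the roll, so the better side can be claimed."""
    return max(roll, 7 - roll)

random.seed(0)
rolls = [random.randint(1, 6) for _ in range(100_000)]
honest_avg = sum(honest_bonus(r, random.random() < 0.5) for r in rolls) / len(rolls)
dishonest_avg = sum(dishonest_bonus(r) for r in rolls) / len(rolls)
print(round(honest_avg, 2))     # close to 3.5, the fair expected value
print(round(dishonest_avg, 2))  # close to 5.0, inflated by post-roll switching
```

The roughly 43 percent gap between the two averages is what let the researchers infer cheating from aggregate earnings without observing any individual roll.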
Across these four experimental studies, with 1,879 participants, we consistently found that those who were tempted to cheat, and likely did, performed worse on the empathetic accuracy test than those who had no opportunity to be dishonest. We also found that the effect was driven by a reduction in how relational the dishonest participants considered themselves to be: people who engaged in dishonesty were less likely to describe themselves in terms of their relationships than those in the honest group. By being dishonest, subjects distanced themselves from others, which in turn reduced their ability to read others’ emotions.
We ran an additional study to examine whether the relationship between dishonesty and impaired empathetic accuracy holds outside the lab. In it, 250 full-time employees reported how frequently they engaged in dishonest behavior (for example, “There are times when I violate contract terms with customers”). These participants then completed a common test of empathetic accuracy, the Reading the Mind in the Eyes Test, developed by Simon Baron-Cohen of the University of Cambridge and his colleagues. Across 36 trials, participants viewed the eyes of an actor and were asked which emotion best described his or her mental state. We found that the more frequently employees engaged in dishonest behavior at work, the lower they scored on empathetic accuracy, suggesting the two are negatively related.
There was one trait that inoculated individuals against this negative effect of dishonesty: in a lab study of 100 adults, we found that those with a naturally high level of social sensitivity (attunement to subtle social-emotional cues in the environment) did not show significant reductions in their empathetic abilities after moments of dishonesty. For the average participant across our studies, however, the negative effect held.
Importantly, we found that a reduction in empathetic accuracy as a result of dishonesty can have downstream consequences. Specifically, participants who cheated for financial gain were more likely to blatantly dehumanize the actors in the emotion-recognition videos (that is, to rate the actors as less human) than those who had no opportunity to cheat. Moreover, cheaters were also more likely to engage in repeated unethical behavior. This result suggests that once we engage in dishonest behavior, we may also distance ourselves from other people by regarding them as less human, which allows us to continue down a path of repeated unethical behavior. Our research implies that even small acts of dishonesty can go a long way, leaving ripple effects that may undermine a fundamental building block of our humanity: social connection.