Why Big Liars Often Start Out as Small Ones
People who tell small, self-serving lies are likely to progress to bigger falsehoods, and over time, the brain appears to adapt to the dishonesty, according to a new study.
The finding, the researchers said, provides evidence for the “slippery slope” sometimes described by wayward politicians, corrupt financiers, unfaithful spouses and others in explaining their misconduct.
“They usually tell a story where they started small and got larger and larger, and then they suddenly found themselves committing quite severe acts,” said Tali Sharot, an associate professor of cognitive neuroscience at University College London. She was a senior author of the study, published on Monday in the journal Nature Neuroscience.
Everyone lies once in a while, if only to make a friend feel better (“That dress looks great on you!”) or explain why an email went unanswered (“I never got it!”). Some people, of course, lie more than others.
But dishonesty has been difficult to study. Using brain scanners in a lab, researchers have sometimes instructed subjects to lie in order to see what their brains were doing. Dr. Sharot and her colleagues devised a situation that offered participants the chance to lie of their own free will, and gave them an incentive to do so.
A functional MRI scanning device monitored brain activity, with the researchers concentrating on the amygdala, an area associated with emotional response.
Participants in the study were asked to advise a partner in another room about how many pennies were in a jar. When the subjects believed that lying about the amount of money was to their benefit, they were more inclined to dishonesty, and their lies escalated over time. As lying increased, the response in the amygdala decreased. And the size of the decline from one trial to another predicted how much bigger a subject’s next lie would be.
These findings suggested that the negative emotional signals initially associated with lying decrease as the brain becomes desensitized, Dr. Sharot said.
“Think about it like perfume,” she said. “You buy a new perfume, and it smells strongly. A few days later, it smells less. And a month later, you don’t smell it at all.”
Functional imaging is a blunt instrument, and the meaning of fluctuations in brain activity is often difficult to interpret. Dr. Sharot agreed that the study could not determine exactly what type of response the decreased activity in the amygdala represented.
“We know for sure it’s related to lying,” she said. “Whether it’s their negative emotional reaction, that’s only speculation, based on the parts of the brain we looked at.”
But the researchers included numerous checks on the study’s results and replicated some parts of it before publication. The research was led by Neil Garrett, a doctoral student at University College London at the time. Dan Ariely of Duke University and Stephanie C. Lazzaro of University College London were also authors of the report.
Christian Ruff, a professor of decision neuroscience at the University of Zurich, noted that in previous research, it had been “really, really difficult to characterize the neural processes that underlie purposeful lying.”
The new study, he said, provided one way of doing that, and showed the importance of considering the emotional component of dishonesty.
Amitai Shenhav, a psychologist at Brown University who has studied moral decision-making, also praised the study, calling it “nicely executed.”
He said the findings were “suggestive of a slippery slope.” But he added that it was still not entirely clear what was driving people down that slope.
For example, Dr. Shenhav said, it could be that the act of lying by itself increased the propensity for acting dishonestly, “like gradually pushing our foot off a brake.” Or it could be that the subjects, who were not punished in any way for their dishonesty, concluded that lying in that environment was not so bad.
“We need to be cautious when generalizing to real-world dishonesty that is typically associated with threats of reprimand” or damage to someone’s reputation, he said.
In the study, the subjects — 80 adults, most of them university students — were asked to help the unseen partner guess the number of pennies in the jar. The partner, the subject was told, would then tell the researchers the guess. (The partner was in reality a confederate of the scientists.)
In some cases, the subjects were given an incentive to lie: They were told that they would be paid more if their partners overestimated the money in the jar, and that the higher the overestimation, the more they would be paid. Their partners’ payments, however, would depend on how accurate the estimates were.
In other cases, the participants were told that both they and their partners would be paid more for overestimating the number of pennies; still others were told that their payments depended on the accuracy of the estimates, while their partners would be paid more for overestimating.
Dr. Garrett said he hoped that the study could be repeated in other, more realistic settings, and that another study could be done to look at what might stop people from escalating their dishonesty.
“How do you stop it? How do you prevent it?” he asked.
But Dr. Ruff said that if the findings from this study held up, the message seemed clear.
“The implication is that we should watch out that we don’t tolerate lies, in order to prevent people from lying when it really matters,” he said.