Why Humans Are Bad At Spotting Lies

“Exactly how you’d expect a guilty person to act.” “Moving and credible.” “So coached and so rehearsed.” “Simply tremendous.”

These are all reactions to Supreme Court nominee Brett Kavanaugh’s testimony on Thursday before the Senate Judiciary Committee, a set of responses that put on full display just how differently people can interpret the same set of behaviors, statements and emotions. And that reality lines up well with what experts who study lying and lie detection would expect: Humans aren’t very good at being able to tell — just from watching someone and listening to them talk — whether they are being told truth or fiction.

Instead, research suggests, our interpretations of testimony like Kavanaugh’s, or Christine Blasey Ford’s earlier on Thursday, will be shaped by what we already believe. The Kavanaugh confirmation fight and Ford’s allegation that he sexually assaulted her are taking place in a political context, tapping into partisan identities. But even without those particular biases, humans just aren’t very good at reading people. And that’s why testimony is “no substitute for a good, solid, thorough investigation and finding of the facts,” said Brian Fitch, a psychologist and retired Los Angeles County sheriff’s lieutenant.

In other words, we were never going to get a better idea of whether Kavanaugh was telling the truth by watching him speak. (He’s denied all the allegations against him.) That’s just not how the human brain works, said Judee Burgoon, director of human communication research at the University of Arizona. That’s because our ability to identify a lie is poor — no better than chance, in fact. “The best estimate, and that’s from a lot of studies all accumulated, is that we’re about 54 percent accurate,” she told me. “That’s about equivalent to flipping a coin.”

Both she and Fitch said that there’s no twitchy tell, no revealing behavior, that is indicative of lying or truth-telling. Partly, Fitch said, that’s because behavior is culturally mediated. When we all live in the same culture, people who want to lie know which behaviors might make them look more or less credible just as well as the people watching for those behaviors do.

Nor should it be shocking that we’re bad at accurately interpreting behavior and speech patterns, said James Alcock, professor of psychology at Canada’s York University. Learning is based on getting regular feedback, he told me. Try to add 2 + 2 and someone will tell you whether you got it right or wrong. Over time, that feedback allows you to know when you’re right. But there’s no systematic un-blinding to tell you when you correctly guessed whether you were being lied to. The feedback we get on this is spotty. Often there is none. Sometimes the feedback itself is incorrect. There’s never a chance to really learn and get better, Alcock said. “So why should we be good at it?”

Take people whose job it is to professionally detect lies — judges, police officers, customs agents. Studies show they believe themselves to be better than chance at spotting liars. But the same studies show they aren’t, Alcock said. And that makes sense, he told me, because the feedback they get misleads them. Customs agents, for instance, correctly pull aside smugglers for searches just often enough to reinforce their sense of their own accuracy. But “they have no idea about the ones they didn’t search who got away,” Alcock said.

The fact that we falsely believe we are good at interpreting trustworthiness from behavior has big implications. “The whole legal system is based on this idea,” Burgoon said. “We expect jurors to do this.” And the situation can be further complicated both by the way we go about asking questions and by our preconceived ideas. For instance, Fitch told me, Scotland Yard is famous for using a technique of asking people a rapid-fire series of questions about small details, assuming that truth tellers know the answers and liars will have to hesitate and remember their story.

But, Fitch said, that approach can also make interviewees more nervous and cause them to display the kind of hesitation interviewers consider a sign of lying — whether they’re lying or not. In other words, it’s possible to interview someone in a way that creates inconsistencies and credibility issues that weren’t there originally. Because of this potential, there have been efforts to change the way law enforcement officers conduct interviews, particularly of people from vulnerable groups, including victims of traumatic violence.

Then there are the preconceived biases we all bring to the table. Affective polarization, for instance, is a fancy political science term that describes the growing tendency for Republicans to have good feelings about other Republicans, Democrats to have good feelings about other Democrats, and both groups to have negative feelings about the other. This kind of bias probably plays a big role in situations where we’re testing the trustworthiness of people under politically charged circumstances, and some studies have shown that it can have as strong an impact as the biases we carry related to race.

Given what we know about how humans interpret the behavior of other humans — and how bad we are at doing that accurately — it should be no surprise that there appears to be a strong partisan split in how both politicians and regular people viewed Kavanaugh’s testimony. In fact, Burgoon said, this is why you generally want more layers of information in an investigation. You’re not going to learn the “truth” based on somebody’s body language. “I think that’s part of the desire for an FBI investigation, because the FBI would produce a more impartial rendering,” she said. Indeed, Republican Sen. Jeff Flake, a crucial swing vote, asked on Friday for the full Senate vote on Kavanaugh to be delayed a week so that the FBI could produce just such a rendering. Of course, as Burgoon added, not everyone is going to believe the FBI’s findings either.

Maggie Koerth-Baker is a senior science writer for FiveThirtyEight.
