Monday, October 1

Sharing Misinformation: Why Big Lies Stick

Psychologists from the University of Western Australia, University of Michigan, and University of Queensland recently published a new abstract that delves into the psychology of misinformation and why people are more apt to believe falsehoods than accurate information. (Hat tip: Farron Cousins.)

The simple answer? Believing misinformation requires less brain power. But there is something else that is striking to consider, especially because people are resistant to correcting misinformed beliefs.

Misinformation is simple, memorable, and emotional. 

The attacks on two U.S. diplomatic posts that resulted in the deaths of four Americans provide an example. Initial reports attributed the attacks to a spontaneous reaction to the inflammatory anti-Muslim film by Sam Bacile, and the U.S. government initially cited the film as the primary cause.

However, it has now become clear that the attack on the consulate in Libya was not spontaneous. It was a planned act of terrorism believed to be led by the militant group Ansar al-Shariah and al Qaeda. Although the administration knew it was a terrorist attack within 24 hours of its occurrence (and possibly had warnings before the attack), it continued to link the attack to the film for a week.

Focusing on the film has given it even more credence and escalated tensions in the Middle East. So why did the administration do it? Possibly, in part, because the misinformation was easier to report.

Misinformation tends to be grounded in an emotional appeal whereas the truth tends to be grounded in a logical appeal. The truth requires more reason and deliberation. The cause-and-effect model applied to the film is easy to believe. It requires no thought. The act of terrorism, on the other hand, requires deliberate thinking because the administration has consistently suggested that al Qaeda has all but lost, that its foreign policy is sound, and that Americans are safer today.

In essence, because accurate information requires people to reassess other administration "truths," it is easier to believe that this was an emotional reaction caused by the film. Unfortunately, the unintended consequences of this misinformation have fanned real protests across the Middle East and given rise to anti-American sentiment once again.

If misinformation has the advantage, what can we do about it?

Misinformation isn't used exclusively by governments and politicians. It impacts communities, industries, companies, and individuals every day. Although the abstract suggests that the cause is linked to rumors, governments, vested interests, and media (including the Internet), the researchers' more compelling point is psychological: people have no real safeguards against it.

Specifically, the researchers say that most people look for information compatible with what they believe, how coherent the story might be, whether the source is credible, and how many other people believe it. These strategies do not guard against misinformation. In fact, they often compound it.

Having a presumably credible source deliver a well-crafted story to people who are likely to believe it (and the more the better) is the recipe for propaganda. Look at almost any crisis communication case study and you will find some of these criteria used to spread misinformation, intentionally or accidentally, by proponents and detractors alike.

In many of the case studies I've covered, there does tend to be a short-term lift associated with misinformation, which is then followed by long-term consequences. In most cases, credibility erodes until nobody believes the fraudulent source anymore (even when they do tell the truth).

This is one of several reasons I frequently teach public relations students that the truth is hard enough. There is never any good reason to compound a crisis with misinformation. It's hard enough to tell the truth because, as the abstract suggests, misinformation is difficult to retract and nearly impossible to erase.

In fact, it is so difficult to manage that the conclusions in the abstract represent the researchers' weakest points (along with a tendency to show bias in their examples). I think a few communication tenets can do better than the abstract (and they will follow on Wednesday). But in the meantime, we need to appreciate that the first step is always the same.

We have to reduce our own susceptibility to misinformation. 

Much like journalists used to do (and some still do), objectivity needs to be treated as a skill set. This means we have to develop the ability to put aside personal beliefs, seek out opposing points of view, ferret out facts regardless of how coherent the information might be, ignore the so-called credibility of sources until the evidence bears out, and never mistake "mass appeal" for authority.

Some journalists I've met along the way have become bold in their belief that being objective is a myth. I disagree. So does reporter and correspondent Brit Hume, who recently noted that attorneys develop objectivity as a skill set in order to successfully understand both sides of a case. It's a reasoned analogy.

For public relations practitioners in particular, striving for objectivity is especially important because it helps us develop empathy for the publics beyond the organization. Even if our opposition is wrong, we have to understand their point of view and find common ground where it exists.

Ergo, only once we've reduced our own susceptibility to misinformation can we hope to manage it. If we don't, then we're just as likely to become a source of falsehood as the trusted source most professionals hope to become. Start with that.
by Richard R Becker Copyright and Trademark, Copywrite, Ink. © 2021