Global Engagement Center

GEC Counter-Disinformation Dispatches #4

April 14, 2020

What Works in Debunking

As pointed out in GEC Counter-Disinformation Dispatches #3, debunking does work, to varying degrees with different audiences.  But how you do it matters a great deal.

First, it is important to emphasize that debunking is not the same as denying.  When you deny something without showing why it is false, you are basically saying, “Take my word for it.”  That might work for audiences who believe you unreservedly.  But for those who are more skeptical, you must demonstrate why a claim is false while also, if possible, discrediting the source of the false allegations.

 

Debunking Tips

Image of a myths-and-facts graphic (credit: skepticalscience.com)

Some good general rules on debunking have been laid out by John Cook and Stephan Lewandowsky in “The Debunking Handbook.” 

Cook and Lewandowsky point out that, in debunking, “less can be more.”  They state:

Common wisdom is that the more counter arguments you provide, the more successful you’ll be in debunking a myth. It turns out that the opposite can be true. When it comes to refuting misinformation, less can be more. Generating three arguments, for example, can be more successful in reducing misperceptions than generating twelve arguments, which can end up reinforcing the initial misperception.

As Cook and Lewandowsky warn, “A simple myth can be more cognitively attractive than an over-complicated correction.”

It is also advisable to focus on the weakest parts of the disinformer’s argument and use them to discredit the overall claim.  It is neither necessary nor wise to try to debunk every aspect of a false claim.

Debunking disinformation often involves very controversial issues.  If you advance an argument that is the least bit shaky, your opponent may be able to use this to destroy your credibility.  It is better to advance only rock-solid arguments. 

Cook and Lewandowsky also note that “any mention of a myth should be preceded by explicit warnings to notify the reader that the upcoming information is false.”  “False” or its equivalent should be the debunker’s favorite adjective. 

One should also describe false claims in the plainest, most inoffensive way possible and avoid repeating lurid allegations, which can stick in people’s minds even if they are untrue.

For example, it is better to describe AIDS disinformation simply as “the false claim that the U.S. government invented the AIDS virus” rather than repeating the original Soviet disinformation story, which alleged that the virus was created as part of “Pentagon’s experiments to develop new and dangerous biological weapons.”  We are under no obligation to repeat the disinformer’s most inflammatory accusations.

Debunking explanations should also be easy to understand.  In countering the false claim that the U.S. government invented the AIDS virus in the 1970s, one counterargument was that such a feat of genetic engineering was impossible at that time.  That is true, but how would the average person know it?  It was better to point out that a blood sample taken from a person who died in Kinshasa, Congo, in 1959 later showed antibodies to the AIDS virus, demonstrating convincingly that the disease existed before the 1970s.  That is easier to understand and accept.  The AIDS virus could not have been invented in the 1970s, as the Soviets falsely claimed, if people already had the disease in 1959.

As Cook and Lewandowsky point out, another very important thing to do is to provide an explanation for why people believe the false claim, if possible.  The final section below, “Exaggerated Fears about Depleted Uranium,” provides an example of this.

 

Discredit as well as Debunk

Disinformation must, of course, be debunked – exposed as false.  But discrediting those who spread the disinformation is just as important, perhaps even more so. 

The most efficient and effective way to accomplish this is to find the most ridiculous claim a disinformer has made and use it to undermine their credibility.  This approach assumes the disinformer has a track record of regularly spreading lies – which is typically the case.

After all, in everyday life, we use a similar method to evaluate the trustworthiness of others.  If a person repeatedly tells us lies, we may be fooled initially but eventually we recognize them as a liar and, after that, tend to be very skeptical about what they say. 

If it can be convincingly demonstrated that a person, publication, or organization regularly publishes lies or fantasies, this undermines their credibility, making it easier to discredit the allegations they are spreading.

In addition, exposing highly offensive statements made by those who spread disinformation undermines their credibility.  For example, Dmitry Kiselev, the main presenter on Russian state-owned television channel Rossiya 1, once said the hearts of homosexuals killed in car accidents “should be buried or burnt as unfit for prolonging anybody's life.”  Such statements should give pause to those who might mistake Kiselev for a reliable, objective reporter.

 

Tell Stories and Follow the Rules of Good Communication

Debunking is a form of communication, and following the rules of good communication is always wise.  These rules include:

  • keep your explanations short and simple
  • leave plenty of white space on the page
  • use boldface to emphasize key points (but don’t overdo it)
  • include vivid, memorable examples, comparisons, and metaphors when appropriate
  • write like you talk, although with proper grammar
  • when you can, use visuals
  • if possible, tell a story, with drama

A January 9, 2020 opinion article in The New York Times provides a vivid example of how graphics communicate better than words when discussing statistics.  The article compares the dangerous effects of measles, the flu, and cervical cancer with the minimal side effects of preventative vaccinations for these diseases.  The graphics make it abundantly clear that the effects of the diseases are much worse than the possible side effects of the vaccinations.  As the popular saying goes, “seeing is believing.”  Take a look at the graphics and judge for yourself. 

Making comparisons to familiar things from everyday life can be a very powerful persuasive technique.  An article discussing this subject contrasts the following statements:

  • “A large state does not behave at all like a gigantic municipality.”
  • “A large state does not behave at all like a gigantic municipality, much as a baby human does not resemble a smaller adult.”

The second statement, taken from Nassim Taleb’s book Antifragile, is more persuasive because it draws on our knowledge of familiar experiences from everyday life.  We all know that babies differ from adults in many ways.  This makes it easier to accept the statement about entities with which people may not be as familiar, in this case national and local governments. 

 

Stories

Image of a grandmother reading a story (credit: Shutterstock)

The power of stories has long been recognized.  In his book Shaped by the Story, author Michael Novelli says:

We are story-minded beings.  Instinctively, we filter life into stories to provide structure, content and relevance in our minds. …

Stories provide a framework for our lives; they order our memories and assign value to our experiences.  Therapist Lewis Mehl-Madrona [says]: “Stories hold a richness and complexity that simple declarative facts can never grasp …. Story provides the dominant frame for organizing experience and for creating meaning out of experience.”

We all know that stories have special appeal.  It is unlikely that any child has ever asked, “Mommy, can you tell me some bedtime facts?” 

One of mankind’s primal stories is the timeless morality tale of good vs. evil, which lies at the core of both disinformation and the effort to counter it.  This is what makes each so powerful.

 

Can Truth Compete with Lies?

Image of street signs saying "Lie" and "Truth" (credit: vagabondfinances.com)

A landmark March 2018 article in Science, "The spread of true and false news online," concluded that lies, especially about political topics, spread much faster and farther online than truth, as measured on Twitter.  The finding received a great deal of attention, but one must look more closely to learn the real lessons.

The article noted “false news was more novel” and “false stories inspired fear, disgust, and surprise,” observing that these factors may account for the greater spread of false stories.

Try this thought experiment.  Imagine two statements, one true and one false:

  1. The false one is: “Section 123 of the United States Internal Revenue Code, which governs federal taxation, states: ‘followed by complicated, boring, and very difficult to understand language from a different section of the Code.’”
  2. The true one is: “Section 123 of the United States Internal Revenue Code states: ‘followed by complicated, boring, and very difficult to understand language from section 123 of the Code.’”

Do you think the false statement would spread more quickly on social media than the true statement?  It’s doubtful.  It’s more likely that neither statement would spread at all, because both would be boring and of no interest to anyone.

So, do lies spread faster and further than truth?  It depends.  The Science study finds that surprising, disgusting, fear-inspiring lies spread faster than less exciting facts.  But do they spread faster because they are untrue or because they are shocking?  We suspect the latter.

The practical lesson for those countering disinformation: fight sensationalism based on falsehoods with sensationalism based on facts.

The fact that the Russians and other disinformers are spreading every imaginable lie about those who oppose them is disgusting, inspires fear, and is a surprise to many.  This truth is shocking and will spread if you tell the story well.

 

Recognize the Power of Associations

In order to counter disinformation effectively, one must also recognize that the mind often reasons by associations rather than logic.

In his book Thinking, Fast and Slow, Nobel Prize-winning psychologist Daniel Kahneman describes the mind as an “associative machine,” noting that “ideas that have been evoked trigger many other ideas, in a spreading cascade of activity in your brain.”

This is part of what Kahneman calls “thinking fast,” which comes easily and automatically and “requires little or no effort.”

He contrasts this with “thinking slow” – logically and deliberately, which he says is much more difficult and time-consuming.

We’re not aware of this, Kahneman says.  He notes, “When we think of ourselves, we identify with … the conscious, reasoning self that has beliefs, makes choices, and decides what to think about and what to do.”

But actually, he says, most decisions are made by the part of our brain that thinks quickly, effortlessly, and largely uncritically, often on the basis of associations we are seldom even aware of.  “Most of the work of associative thinking,” he says, “is silent, hidden from our conscious selves.”

 

Exaggerated Fears about Depleted Uranium

Image of a depleted uranium armor-piercing fin-stabilized discarding sabot round (credit: www.wikiwand.com)

Recognizing the power of associations is very useful when trying to debunk false claims about highly controversial topics, such as the amount of danger posed by depleted uranium.

There are many exaggerated fears about depleted uranium (DU), which is used in several forms of ammunition by U.S. armed forces.

The typical fears are that depleted uranium will cause cancer, birth defects, or other illnesses.

The facts show otherwise.  Numerous studies have found “no evidence that DU causes cancer”:

  • The World Health Organization report Depleted Uranium: Sources, Exposure, and Health Effects stated: “no increase in leukemia or other cancers has been established following exposure to uranium or depleted uranium.” (p. 132)
  • A European Commission report concluded, “radiological exposure to depleted uranium could not result in a detectable effect on human health.”
  • A NATO study found that, “based on the data today, no link has been established between depleted uranium and any forms of cancer.”

Even if these facts are brought to people’s attention, they often disregard them.

The reason is the powerful, largely subconscious associations in people’s minds with the word “uranium.”

When people hear the word “uranium,” many think of the atomic bomb, Hiroshima, mass deaths, radioactive fallout, radiation sickness, cancer, birth defects, and other terrifying associations.

The “thinking fast” part of the brain processes these very powerful associations immediately, without any conscious effort.  This happens so effortlessly that, for the most part, we are not even aware that it has occurred.

In many cases, these powerful associations overwhelm facts, however solidly researched, that point out that depleted uranium is not as dangerous as it seems.  Fear and negative associations can easily overpower facts and logic.

But there is a way to counteract this natural process.

If the subconscious associations are raised to the conscious level – by pointing out that the mind automatically makes such connections – then people can examine them rationally.  This can create a “cognitive opening”: once people acknowledge that negative associations have subconsciously influenced their thinking, they may be willing to give the facts a second look.

As Kahneman points out, the mind’s ability to think slowly, logically, and factually “normally has the last word” – but only if it is engaged. 

Russia continues to fan fears about DU.  In September 2019, the Russian ambassador to Serbia said, “Russia is also ready to investigate the consequences of the NATO bombing, which are more than obvious to cancer patients.”  DU was used in NATO bombardments of Yugoslavia during the 1998-1999 Kosovo war and in the conflict in Bosnia.

 

Next issue: “The Coronavirus and Disinformation”

To contact us, email: GECDisinfoDispatches@state.gov