
In January 2024, the World Economic Forum released Global Risks Report 2024, which concludes that “the most severe global risk anticipated over the next two years” is that “foreign and domestic actors alike will leverage misinformation and disinformation to further widen societal and political divides.” The report reflects “the views of over 1,400 global risks experts, policy-makers and industry leaders surveyed in September 2023.” It warns:

The disruptive capabilities of manipulated information are rapidly accelerating, as open access to increasingly sophisticated technologies proliferates and trust in information and institutions deteriorates. … easy-to-use interfaces to large-scale artificial intelligence (AI) models have already enabled an explosion in falsified information and so-called ‘synthetic’ content, from sophisticated voice cloning to counterfeit websites. …

Synthetic content will manipulate individuals, damage economies and fracture societies in numerous ways over the next two years. (p. 18)

“The speed and effectiveness of regulation is unlikely to match the pace of development,” the report states.

My Experience

For some 25 years between 1987 and 2022, I tracked, analyzed, and countered foreign disinformation (deliberate false stories), anti-American misinformation (unintentional mistakes), conspiracy theories, and other false stories for the U.S. government, mostly at the former U.S. Information Agency (USIA) and later at the U.S. Department of State (I also worked on other tasks during that time).

I countered:

  • Soviet disinformation from 1987 to 1991
  • Russian disinformation starting in 2015
  • Iraqi propaganda and disinformation during the 1991 Gulf War
  • False claims of child organ trafficking from 1987 to 1996
  • Exaggerated fears about depleted uranium, which gained prominence after the 1991 Gulf War
  • Many other false claims of concern to U.S. embassies in the late 1980s, 1990s, and 2000s.

In 2022 and 2023, while a contractor at the U.S. Department of State’s Global Engagement Center (GEC), I wrote 14 GEC Counter-Disinformation Dispatches, which sought to sum up what I had learned about disinformation and misinformation and how to counter them.

Countering False Stories

The basic approach I used to counter false stories is to:

  1. Totally immerse oneself in the “raw material” of the allegations, doing as much research as one can in the time available, sorting fact from fiction.
  2. Succinctly debunk a small number of key false claims, highlighting ones that can clearly be shown to be absurd.
  3. When countering disinformation, discredit those who repeatedly spread false claims by illuminating their track record of boosting lies.
  4. Explain, as best one can, why people tend to find the false claims believable.
  5. Shift the frame of discussion from the false claims to steps 3 and 4, focusing on the misdeeds of those who spread false claims and the very interesting question of why many people believe things that aren’t true.

Several examples follow.

Depleted Uranium and the Power of Subconscious Associations

After the U.S. use of depleted uranium munitions in the 1991 Gulf War, hysteria about the health effects of these weapons began to spread – mainly false claims that they had caused cancers and birth defects in Iraq.

I read several authoritative reports on the health effects of depleted uranium (DU), which concluded that it “poses minimal risk to human health.”  But I found that these facts had little impact; people simply believed that DU must be harmful.  Upon reflection, I realized that these exaggerated fears were likely due to the very alarming associations that many people have with the word “uranium,” among them: atomic weapons, Hiroshima, radioactive fallout, cancer, and birth defects.

I concluded that such powerful subconscious associations rendered the facts irrelevant and that the only way to debunk unwarranted fears about DU was to raise these associations to the conscious level and address them directly.  Once people recognized the power of these unconscious associations, a cognitive opening could emerge, in which they would consciously set aside their previously subconscious fears and take a fresh look at what medical research said about the health impacts of DU.  Without such a process, the facts had virtually no impact; the powerful subconscious associations with the word “uranium” had to be addressed first.

Framing Child Organ Trafficking Rumors as an Urban Legend

Rumors of child organ trafficking went viral in the late 1980s, spreading widely and rapidly even though they had no basis in fact.  While I was researching these rumors, French folklorist Veronique Campion-Vincent suggested that I treat them not simply as false, but as an urban legend, a modern form of folklore, and frame the discussion about them that way.  She wrote that the child organ trafficking myth:

is a new — updated and technologized — version of an immemorial fable. The core of the fable is that a group’s children are being kidnapped and murdered by evil outsiders.  Accusations of such kidnappings and ritual murders were made against Christians in ancient Rome [and against] Jews throughout antiquity, the Middle Ages, and up to modern times….  Child abductions in 18th century France were explained by ailing nobility who needed them for medical reasons: the leprous King needed blood baths, or a mutilated Prince needed a new arm which incompetent surgeons were trying each day to graft from a new kidnapped child ….  (“The Baby-Parts Story: A New Latin American Legend,” Western Folklore, January 1990)

Most of us are familiar with such urban legends, one definition of which is “an often lurid story or anecdote that is based on hearsay and widely circulated as true.” Framing the child organ trafficking allegations this way, and providing some familiar examples of widely known urban legends, prepares the reader’s mind for a lurid, shocking story that will be shown to be false.

Fears about Violations of the “Sacred” in the Run-up to the 1991 Gulf War

A core competency needed in countering misinformation and disinformation is knowledge of how the mind can create false stories as a way to express unspoken primal fears.  An example occurred during the run-up to the 1991 Gulf War, when a baseless rumor swept the Palestinian territories.  A Washington Post article quoted Khalid Abu Toameh, an Arab Israeli journalist “who for several years has covered Jerusalem and the West Bank for a variety of Israeli media.”  He stated:

“For two months the East Jerusalem press,” which has been sympathetic to Iraq, “was printing stories about the desecration of the Moslem holy places, about how American soldiers in Saudi Arabia were walking naked around Mecca and throwing beer cans on the prophet’s tomb. It sounds funny, but these reports were believed by many people, and so there was even greater passion than usual about defending the holy places here.”

In reality, U.S. troops were stationed on military bases near Kuwait and Iraq, and were not allowed to visit Mecca or Medina, which were hundreds of miles away.  But fear of outsiders in the region led people to believe the false claim that Muslim holy places were being desecrated.  I discussed these issues in GEC Counter-Disinformation Dispatch #13: Exploiting Primal Fears, published on January 13, 2022.

I find that the best way to address disinformation and misinformation is to have an expert spokesperson who has command of the subject matter speak directly to the media, especially the mass media.  I have also found that the best way to prepare for and encourage inquiries is to write articles on such subjects for a website or other public venue, which publicizes the debunking and also helps lock the key facts in one’s mind, so they can be more readily recalled during interviews.

In this and future Substack columns, I will explore these topics and explain how I used these techniques to counter false claims and how others might use them, depending on their wishes and the role they play.

Facts and Perceptions

There is often a difference between objective and subjective reality.  A fact is objective reality: a concrete detail that can be verified.

But what appears factual to one person may not appear so to others, as most of us know from experience.  Subjective views of reality can differ widely.

An economist noted a small but interesting example in a column in the January 18, 2024 issue of The New York Times:

In 2002, Italian consumers were convinced inflation was running at 18 percent year over year, when the reality was 2 percent. Further investigation revealed that an increase in the price of a cup of espresso drove much of this erroneous impression.

Perceptions can differ much more dramatically on highly controversial topics.  For example, a poll of 8,000 people in 16 Arab countries, conducted from mid-December 2023 to early January 2024, found that:

  • 67% of respondents said that the military operation carried out by Hamas was a legitimate resistance operation, 19% said it was a somewhat flawed but legitimate resistance operation, and 3% said it was a legitimate resistance operation that involved heinous or criminal acts, while 5% said it was an illegitimate operation.
  • 51% of respondents said that the policies of the United States are the most threatening [to the security and stability of the Arab region], followed by Israel with 26%; 7% said that Iranian policies are the most threatening, and 4% said Russian policies.
  • Arab public opinion is almost unanimous in rejecting recognition of Israel, at a rate of 89%, up from 84% in 2022, compared to only 4% who support its recognition.

I am not aware of any comparable poll in Western countries on these issues, but I believe that the opinions on these issues would vary widely in different countries.

So, determining the facts surrounding an issue is, in many cases, the easiest part of countering mis/disinformation.  The difficult part is convincing skeptical audiences of the validity of the facts.

“Frames Trump Facts”

Framing is a very important tool in countering disinformation and propaganda.  In his book Whose Freedom?: The Battle over America’s Most Important Idea, cognitive scientist George Lakoff writes, “[i]n politics, whoever frames the debate tends to win the debate” (p. 12).

Lakoff says additional interesting and provocative things about framing and how human thought works, including:

Most thought uses conceptual metaphors.  Metaphorical thought is normal and is used constantly, and we act on these metaphors.  …  Moral and political reasoning are highly metaphorical, but we are usually unaware of the metaphors we think with and live by.

Most thought does not follow the rules of logic.  … Thinking in frames and metaphors is normal ….  Political and economic reasoning uses frames and metaphors rather than pure laws of logic. ….

The frames and metaphors in our brains define common sense.  Commonsense reasoning is just the reasoning we do using the frames and metaphors in our brains. ….

Frames trump facts.  Suppose a fact is inconsistent with the frames and metaphors in your brain that define common sense.  Then, the frame or metaphor will stay and the fact will be ignored.  For facts to make sense they must fit existing frames and metaphors in the brain.  (Whose Freedom?, p. 13)

If Lakoff’s conclusions are correct, and I believe they are, then countering disinformation, misinformation, and propaganda involves much more than conducting extensive research, finding out what is true and what isn’t, and stating the facts clearly.  As I see it, research on what is true or not is the price of admission to the contest of countering false stories; it is not the same as winning the prize.  A great deal more is required.

In Search of Persuasive Truths

I am not a cognitive scientist like Lakoff, although decades of experience in countering disinformation have exposed me to some of the ways in which human thought is less than perfectly rational.  One expert on human cognition whom I respect a great deal is cognitive neuroscientist and cultural anthropologist Dr. Robert Deutsch, or Dr. Bob, as he calls himself.

Dr. Bob wrote, in 2018:

Humans are pattern makers, symbol makers, narrative makers, and everything we mentally construct is driven by the non-linear, emotionally-based processes of mind.  …

The problem is: information and rationality are puny in the face of belief. The mind evolved to act, not to think. And for the sake of speed, we humans can too quickly and easily default into “not-community.”

I first met Bob when we were both on a panel discussing propaganda in the 1991 Gulf War.  I recited the facts about Iraqi disinformation.  Bob then humbled and educated me by pointing out that facts are bloodless things, while images of human suffering, which Iraqi propaganda focused on, have much more emotive power, as is readily apparent in the current Hamas-Israel war.  I knew immediately that I needed to learn more from him.

He wrote in a recent email to me:

The main thing, cognitively speaking, is data are just cold abstractions. Metaphor provides a connection between abstraction and concreteness. Metaphors signify symbolic meanings, offer a wider range of human reaction as they are anti-literal. So, metaphor is a way to know and FEEL something in a different way — not via logic. Simply stated, the familiar helps us find meaning in the unfamiliar.

I have found that speaking to reporters (and other audiences) provides instant feedback on which points and topics have traction and which do not.  In my experience, giving media interviews is the best way to hone one’s communication skills and, in the process, help educate the public about what is true and what is not.

Elsewhere, Bob has written that all over the world people feel that “the third millennium is the world of TOO. We each now live in a context of too fast, too complex, and too competitive.”

People the world over are bombarded with enormous amounts of information, much of which is very difficult to evaluate.  And there are malevolent actors who produce disinformation and propaganda on an industrial basis as a matter of state policy, now with the aid of artificial intelligence.

In such a discomfiting situation, an expert spokesperson on misinformation and disinformation can serve a very helpful function by explaining what is true and what isn’t, and by helping people understand the complexities involved, including the ways the mind can be fooled.

In doing this, one must first be very thorough and rational in finding and parsing the facts in one’s research.  Then, in communicating one’s findings, one must be sensitive to what Dr. Bob calls the “emotional logic” of the issues involved and address that, as best one understands it.

I believe this is the type of role that counter-misinformation/disinformation expert spokespersons should seek to play – not just that of a fact checker, which is the sine qua non, but a more expansive role that seeks to understand how people think, educates them by reliably explaining what is true and what isn’t, and provides insight into why misunderstandings arise.

Of course, this is a very difficult task, requiring constant, virtually endless effort and sensitivity to the views of various audiences.  But I believe creating a team of expert “explainer” spokespersons who seek to clarify issues by explaining what is true and how misunderstandings, distortions, and differing perspectives arise is an aspiration well worth striving for.  I see countering mis/disinformation as an honorable, noble task and one that is very much needed in today’s confusing, conflicted world.

Guest post by Todd Leventhal republished from his Substack with the permission of the author.