You Won’t Believe This: Researchers Are Trying to “Inoculate” People Against Misinformation

Kai Kupferschmidt in Science: As a young boy growing up in the Netherlands in the 1990s, Sander van der Linden learned that most of his mother’s relatives, who were Jewish, had been killed by the Nazis in the grip of their racist ideology. At school, he was confronted with antisemitic conspiracy theories still circulating in Europe. It all got him wondering about the power of propaganda and how people become convinced of falsehoods.

Eventually, he would make studying those issues his career. As head of the Social Decision-Making Lab at the University of Cambridge, Van der Linden is studying the power of lies and how to keep people from believing them. He has become academia’s biggest proponent of a strategy pioneered after the Korean War to “inoculate” humans against persuasion, the way they are vaccinated against dangerous infections.

The recipe has just two steps: First, warn people that they may be manipulated. Second, expose them to a weakened form of the misinformation, just enough to intrigue but not persuade anyone. “The goal is to raise eyebrows (antibodies) without convincing (infecting),” Van der Linden and his colleague Jon Roozenbeek wrote recently in JAMA.

Inoculation, also called “prebunking,” is just one of several techniques researchers are testing to stop people from falling for misinformation and spreading it further. Others have focused on fact-checking and debunking falsehoods, educating people about news sources’ trustworthiness, or reminding people periodically to consider that what they’re reading may be false. But Van der Linden has captured the public imagination in a way few others have, perhaps because the concept is so seductively simple. “It’s definitely the one that has gotten most attention,” says Lisa Fazio, a psychologist at Vanderbilt University.

Van der Linden’s 2023 book, Foolproof: Why Misinformation Infects Our Minds and How to Build Immunity, has won many awards, and Google’s research arm, Jigsaw, has rolled out the approach to tens of millions of people via YouTube ads. “My reading of the literature is that it’s probably the most effective strategy,” says Jay van Bavel, a psychologist at New York University.

But others say inoculation is an analogy gone awry that wrongly focuses on recipients of misinformation instead of on its sources and the social media companies—such as X (formerly Twitter), Facebook, and TikTok—that enable and profit from its spread. “I think this metaphor is very limiting in how we understand where the problem really lies,” says Sandra González-Bailón, a social scientist at the University of Pennsylvania. “It’s easier to do than dealing with the systemic issues, but it puts all the pressure on the individual.”

More here.
