Interactive Resources to Inoculate Against Misinformation on Social Media

From the University of Cambridge’s Social Decision-Making Lab, Inoculation Science provides videos and games to illuminate the methods used to propagate false information via social media—and why we humans are vulnerable to such tricks.

Inoculation Videos

Truth Labs for Education, a collaboration of Cambridge University, the University of Bristol, and Google Jigsaw, has created five short videos designed to help people resist unwanted persuasion online. The videos are rooted in inoculation theory, a social-psychological communication theory that explains how an attitude or belief can be protected against persuasion or influence in much the same way a body can be protected against disease.

Each of the videos “inoculates” against a particular manipulation technique or misleading rhetorical device commonly encountered online: ad hominem attacks, emotional language that evokes fear or outrage, false dichotomies, incoherence, and scapegoating.

The five inoculation videos are effective at improving resilience to misinformation, but not forever: watched only once, their effectiveness slowly decays over time. Therefore, short booster videos are provided. Much as regular vaccines sometimes require top-ups to remain effective, the boosters serve as top-ups for the longer videos. So far, two boosters are available, both accompanying the “emotional language” video.

Inoculation Games

The inoculation games were designed in a collaboration of DROG, Gusmanson Design, and Cambridge University to improve people’s ability to spot common manipulation techniques, such as trolling and conjuring conspiracy theories out of thin air. So far, three “prebunking” games are available, each covering a different domain of misinformation: Bad News (online “fake news”), Harmony Square (political disinformation and intergroup polarization), and Go Viral (COVID-19 misinformation). More games are in development.
