The idea that humans have a mental immune system, and that it can be strengthened, is a promising line of thought in the fight against disinformation, misinformation, and polarization. The stakes are real: consider the health risks people run when they choose homeopathic treatment over effective, scientifically sound medical care, the social isolation of those who slip into the mindset of an absurd conspiracy theory, or the influence of lies and disinformation on election results and on the handling of national and international conflicts.
The ability to think clearly and critically is a crucial skill in a society flooded with information: factual information, but unfortunately also false and untruthful information. The latter kind sneaks into our brains and weakens our mental immunity, making us ever more prone to nonsense.
In this essay, I will explain how our brain's response to dissonant information hinders rational thinking, and what education and the media can do to strengthen our mental immunity.
Our Mental Immune System
As Andy Norman explains in his book Mental Immunity, our brain reacts with doubts, questions, and arguments when confronted with bad ideas. They are the antibodies that our brain deploys to protect itself against such ideas.1 I think it’s reasonable to say that there’s not only a cognitive mechanism at work but a physiological one as well. When we hear or read something that goes against our beliefs, our heart rate, blood pressure, and adrenaline levels rise. We experience anger because the other person is so unreasonable, or fear because our belief is not accepted. When those physiological alarm bells go off, people respond with a fight-or-flight response: they defend their belief with counterarguments, provide more evidence that confirms their belief, or attack their opponent personally. Often people simply flee, because they cannot come up with counterarguments, a discussion seems pointless, or they don’t want to waste time and energy on it.
But it also works in the other direction. People with unwarranted beliefs have this immune response too. They may react even more strongly to hearing what they perceive to be bad ideas from others, because they are often very attached to their beliefs. People who believe in conspiracy theories, who think that Covid-19 is a hoax, or who are convinced that the earth is flat, tend to be a minority. Their convictions isolate them from the rest of society, but they have often built their identity on these strange beliefs. In addition, they often get these erroneous ideas from charismatic people who enjoy hero status because they dare to tell 'the truth' and dare to go against 'the elite'.2 People sometimes feel more connected to the person propagating these ideas than to the ideas themselves. So when a rationally thinking person challenges their erroneous ideas, it is not only their beliefs that are called into question, but also their heroes and their identities. At that point, it’s almost impossible to get these people out of that swamp.
Individual and Cultural Mental Immunity
It is undeniably helpful from an evolutionary point of view to incorporate good ideas and keep out bad ones. Factual, correct knowledge has survival value: knowing what you can and cannot eat, where danger lurks, and whom you can trust once determined whether you survived or not. How, then, can we explain that millions of people often hold several completely wrong ideas?
I think this is only possible if the unwarranted beliefs are not immediately harmful to the host. You rarely hear that the Ebola virus is a fiction. Those infected with it have, on average, only a 50% chance of survival.3 Anyone who then believes that the government is making up an epidemic to control the population, and so takes no protective measures, is likely to die. The mortality rate for Covid-19 patients is much lower than for those with Ebola, so not believing in the existence of the Covid-19 virus has less fatal consequences. The same goes for belief in a flat earth or the idea that the 9/11 attacks were an inside job.
Belief in false ideas may isolate people from the rest of society, but it doesn't have the same detrimental effect as being excluded from your tribe in a hunter-gatherer society. In addition, research has shown that the feeling of social exclusion pushes people even more towards conspiracy thinking and superstition.4 People simply build a new identity and (online) community with like-minded people. The more convinced they are of their vision, the more status and appreciation they receive from members of their new group.
But if irrational beliefs aren't immediately harmful to their host, should we even fight them?
The answer to that is undeniably “yes.” After all, individuals with unwarranted beliefs are part of a society, which can suffer a great deal: racism or misogyny causes harm in the targeted group, failure to vaccinate against measles may bring back a serious illness, and spreading false information has an impact on the fairness of elections, can disrupt democracy, and so on. Even if immunization against brain parasites comes too late for certain individuals, strengthening the mental immune system of society as a whole is of the utmost importance. So, what can we do to accomplish that?
Prebunking as a Promising Vaccine
The traditional way of dealing with misinformation is the debunking tactic: you refute unwarranted claims with facts. This method certainly has its merits: if no one were to react, it would appear that the incorrect information is accepted or has value. We’re morally obliged to correct falsehoods. You will not change the minds of the enthusiastic adherents of the false ideas by debunking them, but it might convince people who are uncertain or undecided.5 The biggest problem with debunking, of course, is that it always happens after erroneous information is already widespread. In situations where misleading information can already have an impact in the short term, for example in the event of an acute health crisis, elections, or referendums, debunking comes too late.
A relatively new method is prebunking: media users are warned that certain information can be misleading, are preemptively exposed to mild forms of misinformation, and are shown why that communication is misleading or incorrect. This makes the “red flags” easier to recognize and makes people less susceptible to erroneous ideas. Recent studies all point in the same direction: prebunking appears to be an effective method of building mental immunity against misinformation.6,7
However, we have to take a number of things into account. Firstly, it’s impossible to anticipate all possible (future) fake news. Prebunking will therefore have to focus primarily on exposing misleading mechanisms rather than warning against specific pieces of incorrect information. This generalizing approach is a strength because it applies to many different types of misinformation, but it can also be a weakness when people fail to recognize the misleading techniques in a new example. A person may be motivated to be critical about some subjects but not others, for instance, because they are too attached to a belief.
Secondly, prebunking can also lead to a bidding war of deception. A spreader of misinformation can in turn warn that others will try to discredit his message through certain techniques, forcing prebunkers to anticipate the tactics of the malicious sender, and so on. As with debunking, winning the battle for people's minds can again become a question of who gets there first.
Finally, prebunking can also provoke resistance. Some argue that this technique warns against manipulation by manipulating people itself. It therefore matters greatly who applies the technique. A video in which Hillary Clinton warns about the plans of right-wing extremists to influence the 2024 presidential election sparked a lot of protest.8 By discrediting your opponent beforehand, you are manipulating the democratic process yourself and casting doubt on election results you might not like. Because Clinton herself has interests at stake in this case, her claim loses credibility. Such applications of prebunking can breed resistance and thus reduce its effectiveness. Prebunking is therefore best done by impartial, reliable institutions.
The Role of Science and Education
Worldwide, teachers, professors, and scientists are among the most trusted professions.9,10,11 They work in a non-profit domain, are not directly affiliated with political-ideological organizations, and the information they distribute usually only serves the public interest. Thus, academia and education play the most crucial role in building the mental immunity of society.
Universities can develop tools and methods for this and test their effectiveness. In teacher education, students must be trained to use these techniques in classroom practice at all levels of education. The quality of the teacher therefore matters a great deal: those who hold unfounded views themselves will not be able to teach children and youngsters the right skills.12
Once curricula are developed, teachers have about 12 to 15 years to build up mental immunity in their students. From an early age, children come into contact with all kinds of irrational and baseless beliefs, such as belief in the tooth fairy, Santa Claus, magical fairytale characters, and, if raised religiously, belief in a god, the devil, angels, the afterlife, creation, and so on. Usually, children are not encouraged to question these (religious) beliefs critically, which weakens their mental immune system from the start. Schools should therefore foster rational, critical, and investigative thinking skills and attitudes as early as possible.
Scientists can examine the long-term effects of such an educational program and compare its graduates with students who did not receive specific mental immunity training. This kind of research is necessary because current studies on prebunking measure effects only in the short term, and the impact of prebunking appears to diminish after a few months.13
After a school career with continuous attention to 'mental immune health', young people should be sufficiently resilient to unwarranted beliefs and misinformation.
The Role of the Media and Social Media Platforms
It goes without saying that, in addition to teachers, journalists must also receive high-quality training and work with integrity. However, they have a significant disadvantage. They usually work for media companies that are trying to stay profitable in a highly competitive market. This can have an impact on how information is presented. It must capture the attention of the largest possible audience and for digital media, it is important that a news item is clicked, liked, shared, and retweeted as much as possible. The perception is that news media themselves have an interest in the type of information they offer. This may explain why there is (much) less trust in journalists than in educators.14
While most people trust traditional media such as news channels and newspapers more than social media, we are seeing a decline in traditional media use and an increase in social media use, especially among the younger generations. Social media platforms themselves indicate that they want to invest less and less in regular news and instead focus on offering content created by private users. This completely eliminates the gatekeeping function of traditional media. Anyone can distribute any kind of news without a guarantee of quality or without reliability checks.15
In addition, people who have little faith in the news and hold unwarranted beliefs are more likely to use social media.16,17,18
Research shows that potentially harmful content, such as hate speech, misinformation, and highly polarizing content, is quickly shared via these platforms. Facebook and Twitter apply fact-checking (debunking), but sometimes only days after the distribution peak of the misinformation.19
During the Covid-19 pandemic, Facebook, Twitter, and YouTube already offered a link to the CDC website when people searched their platforms for information about the coronavirus, but what else can social media platforms do to prevent the spread of misinformation and fake news? One of the most promising possibilities is the use of “accuracy prompts”: when users want to share a message, they are asked whether the information is reliable and whether they are sure they want to share it, or they are shown a video with tips on how to assess the reliability of information. Studies show that this method can reduce the willingness to share misinformation by as much as 20%.20,21
The more people act on their convictions, the stronger and less changeable that conviction becomes. Getting them to think about the trustworthiness of a post before liking, sharing, or commenting on it is an important step in weeding out misinformation on social media.
The Possibilities of Mental Immunity Theory
Given the importance of accurate and rational views for each individual and society in general, building mental immunity is crucial. It will therefore require continuous dedication from many actors, in particular scientists, journalists, policymakers, and educators. Because many have financial or ideological interests in spreading misinformation, I believe that education will play a central role in strengthening the mental resilience of current and future generations. A small research project to reduce irrational beliefs in my own students—one involving the teaching of scientific methods, deconstructing pseudoscience, and critical thinking exercises—shows some positive results.
1. Norman, A. Mental Immunity: Infectious Ideas, Mind-Parasites, and the Search for a Better Way to Think (Harper Wave, 2021), 33.