
How to deal with misinformation in the medical world

Presented by: Tom van Bommel, Unravel Research, the Netherlands
Conference: DDD 2025
DOI: https://doi.org/10.55788/c1d1394a
Misinformation is ubiquitous, and medical professionals struggle to manage the harmful effects that influencers and other sources of misinformation have on their patients. Understanding the psychological mechanisms behind misinformation can help us guide patients back toward evidence-based medicine.

“Our brains are lazy,” psychologist Tom van Bommel (Unravel Research, the Netherlands) began his talk on the psychology behind misinformation [1]. “Most of the time, the brain tries to make decisions with as little effort as possible. Over time, it has developed various decision-making shortcuts, which, although often useful, may also lead to poor judgment.” One unfortunate consequence of how we process information is the adoption of misinformation. In the medical world, healthcare practitioners often encounter a wall of misinformation when dealing with patients. Is there a way to guide patients and their networks to embrace accurate information about their illnesses?

Van Bommel mentioned 2 key factors that make patients particularly susceptible to misinformation: first, a perceived loss of control; second, being confronted with a shocking message or life event that causes significant changes. “A typical example is a job loss,” according to van Bommel. “During times like these, people are more inclined to fall for conspiracy theories.” Importantly, restoring a sense of control appears to relieve the insecurity and mental imbalance a person experiences. Many social media influencers offer simple solutions to patients in search of control; by following their advice, patients regain a sense of stability. “However, the reality is often more nuanced and not as straightforward as these influencers portray,” said van Bommel.

Van Bommel explored the causes of adopting misinformation in more detail, sharing 3 cognitive biases that are particularly relevant to this matter. First, confirmation bias describes the tendency to seek evidence that aligns with one’s current beliefs or opinions rather than evidence that contradicts them. “This is one of the biggest drivers of misinformation and conspiracy theories,” said van Bommel. In the current digital age, finding information that aligns with personal views has become easier than ever. Even AI-driven tools such as ChatGPT are vulnerable to confirmation bias: questions posed to ChatGPT are frequently framed to confirm a belief, such as asking, ‘Is there research that supports statement X?’ According to van Bommel, ChatGPT may find evidence for the statement, but that does not necessarily mean the statement is correct. One way to counteract confirmation bias is discussion within social groups; bringing together people with diverse opinions to discuss an issue can help uncover the truth. “However, in today’s society, people tend to remain within their own bubble of like-minded people,” van Bommel added.

He further emphasised that patients often come to clinical practice with misinformation-based beliefs anchored in their identity. “Avoid being judgmental, show interest in their (mis)information, and inquire about the sources of information,” he advised. If a physician immediately offers a counterargument, the patient will likely reject it. It also helps to first ask for the patient’s permission to share alternative information, which makes them more open to the information provided.

A second bias mentioned by van Bommel is the so-called negativity bias: negatively framed information draws more attention and is more often perceived as true. A common trait of conspiracy theories and misinformation is their strongly negative framing. To counter such narratives, van Bommel suggested that it may be more effective to reframe the discussion using equally negative, but accurate, information. “For example, to counter a negative misconception, such as the damaging effects of sunscreen, it may be better to explain the harmful effects of the sun rather than the positive effects of sunscreen,” explained van Bommel. The brain is more susceptible to adopting negatively framed information than positively framed information.

The third bias involves so-called credibility boosters. Van Bommel explained that about 95% of decisions are made without active awareness, through automated, quick, emotional information processing, whereas only about 5% are based on conscious, logical decision-making. Credibility boosters can easily manipulate the unconscious system: repetition, authority, correlation-as-causation, linguistic fluency, the use of images in a message, and story-based information all lend credibility to a message in much of our decision-making. Medical influencers frequently use all of these credibility boosters. However, as van Bommel pointed out, healthcare professionals can apply them in their own communication with patients. “For example, if we want to explain the benefits of sunscreen to patients, we might use a story-based explanation, such as comparing sunscreen to a shield that blocks harmful rays. Without it, the skin has no protection, much like going into battle without armour,” said van Bommel.

At the end of his talk, van Bommel outlined 3 easy-to-use methods that may help healthcare practitioners dismantle misinformation-based belief systems. First, he advised asking patients to what extent they believe a source of information to be accurate; this activates the conscious, critical decision-making modules of the brain. Second, he suggested using the so-called myth sandwich technique to debunk misinformation: first communicate the accurate state of affairs, then address the myth, and finally repeat the correct information. For example, when addressing concerns about topical steroids, one might begin by affirming their effectiveness, briefly acknowledge common fears, and conclude by emphasising their safety and benefits. “Patients tend to be more willing to adopt new information right after the myth is debunked, and that is why we need to repeat the true statement,” clarified van Bommel. The third strategy is prebunking, which, according to van Bommel, is more effective than debunking. It consists of proactively informing patients about common myths before they encounter them. This approach works best during calm moments and builds resilience against misinformation that patients may encounter later, creating a so-called inoculation effect.

Van Bommel concluded on a positive note. “Research has shown that inoculation works well against misinformation [2]. The more patients know about cognitive biases, the less likely they are to fall for misinformation. It would be a good idea to teach high-school students about these biases to better prepare them for the pitfalls of misinformation.”

  1. Van Bommel T. The psychology behind misinformation. Dermatologendagen 2025, 10–11 April, Apeldoorn, the Netherlands.
  2. Roozenbeek J, et al. Sci Adv. 2022;8(34):eabo6254.

Copyright ©2025 Medicom Medical Publishers


