In this article, I step outside my usual writing domain, which typically explores Chinese Medicine or Sinology. Since this topic still falls within the broader field of health, a field closely related to Chinese Medicine, I felt compelled to write this piece. I hope it expands our collective knowledge and has a positive impact on all of us.
There are many tutorials circulating on social media whose accuracy should, in fact, be questioned, researched, and critically examined. Yet, many people are swayed to follow them without further thought or investigation.
Not all health advice on social media is wrong or harmful. Much of it is genuinely useful, especially when it comes from legitimate health professionals. A substantial amount, however, is inaccurate or entirely false, and ironically, many audiences end up following the wrong advice rather than the sound guidance available to them.
And this issue is not exclusive to conventional medicine. It is also frequently seen within Chinese Medicine.
Following the Wrong Advice
In treatments labeled “traditional,” such as TCM, these kinds of claims often appear:
• “If you have back pain, take XXX, and you will be cured.”
• “So, you have a chronic illness? No need to see a healthcare professional—just do YYY, and you’ll definitely recover.”
I frequently come across such statements while scrolling through social media.
What results do those who follow this advice actually experience? The outcomes vary. Some see no improvement at all—none of the results they had hoped for. Some worsen. Some even develop new health problems. And yet, there are also those who claim to have recovered completely.
So, does that mean we should follow such health tips? How could we not, if they seem to provide healing?
But perhaps another question emerges:
Aren’t there other possibilities besides full recovery? Such as no benefit at all—or worse, the condition deteriorating, or entirely new illnesses developing?
What if, instead of receiving the hoped-for positive outcomes, we become trapped in negative ones? After all, we cannot control which outcome we receive.
This becomes a serious risk when health tips are followed blindly, without a critical mindset.
Scientific Research Explains Why Audiences Are Prone to Follow False Health Tips
In a paper titled “Misinformation as a Misunderstood Challenge to Public Health” published in the American Journal of Preventive Medicine, Southwell et al. argue that misinformation spreads through social, cognitive, and technological pathways, making it increasingly difficult to control. The paper emphasizes that misinformation persists not merely due to ignorance, but because of the interplay between human psychology, social structures, and digital media dynamics.
In addition, individuals with low health literacy tend to overestimate their understanding, rendering them more susceptible to false claims. This phenomenon is known as the Dunning-Kruger Effect.
Negativity Bias also plays a role. Misinformation rooted in fear—such as “This food causes cancer!”—spreads rapidly because the human brain is wired to prioritize threats.
Another factor is the Illusory Truth Effect, wherein repetition makes false claims seem true. For instance, “Vitamin C cures COVID” was repeated so often that many people came to believe it.
Clearly, false health claims persist and propagate through a multitude of channels—social, cognitive, and technological. Let us first examine the cognitive pathways: how the human brain itself facilitates misinformation.
I. Cognitive Pathways
The brain relies on shortcuts known as cognitive heuristics to process information rapidly. But these mental shortcuts come with trade-offs—they also make people more likely to accept and spread misinformation.
Cognitive heuristics are typically divided into two types:
- Availability Heuristic: People assess the truth of information based on how easily examples come to mind.
- Affect Heuristic: Content that evokes strong emotions (e.g., anger, fear, hope) overrides rational analysis.
Further complicating matters, individuals often remember a claim while forgetting its source, detaching the misinformation from its questionable origins. This is known as source amnesia.
II. Social Pathways
Now we move to the question: How do communities spread misinformation?
Our social environments heavily shape how we think and make decisions. In the case of health misinformation, individuals who participate in “like-minded groups” (whether online or offline) tend to absorb whatever information is shared within that group without filtering it—simply because it aligns with their pre-existing beliefs. When such messages are delivered by close friends, family members, or trusted peers, the belief in them grows stronger.
Moreover, people often trust and follow influencers on social media without questioning whether the information shared is accurate or not.
Interestingly, many individuals share these messages as acts of goodwill, believing they are helping others. Their intentions are noble, but good intentions do not guarantee harmless outcomes.
III. Technological Pathways
Technology amplifies the speed and reach of misinformation to a degree unimaginable in the past, and with time this effect may grow even stronger.
This is largely due to what’s called Algorithmic Amplification.
Technology, including algorithms, is a double-edged sword. Personally, I value technology greatly—it has allowed many things to be realized that were once impossible. However, the same algorithms that help us also enable misinformation to flourish.
As we know, social media platforms prioritize engagement—likes, shares, and comments. This model, while effective for visibility, tends to favor content that is sensational and emotionally provocative. The human brain naturally gravitates toward this type of content.
Compare these two titles:
- “Availability: A Heuristic for Judging Frequency and Probability”
- “Cognitive Heuristics Make People Trapped in Misinformation and Easily Brainwashed!”
Setting aside the accuracy of the titles, which one stirs more emotion? The second, undoubtedly. It activates the Affect Heuristic, provoking reactions and engagement—which is precisely how misinformation spreads.
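As an illustration only, and not any real platform's actual algorithm, the engagement-first ranking described above can be sketched as a toy Python model. The posts, their "emotional intensity" scores, and the accuracy values are all hypothetical numbers chosen for the example:

```python
# Toy model of engagement-based ranking (illustrative only; real
# platform algorithms are far more complex and proprietary).
posts = [
    {"title": "Availability: A Heuristic for Judging Frequency and Probability",
     "emotional_intensity": 0.1, "accuracy": 0.95},
    {"title": "Cognitive Heuristics Make People Trapped in Misinformation "
              "and Easily Brainwashed!",
     "emotional_intensity": 0.9, "accuracy": 0.40},
]

def predicted_engagement(post):
    # Engagement tracks emotional pull, not accuracy: this ranking
    # function never consults post["accuracy"] at all.
    return post["emotional_intensity"]

feed = sorted(posts, key=predicted_engagement, reverse=True)
print(feed[0]["title"])  # the sensational title ranks first
```

The point of the sketch is structural: because accuracy never enters the ranking function, the sensational title wins visibility regardless of how true it is.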
Then comes the issue of filter bubbles: algorithms begin feeding users more of what they already believe. This deepens the echo chamber effect. If someone initially believes a false claim, then sees it repeatedly validated, how could they not believe it more strongly?
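That feedback loop can also be sketched as a minimal toy model (my own illustration, not a published simulation; the belief values and the exposure boost are arbitrary assumptions): each time the user engages, the recommender serves the claim again, and each repeated exposure nudges belief upward.

```python
# Toy filter-bubble loop: repeated, algorithmically reinforced
# exposure strengthens belief in a claim (illustrative numbers only).
belief = 0.5           # initial credence in a false claim (0..1)
exposure_boost = 0.1   # fraction of remaining doubt converted per exposure

for step in range(5):
    # The recommender shows the claim again *because* the user
    # engaged with it last time -- the filter-bubble step.
    if belief > 0.3:   # crude engagement threshold
        belief = min(1.0, belief + exposure_boost * (1 - belief))
    print(f"after exposure {step + 1}: belief = {belief:.2f}")
```

Running it, belief climbs from 0.50 toward certainty with every pass through the loop, which is the echo-chamber dynamic in miniature: the system keeps confirming what the user already accepts.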
When the cognitive, social, and technological pathways converge, they form a vicious cycle:
- Cognitive biases make individuals vulnerable.
- Social networks validate and reinforce false beliefs.
- Technology ensures misinformation reaches more people, faster.
Any one of these pathways can independently lead someone to believe misinformation. Together, they become even more formidable. Quite remarkable, isn’t it?
But That Doesn’t Mean There’s No Way Out
Southwell et al. not only described how misinformation spreads but also proposed solutions.
One effective strategy is to boost health literacy and critical thinking skills.
Their study shows that individuals with low health literacy are most easily deceived—this aligns with the Dunning-Kruger Effect, in which those with limited competence overestimate their understanding, while true experts often underestimate theirs. Have you ever encountered someone from outside the health profession who speaks theatrically and assertively about health—as though they know more than actual professionals? Conversely, you may notice that credible medical practitioners tend to communicate calmly, methodically, and cautiously. Ironically, it is often the overconfident non-experts who gain more attention than qualified experts, who are dismissed as “elitist.”
This calls for reflection: reflexively rejecting every piece of content on social media is not the right approach either.
We must avoid being gullible, but we must also avoid rejecting information harshly—especially when emotionally charged language is involved. In a polarized environment, harsh rejection can trigger unintended effects:
- The Streisand Effect: Attempts to suppress misinformation can cause it to spread more widely.
- Persuasion Irony: Strongly framed warnings can attract more believers (e.g., labeling something as “forbidden knowledge” can increase its allure among conspiracy theorists).
Example case:
- Misinformation: “XXX causes cancer.”
- Rebuttal: “This is dangerous nonsense!”
This response is ineffective.
Why? Because those who already believe the claim interpret the rebuttal as censorship—strengthening their belief and identity as “truth-seekers.”
Therefore, filtering misinformation must be done gracefully. We must persuade gently—even ourselves—when we suspect we may have absorbed falsehoods. Do not accept blindly. But also, do not reject aggressively. Instead, analyze. Investigate the facts. Evaluate the source. Is the “study” peer-reviewed?
Remember, there is a wealth of accurate, helpful information on social media—shared by credible professionals. It would be a shame to dismiss it all.
Perhaps some of you would like to share your thoughts? Feel free to contribute in the comments with reflections that are positive, thoughtful, and constructive. 😊
Disclaimer:
This article is for informational and educational purposes only and does not constitute medical advice, diagnosis, or treatment. The views expressed reflect an analysis of how health misinformation spreads across social, cognitive, and technological domains. Readers are encouraged to consult licensed healthcare professionals before making any health-related decisions. The author does not endorse or condemn any specific health practices discussed, but rather invites critical thinking and reflection.
References:
1. Southwell, B. G., Thorson, E. A., & Sheble, L. (2018). Misinformation as a misunderstood challenge to public health. American Journal of Preventive Medicine, 55(2), 162–169. https://doi.org/10.1016/j.amepre.2018.04.010
2. Tversky, A., & Kahneman, D. (1973). Availability: A heuristic for judging frequency and probability. Cognitive Psychology, 5(2), 207–232. https://doi.org/10.1016/0010-0285(73)90033-9
3. Slovic, P., Finucane, M. L., Peters, E., & MacGregor, D. G. (2007). The affect heuristic. European Journal of Operational Research, 177(3), 1333–1352. https://doi.org/10.1016/j.ejor.2005.04.006
4. Nyhan, B., & Reifler, J. (2010). When corrections fail: The persistence of political misperceptions. Political Behavior, 32(2), 303–330. https://doi.org/10.1007/s11109-010-9112-2
5. Lewandowsky, S., Ecker, U. K. H., Seifert, C. M., Schwarz, N., & Cook, J. (2012). Misinformation and its correction: Continued influence and successful debiasing. Psychological Science in the Public Interest, 13(3), 106–131. https://doi.org/10.1177/1529100612451018
6. Jolley, D., & Douglas, K. M. (2017). Prevention is better than cure: Addressing anti-vaccine conspiracy theories. British Journal of Social Psychology, 56(3), 455–470. https://doi.org/10.1111/bjso.12158
7. Dunning, D., & Kruger, J. (1999). Unskilled and unaware of it: How difficulties in recognizing one’s own incompetence lead to inflated self-assessments. Journal of Personality and Social Psychology, 77(6), 1121–1134. https://doi.org/10.1037/0022-3514.77.6.1121