A study from the US found that belief in conspiracy theories is rife in health care. The investigators presented people with six different conspiracy theories, and the one that was most widely believed was the following:
THE FOOD AND DRUG ADMINISTRATION IS DELIBERATELY PREVENTING THE PUBLIC FROM GETTING NATURAL CURES FOR CANCER AND OTHER DISEASES BECAUSE OF PRESSURE FROM DRUG COMPANIES.
A total of 37% agreed with this statement, 31% had no opinion on the matter, and just 32% disagreed. What is more, belief in this particular conspiracy correlated positively with the use of alternative medicine.
The current popularity of so-called alternative medicine (SCAM) is at least partly driven by the conviction that a sinister plot by the FDA, or 'the establishment' more generally, is preventing people from benefitting from the wonders of SCAM.
But where do those conspiracy theories come from?
How do they evolve?
A new article investigates these questions. Here is its abstract:
Although conspiracy theories are endorsed by about half the population and occasionally turn out to be true, they are more typically false beliefs that, by definition, have a paranoid theme. Consequently, psychological research to date has focused on determining whether there are traits that account for belief in conspiracy theories (BCT) within a deficit model. Alternatively, a two-component, socio-epistemic model of BCT is proposed that seeks to account for the ubiquity of conspiracy theories, their variance along a continuum, and the inconsistency of research findings likening them to psychopathology. Within this model, epistemic mistrust is the core component underlying conspiracist ideation that manifests as the rejection of authoritative information, focuses the specificity of conspiracy theory beliefs, and can sometimes be understood as a sociocultural response to breaches of trust, inequities of power, and existing racial prejudices. Once voices of authority are negated due to mistrust, the resulting epistemic vacuum can send individuals “down the rabbit hole” looking for answers where they are vulnerable to the biased processing of information and misinformation within an increasingly “post-truth” world. The two-component, socio-epistemic model of BCT argues for mitigation strategies that address both mistrust and misinformation processing, with interventions for individuals, institutions of authority, and society as a whole.
This makes a lot of sense to me, and it seems to apply well to BCT in the realm of SCAM.
To mitigate BCT, the authors advocate asking:
- Who do you trust or mistrust and why?
- How do you decide what to believe?
Effective mitigation strategies, they state, may necessitate wholescale approaches that:
- confer resistance against BCT by utilizing inoculation strategies that counter misinformation where it occurs (e.g. online),
- teach analytic thinking within educational systems at an early age,
- restructure or otherwise impose restrictions on the digital architectures that distribute information in order to label or curb misinformation and promote "technocognition".
These are no small challenges, and I am proud to say that, in the realm of SCAM, I am doing what I can to tackle them.