
When is a conspiracy theory not a conspiracy theory?

There seems to be increasing support for the idea that the SARS-CoV-2 virus (Covid for short) pandemic started as a result of a (probably accidental) leak from the Wuhan laboratory. While the authorities express low confidence in whether or not this is true, the likes of the FBI now accept it as the most likely cause.

What's worrying about this is not that the scientific viewpoint has changed. Changing your theories to reflect new data is fundamental to science. In fact, one of the two biggest problems science has with switching to a new theory is not that views alter, but rather that many scientists who have built their careers on a particular theory are reluctant to change their minds, even when an alternative theory becomes the best supported by the evidence. (The other problem is that those who don't understand science, particularly in the media, see a change of mind as a weakness rather than the strength that it is.) No, what's worrying about the lab leak theory is the way it has been treated by leading scientists.

It was fine to doubt the lab leak theory and point out what was wrong with it - but many high-profile scientists labelled it a 'conspiracy theory'. And that was a serious error.

'Conspiracy theory' is a term of disapprobation, suggesting that those who hold the theory are at best misguided and at worst idiotic. Let's compare a few conspiracy theories with the lab leak hypothesis. Classic examples include the flat Earth, the idea that the moon landings were faked and the suggestion that world leaders (and the British royal family) are intelligent lizards in human suits. What makes these conspiracy theories is that there is strong evidence that the theory is not true, but no good evidence that it could be true. In many cases, there is also no particular benefit to be gained from all the effort that would be required to maintain the conspiracy. (Admittedly, the intelligent lizards would see a benefit in keeping their existence quiet.)

The lab leak theory was quite different. There was no strong evidence either way. There was some evidence for both the wet markets and a lab leak as the source of Covid, but nothing definitive. And it shows a poor understanding of human nature to claim, without good evidence, that a virus that originated in Wuhan - where there is a laboratory experimenting on exactly that kind of virus - couldn't have come from the lab.

Rather than being based on any science, suppression of the lab leak theory was considered important for political reasons - in a sense, this action genuinely was a conspiracy by the authorities: the kind of suppression of information that can sometimes be necessary in a national emergency, as is arguably the case with wartime propaganda. Whether it really was necessary here is extremely doubtful - but the mistake was to label the lab leak a conspiracy theory, rather than a theory with limited evidence to support it. 'We just don't know: we shouldn't speculate until we have better evidence' would have been a far better line than 'it came from the wet markets and to suggest otherwise is a conspiracy theory.'

Interestingly, when I was thinking of examples of real conspiracy theories, one that doesn't fit the mould described above quite so well is the assassination of JFK. Here again, the evidence for exactly what happened isn't particularly strong - so it is perhaps over-reacting to call alternative accounts of events conspiracy theories. We can say the Lee Harvey Oswald version has somewhat better evidence than the alternatives, but it's a topic that is never likely to have good enough detailed data for a definitively supported theory of what happened.

Now, unfortunately, those scientists who leapt to the conspiracy theory label are receiving a backlash - and it's science that suffers as a result, because those with an anti-science agenda can use this as a weapon against the scientific community. It's hard to say 'I don't know', especially when the media are calling you an expert. But, arguably, things would be better if more of the science community said it more often.

Image by Ben White from Unsplash.

See all of Brian's online articles or subscribe to a weekly digest for free here

