
Where's quantum Wally?

It's appropriate that the episode of The Big Bang Theory I watched last night featured, as part of a kind of nerd Olympics, a competitive game of Where's Wally (or, to be precise, the US variant of the book, Where's Waldo? - why did they change the name?) in which contestants were handicapped by playing without their glasses. There's something very Where's Wally?-like about my topic today, which is the puzzle of where a quantum particle like an electron or a photon is when you aren't looking at it.

Here's the thing. Unless you observe it and pin it down, a quantum particle's location is fuzzy. The position is described by the Schrödinger equation, which tells you the probability of finding the particle in any location. But this isn't like saying I can give a probability for where I am in the house, because in practice I actually will be in one specific place at any one time. In the case of the quantum particle, the probability is all there is. The particle literally doesn't have a fixed location; the best we can say is that there are various probabilities of where we would find it if we looked.

Young's slits
So far so good. This fuzziness of location is important, because it explains why it is easy to think that quantum particles are, in fact, waves. The classic example is Young's slits. A couple of hundred years ago, Thomas Young shone light through a pair of slits so that the beams merged on a screen behind them. The result was a series of light and dark bands. This was used to show that light is a wave, because waves 'interfere' with each other. If two waves meet and they are both rippling up at the same point, the result will be a bigger wave. If one is rippling up and the other down, they will cancel each other out. And for light this would produce those dark and light bands.

The only thing is, we now know that light can be described as quantum particles - photons. And even if you send those one at a time through Young's slits, the interference pattern still builds up. The same goes for other quantum particles like electrons, which produce exactly the same effect.
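You can see this one-photon-at-a-time buildup in a toy simulation. The sketch below is purely illustrative, not a real physical model: the slit separation, wavelength and screen distance are arbitrary made-up numbers, and each 'photon' is just a random position drawn from the two-slit interference probability. The fringes still emerge from individual detections, which is the point.

```python
import math
import random

# Toy model of Young's slits with single photons. All units are
# arbitrary, chosen only so that a few fringes fit on the screen.
WAVELENGTH = 1.0     # arbitrary units
SLIT_SEP = 20.0      # distance between the two slits
SCREEN_DIST = 100.0  # slit-to-screen distance

def intensity(x):
    """Relative probability of detecting a photon at screen position x.

    Each slit contributes a unit amplitude; the detection probability is
    the squared magnitude of their *sum*, so the amplitudes interfere.
    Uses the usual small-angle approximation for the path difference.
    """
    delta = SLIT_SEP * x / SCREEN_DIST          # path-length difference
    phase = 2 * math.pi * delta / WAVELENGTH    # relative phase at x
    amp_real = 1 + math.cos(phase)
    amp_imag = math.sin(phase)
    return amp_real ** 2 + amp_imag ** 2        # ranges from 0 to 4

def detect_photon(rng, x_max=10.0):
    """Sample one photon arrival position by rejection sampling."""
    while True:
        x = rng.uniform(-x_max, x_max)
        if rng.uniform(0, 4.0) < intensity(x):  # 4.0 is the peak intensity
            return x

rng = random.Random(42)
hits = [detect_photon(rng) for _ in range(2000)]

# Bin the arrivals: bright and dark fringes build up one photon at a time.
bins = [0] * 20
for x in hits:
    bins[min(19, int(x + 10.0))] += 1
print(bins)
```

Run it and the histogram shows clear peaks (bright fringes) separated by near-empty bins (dark fringes), even though every photon arrived individually.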

The reason I bring this up now is that I am currently reading for review a book called The Quantum Divide, which is about halfway between a popular science book and a student text, and which takes some offence at the way this quantum strangeness is often portrayed. What popular science books frequently say, and I think I have been guilty of this, is that a particle goes through both slits (i.e. is in more than one place at a time) and interferes with itself. Gerry and Bruno (I'm not being overly familiar, these are surnames), the authors of the book, take serious umbrage at this wording:
We hasten to emphasize that quantum mechanics does not actually say that an electron can be in two places at once, hence the use of the proviso that quantum mechanics only superficially appears to allow the electron to be in two places at once.
And just to be clear why they are having to make this distinction:
Quantum theory does not predict that an object can be in two or more places at once. The false notion to the contrary often appears in the popular press, but is due to a naïve interpretation of quantum mechanics.
(Their emphasis in both cases.) In one way I am very grateful to G&B as I will be more careful with my wording as a result of this. But on the other hand, I think this typifies how scientists trying to present science to the general public can get a bad press. They no doubt think what they are doing is emphasizing their precision, but this comes across as the worst kind of academic bitching. More seriously, I think G&B are in danger of throwing the baby out with the bathwater. All descriptive models of something as counter-intuitive as quantum theory are inevitably approximations - what they are really doing here is not liking someone else's language, even though it gets the basic point across better than their version in some ways.

The idea that, say, a photon goes through both slits and interferes with itself is technically inaccurate. The photon is not in any place with certainty, but is only described by its wavefunction, which gives a probability of its being located at each slit. And the interference is of that probability wave, not the photon itself. However, it is arguable that the probability wave is the photon - that it is the only meaningful description of the photon - and as such, if we say that the probability wave has values at both slits and interferes with itself, we are surely not stretching things too far to replace the clumsy 'the probability wave' with 'the photon'.
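The distinction being fought over can be put in a few lines of arithmetic. In this sketch (the unit amplitudes and phases are arbitrary choices for illustration), adding the probabilities from each slit, as a classical particle picture would suggest, always gives the same answer; adding the amplitudes first and then squaring gives values that swing between reinforcement and complete cancellation - the interference:

```python
import cmath
import math

def amplitudes(phase):
    """Unit amplitudes from slit 1 and slit 2 with a relative phase.

    Illustrative only: real amplitudes depend on geometry, but the
    relative phase is all that matters for the interference pattern.
    """
    return 1 + 0j, cmath.exp(1j * phase)

for phase in (0.0, math.pi / 2, math.pi):
    a1, a2 = amplitudes(phase)
    classical = abs(a1) ** 2 + abs(a2) ** 2  # add probabilities: always 2
    quantum = abs(a1 + a2) ** 2              # add amplitudes first: 0 to 4
    print(f"phase={phase:.2f}  classical={classical:.2f}  quantum={quantum:.2f}")
```

At a phase of zero the amplitudes reinforce (a bright fringe); at a phase of pi they cancel entirely (a dark fringe), something no amount of adding positive probabilities can produce. That cancellation is the sense in which 'the photon interferes with itself' earns its keep as a shorthand.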

Okay, it's not perfectly accurate, and we certainly should explain what we are doing and probably often fail to do so. But I don't think this is any more of a problem than when physicists speak of the big bang or dark matter as if they were facts, rather than our current best accepted theories. To make a big thing of it, as G&B do, is, frankly, to miss the point.

