It's appropriate that the episode of The Big Bang Theory I watched last night featured, as part of a kind of nerd Olympics, a competitive game of Where's Wally (or, to be precise, the US variant of the book, Where's Waldo? Why did they change the name?), in which contestants were handicapped by playing without their glasses. There's something very Where's Wally?-like about my topic today, which is the puzzle of where a quantum particle like an electron or a photon is when you aren't looking at it.
Here's the thing. Unless you observe it and pin it down, a quantum particle's location is fuzzy. The position is described by a wave function, a solution of the Schrödinger equation, which tells you the probability of finding the particle in any location. But this isn't like saying I can give a probability for where I am in the house, because in practice I will actually be in one specific place at any one time. In the case of the quantum particle, the probability is all there is. The particle literally doesn't have a fixed location; the best we can say is that there are various probabilities of where we would find it if we looked.

Young's slits 
So far so good. This fuzziness of location is important, because it explains why it is easy to think that quantum particles are, in fact, waves. The classic example is Young's slits. A couple of hundred years ago, Thomas Young shone light through a pair of slits so that the beams overlapped on a screen behind them. The result was a series of light and dark bands. This was taken to show that light is a wave, because waves 'interfere' with each other. If two waves meet and both are rippling up at the same point, the result is a bigger wave. If one is rippling up and the other down, they cancel each other out. For light, this produces those dark and light bands.
The only thing is, we now know that light can be described as quantum particles: photons. And even if you send those photons one at a time through Young's slits, the interference pattern still builds up. The same goes for other quantum particles, such as electrons, which produce exactly the same effect.
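The distinction at the heart of all this can be made concrete with a toy calculation (my own illustration, with made-up parameters; nothing here comes from Young or from the book under review). If you add the two slits' complex amplitudes and then square, fringes appear; if you instead add the two probabilities, as 'particle' intuition suggests, they don't.

```python
import numpy as np

# Toy model of Young's slits: each slit contributes a complex amplitude
# at a point x on the screen, with a phase set by the path length from
# that slit. All numbers are illustrative, not physical values.
wavelength = 1.0
slit_separation = 5.0
screen_distance = 100.0

x = np.linspace(-40, 40, 401)  # positions across the screen

def amplitude(x, slit_offset):
    """Complex amplitude reaching screen position x from one slit."""
    path = np.hypot(screen_distance, x - slit_offset)
    return np.exp(2j * np.pi * path / wavelength) / path

a1 = amplitude(x, -slit_separation / 2)
a2 = amplitude(x, +slit_separation / 2)

# Quantum rule: add amplitudes first, then square -> interference fringes
quantum = np.abs(a1 + a2) ** 2

# 'Particle' intuition: add the two probabilities -> a smooth blur, no fringes
classical = np.abs(a1) ** 2 + np.abs(a2) ** 2

ratio = quantum / classical
print(ratio.min(), ratio.max())  # close to 0 (dark fringe) and 2 (bright fringe)
```

The point of the ratio is that wherever it dips towards zero, the probability-adding picture predicts light but the amplitude-adding picture predicts darkness, which is exactly what the single-photon experiments show.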
The reason I bring this up now is that I am currently reading for review a book called The Quantum Divide, which sits about halfway between a popular science book and a student text, and which takes some offence at the way this quantum strangeness is often portrayed. What popular science books frequently say, and I think I have been guilty of this, is that a particle goes through both slits (i.e. is in more than one place at a time) and interferes with itself. Gerry and Bruno (I'm not being overly familiar, these are surnames), the authors of the book, take serious umbrage at this wording:
We hasten to emphasize that quantum mechanics does not actually say that an electron can be in two places at once, hence the use of the proviso that quantum mechanics only superficially appears to allow the electron to be in two places at once.
And just to be clear why they are having to make this distinction:
Quantum theory does not predict that an object can be in two or more places at once. The false notion to the contrary often appears in the popular press, but is due to a naïve interpretation of quantum mechanics.
(Their emphasis in both cases.) In one way I am very grateful to G&B, as I will be more careful with my wording as a result. But on the other hand, I think this typifies how scientists trying to present science to the general public can get a bad press. They no doubt think what they are doing is emphasizing their precision, but it comes across as the worst kind of academic bitching. More seriously, I think G&B are in danger of throwing the baby out with the bathwater. All descriptive models of something as counterintuitive as quantum theory are inevitably approximations. What they are really objecting to here is someone else's language, even though it gets the basic point across better than their version in some ways.
The idea that, say, a photon goes through both slits and interferes with itself is technically inaccurate. The photon is not in any place with certainty; it is described only by its wave function, which gives a probability of its being located at each slit. And the interference is of that probability wave, not of the photon itself. However, it is arguable that the probability wave is the photon: that it is the only meaningful description of the photon, and as such, if we say that the probability wave has values at both slits and interferes with itself, we are surely not stretching things too far to replace the clumsy 'the probability wave' with 'the photon'.
Okay, it's not perfectly accurate, and we should certainly explain what we are doing, and probably often fail to do so. But I don't think this is any more of a problem than when physicists speak of the big bang or dark matter as if they are facts, rather than our current best accepted theories. To make a big thing of it, as G&B do, is, frankly, to miss the point.