
How big is the moon?

A while ago I had a slightly surreal discussion with a radio presenter and the boss of the UK Space Agency. We two science types were trying to educate said presenter in one or two basics of solar system science. At one point I asked the presenter a simple question: if you were looking at the full moon and held a coin out at arm's length, which coin would be roughly the same apparent size as the moon?

What would you answer?

The most popular answers are 2p and 10p. Some opt for 5p. In fact it's a bit of a trick question. There are no coins small enough to be the same apparent size as the moon when held at arm's length. The nearest approximation is a hole punch hole. If you hold a piece of punched paper out at arm's length, that little hole is about the same size as the moon appears to be.
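If you want to check the numbers, here's a rough back-of-the-envelope sketch in Python. The moon's diameter and distance are well-established figures; the arm's length (about 70 cm), the hole-punch hole (about 6 mm) and the coin diameters are assumed approximate values.

```python
import math

# Rough angular-size comparison: the full moon versus things held at arm's length.

MOON_DIAMETER_KM = 3474        # diameter of the moon
MOON_DISTANCE_KM = 384_400     # average Earth-moon distance
ARM_LENGTH_MM = 700            # assumed arm's length of roughly 70 cm

def angular_size_deg(diameter, distance):
    """Angular diameter (degrees) of an object of a given size at a given distance."""
    return math.degrees(2 * math.atan((diameter / 2) / distance))

held_objects_mm = {            # assumed approximate diameters in mm
    "hole-punch hole": 6,
    "5p coin": 18,
    "10p coin": 24.5,
    "2p coin": 25.9,
}

print(f"Full moon: {angular_size_deg(MOON_DIAMETER_KM, MOON_DISTANCE_KM):.2f} degrees")
for name, diameter in held_objects_mm.items():
    print(f"{name} at arm's length: {angular_size_deg(diameter, ARM_LENGTH_MM):.2f} degrees")
```

The moon comes out at about half a degree, and so does the hole-punch hole; even the smallest coin is roughly three times too big, and a 2p about four times.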

This explains why photographs of the moon are so disappointing unless you apply a serious zoom. The photo here is a hurried snap taken on my phone because you don't often see the moon and a rainbow simultaneously. The (pretty well full) moon is that little white dot just above the bus shelter and to the right of the centralish tree. The camera tells us what our brain doesn't - it really does look small.

So why does it look bigger - in fact sometimes quite enormous (though never as big as Hollywood suggests)? This is where the UK Space Agency supremo and I diverged. He said it looks big near the horizon because of optical effects from the light passing through more atmosphere. There certainly are optical effects of this kind, but their influence on the apparent size is relatively small. Almost all the variation in the size of the moon (and it pretty well always looks bigger than it should) is down to the way our brain processes images.

We are so familiar with cameras that we tend to think the eye/brain combo works like a camera. It doesn't. The image we 'see' is a composite assembled by the brain from a whole host of processes. It is a fake construct. This should be obvious, because we don't see the blind spot, where the optic nerve passes through the retina and leaves a patch with no receptors, nor do we witness saccades, the fast, jumpy movements our eyes are always making. We see a fake image. One of the modules in the brain recognizes shapes, so the brain can give extra weight to a known shape like the moon. If it is near trees or other relatively close items on the horizon, we tend to see it as bigger - but this is only our brain's processor getting things wrong. Seeing really shouldn't be believing.

Comments

  1. Love this idea of seeing not being believing. Must slip that into a novel somewhere....

  2. I heard that the brain thinks the sky is shaped like an upturned bowl i.e. overhead objects are closer than objects on the horizon, so it 'corrects' by making things near the horizon look bigger.

    Whether this is true or not, many optical illusions supposedly depend on what the brain "thinks", even when we *know* the reality is different. I wonder if it's possible to train yourself not to see these illusions?

  3. It's an old idea, Sue - it's why some Ancient Greek philosophers argued you should not rely on observations but rather on logic and argument (hence why their science was mostly so rubbish).

    Anon - it's an interesting thought, whether you could train yourself not to see them. I'm not sure how you'd go about it, though. The brain is also quite good at sorting things out - for example if you wear inverting spectacles, eventually the brain will flip the image back the 'right' way up.

