Thursday, 18 September 2014

Come on England, have some pride!

As the Scots go to the polls, I'd like to direct attention back home for a moment. The fact is, there have been some pretty unedifying scenes down here. We've seen political leaders and celebrities begging Scotland not to leave the union. I really don't understand why these people are so worked up. Perhaps they should put some effort into having pride in being English.

After all, in the grand scheme of things, England has a lot to be proud of - whether it's in cities and countryside, culture and heritage, literary fields, science or whatever. Take universities. It's interesting that the Guardian reported on Tuesday that 'Four British institutions ranked in top six of world's universities.' This is true - but it's also true that four English institutions ranked in top six of world's universities - because those four were Cambridge, Imperial College London, Oxford and University College London.

The fact is that England has around 84% of the population of the UK. For most of the world, England is the UK. Even if the entire union split up, England has much the same population now as the entire UK had in 1961. Losing Scotland would mean a less than 10% change in population. That hardly makes us a tiny nation.

Of course there would be losses if Scotland went independent, but there would be plenty of gains too - including the chance for English people (and Welsh and Northern Irish too - I just happen to be English) to feel like they have more of a proper identity. This is not about nationalism. One thing I've learned from the interviews of Scots during the run-up to the vote is how many said something to the effect of 'I'm not a nationalist - but I am proud to be a Scot.' For too long we've been scared that the only people who are proud to be English are right-wing thugs. But it shouldn't be like that.

It's too late to change what has happened already - but politicians and celebrities, shame on you. Let's let Scotland make the best decision for itself, and start to think about ourselves with as much pride as they do.

Wednesday, 17 September 2014

What's in a cereal?

The other morning I was staring at the back of a cereal packet on the breakfast table, as you do, and read the contents list. Nothing extraordinary, until I started to look at the numbers involved and discovered that Nestlé seems to have something in common with the X-Factor. They believe that it's possible to give 110%.

In fact there are two significant oddities in that ingredients list. One is the matter of nuts. Because it says that the product (Honey Nut Shredded Wheat, if you must know) contains 10.5% nuts when in fact it's only 0.3% - that's quite an error bar. This is because neither peanuts nor coconut are actually nuts. But we'll let them off, because there is probably some sort of convention that allows them to come under this heading. (It can't just be because they have 'nut' in their name, as 'Honey Nut Shredded Wheat' has 'nut' in its name. So if that were the rule, the contents should read '100% nuts'.)

But the more interesting oddity is the maths. You might wonder what the problem is. With 84.1% wheat, 10.5% nuts and 2.8% honey, that still allows 2.6% for the other bits and pieces. But ingredients lists don't work like that. They have to be specified in order of weight - so there is more sugar than there are nuts, the list just doesn't mention how much sugar. With a minimum of 10.6% sugar, that makes a minimum content of 108%.

We can get some idea of the quantity of sugar from the nutritional information. We are told that 100g of the product contains 15.9g of sugar - but we can't just take this number as the missing figure, as it will also include the sugar in the honey and molasses. So reasonably we can guess that the 'sugar' percentage is in the 10.6-12% range.
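To make the oddity concrete, here's the arithmetic as a quick sketch (the wheat, nuts and honey percentages are from the packet; the sugar figure is not on the label, it's the minimum implied by the list ordering, as above):

```python
# Percentages as declared on the Honey Nut Shredded Wheat label.
# The sugar figure is my assumption: ingredients are listed in
# descending order of weight and sugar appears above nuts, so there
# must be at least as much sugar as the 10.5% of nuts.
wheat, nuts, honey = 84.1, 10.5, 2.8
min_sugar = 10.6  # just above the nuts figure

total = wheat + nuts + honey + min_sugar
print(f"Minimum declared content: {total:.1f}%")  # 108.0%, before the minor ingredients
```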

So what is going on? Thankfully, Nestlé has been helpful on the subject and told me this:
The basic maths does not add up and unfortunately this situation is replicated across many foods as they try to comply with QUID (Quantitative Ingredient Declaration) legislation. The complication comes from the requirement to list the amount of ingredients as they are added to the formula at each step. It is called the ‘mixing bowl’ rules. 
In a simple process, this works well and the ingredients add up to 100%. In a process with many steps, and where moisture is lost in intermediate drying and toasting stages, the maths becomes more complex and illogical, and 100% is hard to achieve. Each product must be viewed in isolation, and its manufacturing method affects the final result as well as the ingredients used.
We have to comply with 'The Food Labelling Regulations 1996' and its amendments. There are two amendments which detail how we should declare the quantities of ingredients used, and the key requirement is in the second of these Amending Regulations, which states: 'Where the food has lost moisture as a result of treatment, the indication of quantity of the ingredient or category of ingredients used shall be expressed as a percentage which shall be determined by reference to the finished product'.
So there you have it. The percentages can't really be taken as sensible detailed information, just a broad brush guide. This doesn't, of course, explain why peanuts and coconuts are nuts (no doubt another regulation), or why there is no percentage against sugar - but it does help us understand what is going on to allow Nestlé (and other food manufacturers) to give 110%.

Tuesday, 16 September 2014

Central heating and the change in watching position for Dr Who

In a Facebook discussion of the most recent episode of Dr Who (yes, that's the kind of exciting social life I have), Matt Brown expressed (mock?) surprise that people didn't push the sofas against walls in the old days - and suddenly one of the greatest mysteries of the universe clicked into place. It's all about hiding behind the sofa. (If you aren't from the UK, you may need assistance from the Wikipedia page on the subject.)

When I was little, I did, genuinely, watch Dr Who from behind the couch (we weren't posh enough to call it a sofa), so that it was possible to hide when it got really scary. And I was not alone. Most of the young nation used to do this. Yet it is a practice that has pretty much entirely died out. Why?

I had assumed it was because the yoof of today is far more cynical and exposed to horrors that make Dr Who look wimpish in the extreme. But there was no doubt that this Saturday's episode, Listen, was suitable behind-the-sofa material, especially the bit with the bedspread right behind them (you have to have been there). If you haven't seen the episode and have access to BBC iPlayer, I recommend it. And then Mr Brown made that simple remark.

Because the fact is, these days, many people do push their sofas against the walls, while back then they tended not to. There could be various reasons for this - fewer squarish living rooms now, and much bigger TVs, for instance. But my suspicion is that it could be central heating related. Like much of the UK, we didn't have central heating when Dr Who first aired - in our case, not until 1966. Before then, on a winter evening, you didn't want your sofa miles away from the fire. So the seating tended to be pulled further into the room than it would be now.

Of course, this could be rubbish. But it's a theory. And even better, it's a Dr Who related nostalgic theory. What more could you ask?

Monday, 15 September 2014

The Room - review

Sorry, games again! But this is the last of the series.

After my recent dip into the nostalgia of game playing while reading the book on the makers of Doom, I just had to have a go at a game. There was a temptation to revisit the past and fire up a copy of the Seventh Guest or Doom itself (both available on Mac, though sadly my old favourite, the X-Wing series, isn't, so I would have to make do with Wing Commander III). And I may still do so, though as I pointed out in the piece on Netflix and games, I'm not sure I could make the time for serious playing any more.

However, while perusing 'best of' lists to see what's recommended on the Mac at the moment, I noticed some 'best on iPad' games and was tempted to spend the enormous sum of 69p on a game called The Room - and I am so glad I did.

If you ever played something like Seventh Guest, this is a bit like the puzzles without all the wandering around. The Room limits you to a single table - but on that table is the most gorgeous, complex puzzle box you ever saw. And if you complete it and open the box - another, even more wonderful box emerges. One, for instance, turns into a gorgeous planetarium and orrery.

It's a bit murky, but this is a part of the level 2 (or is it 3?) puzzle box. The device on the front is a complex clock that you need to get going. Every flap, button, knob and locked door will eventually contribute something. 
For me, this is the ideal game for the Netflix generation. You can do it a bit at a time (although it is extremely more-ish, and the temptation is to just do one more clue). And there's no frustrating dying and going back to the start. You can do whatever you like in whatever order it presents itself and it will either not work or take you on a step.

It's hard to describe the puzzles without giving too much away, but they range from simple physical discoveries along the lines of 'if I turn that bit it will open a door in which I will find something', through the need to build a gear chain to get some machinery running, to spotting an inscription on the back of a photograph that tells you in an obscure fashion how to position something you will discover later (and only be able to see through a special viewing glass). It is brilliant! And did I mention it was cheap? Even better, it's a couple of years old, so The Room 2 is waiting for when this one is completed.

There is a hint system, but most of the time you can make progress without it. I'm so glad I read that book...

Friday, 12 September 2014

The Toffler scorecard part 2 - weathering heavy seas

A little while ago I took a step into Alvin Toffler's bestselling 1970 book Future Shock to see how its vision of the future has held up. Here's the second instalment.

Perhaps the biggest danger was always going to be where science is involved, and in a chapter titled 'The Scientific Trajectory' we start off with a pair of unlikely projections.

The first concerns the oceans. As has often been observed, there are huge opportunities in the sea, particularly as we use up more and more land-based resources - and there is far more space there than on the land. So it was common back then to assume that we would see far more sea-based industry, and even underwater cities - and Toffler falls for it hook, line and sinker.

Toffler quotes Dr F. N. Spiess, head of the Marine Physical Laboratory of the Scripps Institution, as saying 'Within fifty years man will move onto and into the sea - occupying it and exploiting it as an integral part of his use of the planet for recreation, minerals, food, waste disposal, military and transportation operations, and, as populations grow, for actual living space.'

That 50 years is close to being up - but very few of these predictions are close to coming true. Yes, we make more use of underwater resources like oil and gas. But living on and in the sea is generally a very expensive and restrictive way of going about things, and there is no sign of it becoming commonplace. Toffler expected 'aqua-culture' to be as frequently used a term as agriculture by now. Maybe not.

I'm not quite sure why, but Toffler links his second dubious prediction to the first when he says 'The conquest of the oceans links up directly with the advance towards accurate weather prediction and, ultimately, climate control.' He quotes Dr Walter Orr Roberts, past president of the American Association for the Advancement of Science as saying 'We foresee bringing the entire globe under continuous weather observation by the mid-1970s - and at reasonable cost. And we envision, from this, vastly improved forecasting of storms, freezes, droughts, smog episodes - with attendant opportunities to avert disaster.' What they didn't realize was that the seeds of the failure of this prediction were already sown.

While it's true that weather forecasting has got a lot better since 1970, so has the understanding that we are never going to be able to predict weather more than a few days into the future. Through the 1970s and 80s an increased understanding of the nature of chaotic systems made it obvious that it doesn't matter how good Dr Roberts' worldwide weather observation is: the weather system is just too complex, too susceptible to small changes in initial conditions that produce huge changes down the line. I suppose I shouldn't be too hard on Toffler, as we still regularly see forecasts outside the 10 day window presented as 'fact', when a guess based on typical weather for the time of year is more accurate than such a forecast. But the confidence in the predictions on weather forecasting and climate control vastly misunderstood both the nature and scale of the problem.
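If you want to see that sensitivity for yourself, here's a toy sketch using the logistic map - the textbook example of chaos, nothing to do with real atmospheric models, but the principle is the same. Two starting points that differ by a ten-billionth end up completely different within a few dozen steps:

```python
# Sensitive dependence on initial conditions, illustrated with the
# logistic map x -> r*x*(1-x) at r=4 (fully chaotic). A toy example,
# not a weather model, but it shows why tiny measurement errors
# swamp any forecast beyond a certain horizon.
def logistic(x, r=4.0):
    return r * x * (1 - x)

a, b = 0.4, 0.4 + 1e-10  # identical to ten decimal places
for step in range(1, 51):
    a, b = logistic(a), logistic(b)
    if step % 10 == 0:
        print(f"step {step:2d}: a={a:.6f}  b={b:.6f}  gap={abs(a - b):.6f}")
```

By around step 40 the 'gap' is as big as the values themselves - the two futures have nothing to do with each other.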

Sorry Alvin - this one's a 100% fail.

Thursday, 11 September 2014

Netflix killed the video (game) star

Thanks to reading Masters of Doom, I've been in a contemplative, and probably rather nostalgic, mood about games over the last few days. I've stocked up on a couple of games as a direct result, but my suspicion is that I won't be playing them much. Certainly not as much as I once would have done. Why? There's a simple, one-word answer. Netflix.

Here's the thing. There are broadly two types of gamer: the teen gamer who builds his/her life around game playing, and the adult gamer who plays games when they've nothing better to do. I've primarily been the latter. Apart from anything else, computer games didn't exist when I was a teen. The first time I ever played one was running Adventure on the George III ICL system at Lancaster, but by then I was already 21.

Although at my gaming peak I could spend a good few hours at a time playing (X-Wing and its offspring were particularly time-eating), as an adult, life has always had other attractions and games tended to be a way to fill in time when I had an evening to myself - a 'boy's night in', as it were. This was, in part, because the chances of there being anything captivating on the TV that night were pretty small. But these days, if I've an evening to myself, I can just delve into Netflix and consume great dollops of the binge-watch du jour. (For me, this happens to be Battlestar Galactica at the moment.)

Of course all those teens (literal teens or twenty-something plusses who are still channelling their inner teen) will still be obsessively playing. There is still a massive market for the big games, especially among those who appreciate the online multiplayer benefits. But for the less obsessive gamer, I really think that the ready availability of quality binge watches makes for strong competition. My suspicion is that it will make for more use of 'dip in, dip out' games like the excellent iPad game The Room (of which a review follows soon). But we shall see.

Wednesday, 10 September 2014

Boldly going

It's a nice coincidence that I recently wrote about Battlestar Galactica, because the whole business of being out there in space is the topic of my latest book, which I'm pleased to say is now available. In Final Frontier we discover the massive challenges that face explorers, both human and robotic, uncover the current and future technologies that could take us out into the galaxy, and take a voyage of discovery where no one has gone before... but one day someone will. In 2003, General Wesley Clark set the US nation a challenge to produce the technology that would enable new pioneers to explore the galaxy.

That challenge is tough - the greatest humanity has ever faced. But taking on the final frontier does not have to be a fantasy. In a time of recession, escapism is always popular - and what greater escape from the everyday can there be than the chance of leaving Earth's bounds and exploring the universe? With a rich popular culture heritage in science fiction movies, books and TV shows, this is a subject that I just couldn't resist and, like geeks everywhere, find fascinating.

One of the joys of writing a book like this is that you find out a lot more about a topic that has always intrigued you. It's not that I've always wanted to be an astronaut - I'm far too fond of home comforts and minimising personal risk for that - but as a real-life story you can get your heart behind, it's hard to resist. I'm old enough to have been allowed to stay up all night by my parents to watch the Apollo 11 moon landing - and it's one of the most powerful memories of my childhood. And at the same time, I've boldly gone in fiction with Dr Who, Star Trek, Star Wars and so many more, particularly in book form.

So the emotional connection was there already. But two things have really stood out for me in pulling together Final Frontier. One is the need to go beyond the traditional nationalism at the heart of early space exploration. Future manned exploration of space would benefit hugely from being an international venture and, as recent developments have demonstrated, from a mix of private and public funding.

The second is to detach space travel from science. I have always heartily agreed with those who say that having manned space vessels is a terrible way to do science. It is vastly more expensive than using unmanned probes and unnecessarily puts human life at risk. It would help enormously if we totally separated the two reasons for venturing into space. Science needs great unmanned probes. But humanity needs people out there. I'd suggest that rather than fighting over a relatively small science budget, manned space travel should be lumped in with the defence budget. That would transfer cash from the dark side to the positive side of the human spirit - and arguably it has the same goal of expanding the cause of human survival, though in a much less nationalistic fashion.

We shouldn't send people out into space to do science (although they are welcome to do some while there). Instead, such an adventure (in the literal sense) should be to fulfil the human spirit that makes us more than just animals that live to breed and die. And that's kind of important.

You can find out more about Final Frontier at its web page, or buy a copy at Amazon.co.uk and Amazon.com.

Here's what the inestimable John Gribbin said about it:
An enjoyable romp across space and time, from Cyrano de Bergerac to future space-warp driven interstellar craft, via Verne, Wells and the possibility of colonising the solar system. 

Tuesday, 9 September 2014

On the road to Doom - review

I was delighted when someone pointed out the book Masters of Doom. It's not a new title, dating back to 2003, but it covers a period that anyone of a certain age with an interest in computer games will remember vividly.

Describing the rise and fall of the two creators of id Software, John Carmack and John Romero, it is a classic Silicon Valley business/bio - with some particularly extreme characters. I knew nothing of these people at the time, but reading the book brought on waves of nostalgia, as they were responsible for three of the key milestones in gaming history. I was still programming PCs when Wolfenstein 3D came out and I remember being amazed by the effects and responsiveness they coaxed out of the early PC's terrible graphics. By the time Doom and Quake came along, I was reviewing games for a living. Though my personal tastes ran more to the X-Wing series and Seventh Guest, I was stunned by the capabilities of the id games. They were the only first person shooters I ever found interesting - and each moved the field on immensely. All the first person shooters that are popular today, from Call of Duty and Halo to Destiny, owe them so much.

So from a techie viewpoint, this was fascinating, though the author does tend to rather brush over the technical side to keep the story flowing. And from the personal side, there were plenty of fireworks too. While the book slightly overplays the traditional US business biography style of presenting disasters and triumphs to fit neatly at chapter boundaries, there is no doubt that this was a real roller-coaster of an existence, in a way that all those reality TV stars who overuse the term couldn't possibly understand.

Although there are plenty of other characters, the two Johns are at the book's heart - Carmack the technology wizard behind the engines that powered these worlds, and Romero the designer and flamboyant gamer. The pair inevitably clash on direction and when they split it's interesting that it's the John who doesn't go for the classic US software developer heaven of turning the offices into a playground who succeeds.

All in all, truly wonderful for anyone who was into games in that period (and should be of interest to those who have followed them since). It's a shame it stops in 2003, as things have moved on a lot since its 'how the main characters are now' epilogue - but a quick visit to Wikipedia can bring you up to speed.

You can buy Masters of Doom at Amazon.co.uk and Amazon.com.

Monday, 8 September 2014

Is £10 an hour a sensible target for the minimum wage?

I was interested to read that the Green Party of England and Wales is proposing that we should immediately raise the minimum wage from the current £6.50 to a living wage (currently £7.65 an hour outside London), and that by 2020 the minimum wage should be £10 an hour.

I am generally in favour of allowing markets to set prices, and at first glance, if someone is prepared to do a job for a certain amount, then it might seem unreasonable to pay them more. But there are good reasons to have a minimum wage at what is, frankly, the very reasonable level suggested as a living wage.

Apart from anything else, if someone is paid less than a living wage, then they end up being supported by the benefit system - so that just means more taxes for the rest of us. If someone is doing a job then they ought to be able to live on the proceeds of a reasonable working week. Anything less is close to concealed slavery. Let's have that living wage now, please, government - and why doesn't it also apply to 18 to 20-year-olds, who get a pathetic £5.13 minimum wage at the moment?

However, despite my whole-hearted support for the living wage, I can't support the Green Party policy of a £10 target, as it is entirely arbitrary. There are two suspicious things about it. One is the round number nature of £10. This shouts out that it is a number picked out of the air that sounds impressive because it has two digits. The other is having a target for 2020. Unless the Green Party has a time machine they haven't told us about, that's just too far ahead to make accurate forecasts for. We don't know what inflation will be. We don't know what the economy will be like - and to make a commitment to a specific number seems crazy.

What would be much better, but less attention-grabbing than that £10 number, would be a target of maintaining the minimum wage at living wage level, year on year. That would be far more practical and meaningful. And it could mean a minimum wage of more than £10 in 2020 - we can't know; we just know it would be the right amount, where £10 certainly won't be. So how about it, Green Party? Can you move away from PR-based politics (sadly, the driving force of most green activity) and do something that really would be a good thing? We shall see.
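As a back-of-envelope illustration (my own numbers, just to show the scale of the uncertainty), here's what the 2014 out-of-London living wage would become by 2020 if it were simply indexed to a few assumed inflation rates:

```python
# What the 2014 living wage (£7.65/hour outside London) would be in
# 2020 under various assumed annual inflation rates. Illustrative
# only - a real living wage tracks living costs, not a flat rate.
living_wage_2014 = 7.65
years = 2020 - 2014

for rate in (0.01, 0.02, 0.03, 0.04):
    projected = living_wage_2014 * (1 + rate) ** years
    print(f"{rate:.0%} a year -> £{projected:.2f} per hour in 2020")
```

Even at 4% a year that comes out under £9.70 - which is rather the point: the right 2020 figure depends on numbers nobody knows yet.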

"Green Party of England and Wales logo" by The logo is from the http://www.greenparty.org.uk/ website.. Licensed under Fair use of copyrighted material in the context of Green Party of England and Wales via Wikipedia

Friday, 5 September 2014

The Toffler Scorecard Part 1 - Disposability

My rather battered version of Future Shock
Way back in 1970, when the world was very different, 'futurologist' (I hate that word) Alvin Toffler produced an immensely popular book called Future Shock that predicted what he believed life would be like in the twenty-first century. In a series of posts I'm looking back at some of Toffler's predictions to see how they've turned out and what that can tell us about then and now.

Reflecting the change, particularly in America, that had brought in more and more of a throw-away society, Toffler envisaged a future where this approach was taken to the extreme. Apparently, in 1970 paper dresses were all the rage (I can't say I remember this), and wear-once-then-throw-away clothes were something Toffler assumed would become the norm. I don't know if he lived in Florida or California, but realistically paper clothes were always a non-starter as anything more than a gimmick - certainly in Manchester or Scotland, say. But it is certainly true that the current young generation does think of clothes as more short-term purchases than a generation that bought clothes and kept them until they wore out. (My raincoat is over 30 years old and still going strong.)

However, what Toffler missed is the way that an awareness of green issues would become a natural background to life. While the younger generation don't hang onto clothes the way some older folk do, they also don't just throw them away. Instead they resort to recycling, whether via charity shops or services like eBay and Depop. And the same goes for much of our everyday things. Yes, we do change some products a lot more than we used to, but equally we tend to recycle them, ideally for money. It would have seemed crazy in 1970 to change your phone every two years, say (it would, of course, have been a landline phone), but when we do make the change, we trade in the old one, or sell it.

On balance, then, this is a 50:50 prediction. Neither a hit nor a miss. We certainly do treat far more things as temporary than we used to. With technology, particularly, we feel driven to upgrade. I do have one bit of ICT kit that is over 10 years old (an HP LaserJet printer that simply does the job), but the average age of my ICT is probably about 2 years. Strangely, though, despite this, we are a society less inclined to throw away than Toffler's. We reuse, repurpose, recycle. Where he described a tendency to increasingly knock down old buildings, we (at least in the UK) now tend to treasure and reuse them more than was the case in the 70s. It's ethical disposability. And that's rather interesting.

If you want to discover Toffler's predictions for yourself, you can buy Future Shock at Amazon.co.uk and Amazon.com.

Thursday, 4 September 2014

Scrubs up well


Your great grandma might not have known about phenol - but she certainly would have been familiar with carbolic, the harsh soap that included carbolic acid, now properly known as phenol. This simple aromatic compound might have dropped out of our morning cleansing routine (thankfully), but it has had more recent roles, from the production of aspirin to Agent Orange.

Discover more in my latest Royal Society of Chemistry podcast about phenol. Take a listen by clicking play on the bar at the top of the page - or if that doesn't work for you, pop over to its page on the RSC site.

Wednesday, 3 September 2014

Demodex-traordinary

An eyelash mite
I had the pleasure of appearing on Radio Scotland yesterday. No, not to discuss the Independence vote, but the matter of eyelash mites.

When I wrote The Universe Inside You, which uses the human body as a starting point for exploring all kinds of science from the nature of light to evolution, I just had to include (with a title like that) the veritable zoo of creatures that call our bodies home. Of course I explored the bacteria, which, with ten times as many bacterial cells in the body as human ones, are pretty impressive. But I also included Demodex, the eyelash mite.

These tiny little arachnids - typically 1/4 to 1/3 of a millimetre in length - feed on sloughed skin and sebaceous oil, in effect clean-up scavengers. They are transparent and pretty well impossible to see, mostly living at the base of eyelashes and eyebrow hair. What I said in UiY is that it was thought that around half of adults have them, but the reason they had become news, featured in national newspapers and on Radio Scotland, was that a study had shown that all adults had them. (Or at least, that's how it was interpreted. More on this in a moment.)

There was some interesting psychology as to why this change made them news. I suspect it is because they went from feeling like head lice - something other people have (until there's an outbreak at your children's school) - to something you have.

In fact the study is both more interesting and more limited than the reporting suggested. The PLOS One paper does not actually say that mites were discovered on 100% of adults - in fact they were only spotted on 14% of adults, as they are hard to find. But what the researchers did was to take a sample of sebum and search for Demodex DNA. They discovered it on 100% of adults over 18 and 70% of eighteen-year-olds. Admittedly this isn't a perfect determinant, but as the paper puts it 'Though it is possible Demodex 16S rDNA could be found on the face of an individual without mites, the likelihood that we detect such transferred DNA in our limited sampling area would be low.'

So an interesting development. One of the conclusions was 'The diversity of D. brevis 18S rDNA found on individual humans suggests that not only do all adult humans have Demodex mites but that colonization is likely to occur more than once.' This is the interpretation that I'm a little worried about. The study is based on DNA testing on 19 adults, all from Raleigh NC. I'm not convinced that this provides sufficient data to make the sweeping statement that all adult humans have Demodex mites - which then led to the news flurry. It may well be true, but this seems a very small sample to build that conclusion on - though it's clear that the mites are significantly more prevalent than previously expected.
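To put a rough number on that worry (my back-of-envelope calculation, not the paper's, and assuming the 19 adults were sampled independently): even a clean sweep of 19 positives only pins the true prevalence down to 'at least about 85%' at 95% confidence.

```python
# If all n sampled adults test positive, the smallest true prevalence
# p consistent with that result at 95% confidence satisfies p**n = 0.05,
# since p**n is the chance of getting n positives out of n.
n = 19
lower_bound = 0.05 ** (1 / n)
print(f"{n}/{n} positives -> 95% lower bound on prevalence: {lower_bound:.1%}")
# prints roughly 85.4% - consistent with 'everyone', but not proof of it
```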

A bit of fun, though. Got itchy eyebrows? I thought so.

Image "Haarbalgmilbe". Licensed under Creative Commons Attribution-Share Alike 3.0 via Wikimedia Commons