Friday, 29 January 2016

Another competitor for 'overblown science headline of the year'

Thanks to Ian Bald for pointing out the impressive headline 'The Death of Relativity Lurks in a Black Hole's Shadow' in Wired.

What's so impressive here is just how much it's possible to get wrong in a single headline. Black holes, of course, don't have 'shadows.' I think what they mean is its event horizon, though the article is so fuzzy it's difficult to be sure.

However, the real shocker is the apparent claim that general relativity is dead. Here's the thing. No it isn't. What the article actually says is that if a black hole's 'shadow' (event horizon?) isn't perfectly spherical or isn't just the right size for its mass, then general relativity's predictions would be wrong. Well, duh. This would also be true if it were pink or singing the Stars and Stripes. Note, however, that no one has discovered that its shape or size is different from prediction. (Or that it's pink.) They're just saying that we might be close to being able to make a measurement to see if it lives up to prediction. That's all.

Even if there is a disparity, as the article says 'If Einstein is wrong, general relativity won’t go away—it’s too good at what it does. It just won’t be the whole story anymore.' Right. And that fits with the headline how?

I appreciate editors want headlines that grab people's attention, but if they are going to deviate so far from the facts in order to do so, why not go the whole hog? I look forward to the headline on an article about a new extrasolar planet, where the story is that it's about the right size for life, to read:


Why not? It makes as much sense.

Thursday, 28 January 2016

Hover cars, tri-vees and v-phones

I've just re-read a science fiction book I quite liked as a teenager. Called Prisoner of Fire, it's by the now largely forgotten British science fiction author Edmund Cooper. Back in the 70s he was quite popular and wrote a whole string of SF novels, but, to be honest, I can see why he's largely forgotten. The writing style seems from a different era, mannered and dated.

However, Cooper's ideas are still interesting. The topic of the book is the common enough SF trope of paranormals - it features a number of children with mental abilities, able to read minds, to block other telepaths, or to kill remotely. The reason it's interesting is that Cooper examines what such a capability would mean for governments, both in terms of protecting themselves and espionage, and how it could lead to an 'end justifies the means' attitude to the young telepaths as weapons.

That's not why I bring the book up now, though, so forgive the long introduction. Prisoner of Fire, written in the 1970s and set in the 1990s, wildly over-predicts technological advances. This was a common problem in the 60s and 70s. Things had moved on so much since the Second World War that it was assumed changes would be even more dramatic in the next 20-30 years. (Think how much 2001: A Space Odyssey from the late 60s overshoots what 2001 was like.) So Cooper merrily deployed hover cars, tri-vees and v-phones.

As I point out in Ten Billion Tomorrows, the problem with hover cars and tri-vees (in case you hadn't guessed, TVs that project a 3D image into empty space, a bit like the Princess Leia bit in the first Star Wars) is that the authors weren't really thinking through the technological advances required. We just don't know how to practically make a car that floats or to project a hologram onto thin air. However, the v-phone is a more interesting one because we effectively have the technology but rarely use it.

I assume v-phones were video phones (it's never explicitly explained). These days, pretty well any smartphone or internet-connected computer is, in effect, a video phone. Using Skype or FaceTime, we can make video calls. And occasionally, for example, when family is on the other side of the world, they are very effective. But for 99% of our calling we don't use them. Because they feel strangely unnatural. The assumption has always been (video phones have been talked about since Edison and Tesla's day) that we would get the same benefits from a video call as a face-to-face conversation. And this can work with a sophisticated video conferencing setup. But for a chat on the phone it's a disaster.

There seem to be a number of reasons for this. One is that a smartphone is too up close and personal. It doesn't seem quite as bad on a computer where you can sit well back from the camera, because the view will take in a fair amount of the room. But on a smartphone video call, the other person's face pretty much fills the screen. To have an actual conversation with a person whose face fills your vision to that extent you would need to be around 10cm away from their face - a position that we just don't have conversations in, even with intimates, let alone strangers.

Another difficulty is focus. Although good listeners spend a lot of time looking at the person they are talking to, they also look away a fair amount, if only for fractions of a second. A solid focus on someone's face is intimidating. But in a video call, the other person is pretty much constantly looking straight at you.

Finally (I'm sure there are more issues, but these are the three that occurred to me), we don't always want the exposure that comes with being seen. A telephone is useful for many conversations precisely because we don't want to give too much away, whether it's because we're answering the phone in our pyjamas, because the room is a mess, or because it makes it easier to lie.

So poor old Edmund failed on all three, but the v-phones were arguably the most interesting fail because it's technology we can use, yet mostly choose not to.

Wednesday, 27 January 2016

Fascinating mangling of falsification

I have just read an article (don't ask me why - this is the wonder of Facebook) which tried to defend Mormonism from the worrying details of its origins. The piece included this:
Many intellectuals argue that “negative evidence” is supreme. To understand what they mean by this, consider the hypothesis that “all swans are white.” According to these intellectuals, it doesn’t matter how many white swans you find, you never really prove that “all” swans are white. However, as soon as you find one black swan, you have disproved the theory that “all swans are white.” They conclude that positive evidence doesn’t ever really prove anything, but negative evidence can. And it’s easy to see why they think that way. 
This is the approach that ex-Mormons have taken to their faith. In the face of unsettling information, they disregard all of the positive evidence because they think that a few points of negative evidence is sufficient to end the discussion. And given how logical the above reasoning seems to be, it is no wonder why. But they are still wrong. 
To understand why, consider another example. After first discovering the planet Uranus, astronomers attempted to predict its orbit by using Sir Isaac Newton’s laws of physics. They could observe the orbit of Uranus with their own eyes, but when they used Newton’s mathematical models to predict that orbit, they failed time and again. It made no sense. Newton’s laws had been right about so many things, but astronomers had found a case in which Newton’s laws did not work. So, was Newton wrong? Were his laws not quite as infallible as they had seemed? In light of this “negative evidence,” it would have been easy to conclude just that. 
However, years later, astronomers discovered another planet, Neptune. And as it turns out, when astronomers accounted for the mass of this newly discovered planet, Newton’s laws predicted the orbit of Uranus perfectly. So, as it turned out, it wasn’t that Newton’s laws of physics didn’t work. It was that they didn’t seem to work. And that’s because the astronomers simply didn’t have all the relevant information and context.
There's so much to get your teeth into here, but we'll pick out two key points. First there's the ad hominem attack. 'Many intellectuals argue... According to these intellectuals... and it's easy to see why they think that way.' Implication: intellectuals don't know what they are talking about. Don't listen to them. Note particularly 'According to these intellectuals, it doesn't matter how many white swans you find.' Forget 'According to intellectuals.' It's just true. It doesn't matter how many white swans you find: you still haven't proved that all swans are white. Are they arguing otherwise?

However, no one suggests that falsification is usefully applicable to everything. Which is why it's odd that they then give an example where it isn't properly used. All scientific evidence is provisional. The black swan disproves the 'all swans are white' hypothesis, and that is the best data at the time and the only sensible viewpoint. But should it later prove that the 'black swan' was an unusual variant of goose and not a swan at all, the hypothesis could recover. However, the Newton example used in the extract from the article above fails on a number of counts.

First, the orbit of Uranus didn't show that 'Newton's laws of physics don't work'; it showed that they didn't apply in that circumstance. There are plenty of other examples (Mercury's orbit, for instance) where they will never apply. As it happened, in the case of Uranus, it was because the astronomers didn't take into account the full situation. But there was nothing wrong with the assertion that Newton's law of gravitation didn't correctly describe the orbit of Uranus in the known solar system of the time. And until other factors were brought in, one possibility was that this was a case (like the orbit of Mercury) where Newton's law wasn't appropriate.

This argument is then used to suggest that yes, there are worrying aspects of the early history of Mormonism that cast its basis into doubt. Until you can show why that negative evidence is misleading - and that isn't happening - you can have all the positive evidence you like (which is what, exactly?) and the negative evidence still stands. Even in the Uranus example, the results showed there was something wrong with the astronomers' assumptions. Falsification remains a powerful tool, and a valuable one in cases like this.

Tuesday, 26 January 2016

The Sex Life of a Comedian - review

Written by stand-up comedian Dave Thompson, The Sex Life of a Comedian delivers some home truths about life on the road. It's hard not to believe that the main character Doug Tucker's last minute arrivals at venues, or the way he spends hours crossing the country for underpaid gigs in unpleasant dives, have some inspiration in reality. As, I suspect, does the way that the various comedians who come into Tucker's life become huge successes or fail in ways that are little connected to their talent.

However, this is a novel, not a memoir, for which we can assume Thompson is thankful, because Doug Tucker's life is no bed of roses. Admittedly Doug seems to enjoy (with occasional regret and shame) the huge amount of (explicit) sex and drug taking he encounters, but there is a murky unpleasantness to his existence that shades into outright gangland violence. And Doug's luck rarely stays positive for long, while the car crash events that pull his life apart come with painful regularity.

To begin with, as we hear of Doug's extremely colourful sex life, it's hard not to think of this as a modern version of those 1970s 'Confessions of a Window Cleaner' type books. I never read one, but my suspicion is that they would have had the same, rather simplistic narrating style, with a series of sexual escapades (though I doubt their content was as explicitly portrayed as Doug's). But as the Tucker story develops, I was reminded much more of the books of the once extremely famous Leslie Thomas.

In part it was the period feel - the first part of the book is set in the nineties, but it feels more like the seventies - but mostly it was the similarity to Leslie Thomas's classic story arc of a likeable but weak-willed central character who is manipulated sexually and practically by unpleasant people to produce a trajectory that begins with a degree of success but that ends in a disastrous spiral of destruction. Like the best of Thomas's central characters, Doug Tucker has an element of an innocent, introduced to a dark world that seems first enticing and then destructive. And he has a hint of mystery about him with his rabid dislike of children and frequent reference to his mummy's knife.

I had been warned about the sex scenes, which are definitely not suitable for reading on the train if the person next to you glances at your book (as I experienced), but I found the rampant drug taking more disturbing, while the ending seemed rushed and not entirely satisfactory. I certainly wouldn't recommend it if you are easily shocked or looking for a jolly romp rather than a gut-wrenching story. However, by the time I was a quarter of the way in I had to discover Doug Tucker's fate, and it's a book that I won't forget for quite a while.

The Sex Life of a Comedian is available as an ebook on and, or as a paperback from

Monday, 25 January 2016

I am not a number

I've just read The End of Average for review, and I couldn't help letting out a little whoop of joy when it totally trashed psychometric testing.

I am talking about mechanisms like the Myers Briggs type profile, along with a whole host of rivals, all used by businesses in recruiting and team building to analyse a personality and assess how an individual will work with others. 

The problems I have always had with the approach are several-fold. It's based primarily on Jungian theory which has little scientific basis. Your personality type is self-determined, so, while it's not surprising it often feels right, that doesn't make it accurate. And I was always doubtful about the cultural norms of the mostly US-devised tests being applied worldwide. Infamously there used to be a question about whether you preferred a gun or something constructive (I can't remember what) - which clearly would have different resonance in the US and Europe. 

Now, though, there are much stronger grounds for concern. The End of Average points out that personality profiles don't reflect the behaviour of individuals, but rather they predict the average behaviour of a group of people, which isn't the same thing. If you are an ENTP like me, it doesn't say how you will behave, but how, on average, people with the same profile will behave. As the book says 'In fact, correlations between personality traits and behaviours that should be related - such as aggression and getting into fights, or extroversion and going to parties - are rarely stronger than 0.3.' The same applies to academic achievement and professional accomplishments. This means your personality traits, as identified by the test, should account for around 9 per cent of the variation in your actual behaviour, leaving over 90 per cent unexplained.
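That 9 per cent figure comes from squaring the correlation coefficient: r squared (the coefficient of determination) gives the share of variance in behaviour that a trait score accounts for. A minimal sketch of the arithmetic, using the 0.3 correlation quoted from the book:

```python
# Share of variance in behaviour accounted for by a personality trait,
# given the correlation of 0.3 quoted in The End of Average.
r = 0.3
explained = r ** 2          # coefficient of determination, r-squared
unexplained = 1 - explained

print(f"variance explained:   {explained:.0%}")    # 9%
print(f"variance unexplained: {unexplained:.0%}")  # 91%
```

In other words, a correlation of 0.3 sounds respectable until you square it and see how little of an individual's behaviour it actually pins down.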

Underlying this is the relatively recent (if entirely obvious) discovery that we don't have one personality/behaviour but it varies depending on the situation. A teenager, for instance, behaves very differently with a group of peers and with his or her grandmother. That's obvious. So why do we expect a single score on a handful of dimensions to reflect how we will behave in all circumstances? It's bizarre.

I don't expect companies to stop using these tests any time soon. Come on - some still use 'graphology', expecting handwriting to give insights into personality. But employers and academics should at least be thinking twice about what they are testing and why.

Wednesday, 20 January 2016

Beam up a bug

When I wrote The God Effect about entanglement around 10 years ago it seemed that many of the remarkable possibilities that emerged from this strangest effect of quantum physics were close to practical applications. As it happens, there's nothing in the book that's gone out of date - but we do keep getting incremental announcements in the field. Most recently we had the more over-the-top media sites telling us 'Scientists teleport bacteria' while the more careful came up with the confusing sounding 'Physicists propose the first scheme to teleport the memory of an organism.'

I'd need a whole book to go into quantum entanglement (:-)), but the summary version is that quantum physics predicts that, for instance, you can get a pair of quantum particles into an entangled state where making a measurement of a property of one (its spin, for instance) will instantly influence the other particle, however far apart they are. And this has been experimentally verified many times since the 1980s. Entanglement can't be used for what seems the obvious application - sending an instantaneous message - as the 'information' transmitted is random. However it can act as a linking mechanism to do things that would otherwise be impossible.

In the case of the impressive-sounding teleportation, the apparent impossibility that entanglement can help with is the so-called 'no cloning theorem.' It was proved reasonably early that it is impossible to make a duplicate of a quantum particle while preserving the original. However, with a mix of entanglement and conventional information transfer, it is possible to transfer a property of a quantum particle to a similar particle elsewhere, in effect making a remote copy of at least one aspect of the particle. In the process, the original particle's properties are altered (so you don't end up with an identical pair) and you never find out what the property's value was.
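The protocol described above can be seen in miniature for a single qubit. Here is an illustrative statevector simulation (my own sketch, nothing to do with the bacteria proposal): Alice holds a message qubit and half a Bell pair, Bob holds the other half; whatever Alice's two measurement outcomes are, Bob's correction leaves his qubit in the original message state, while the message qubit itself no longer carries it.

```python
import numpy as np

# Illustrative statevector sketch of single-qubit teleportation.
# Qubit 0: the message state; qubits 1 and 2: a shared Bell pair.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard
X = np.array([[0, 1], [1, 0]])                 # bit flip
Z = np.array([[1, 0], [0, -1]])                # phase flip
I2 = np.eye(2)
P0 = np.diag([1.0, 0.0])                       # projector onto |0>
P1 = np.diag([0.0, 1.0])                       # projector onto |1>

def kron3(a, b, c):
    return np.kron(np.kron(a, b), c)

CNOT01 = kron3(P0, I2, I2) + kron3(P1, X, I2)  # control q0, target q1

psi = np.array([0.6, 0.8])                     # message: 0.6|0> + 0.8|1>
bell = np.array([1, 0, 0, 1]) / np.sqrt(2)     # (|00> + |11>)/sqrt(2)
state = np.kron(psi, bell)                     # 8-amplitude register

state = CNOT01 @ state                         # Alice entangles the message
state = kron3(H, I2, I2) @ state               # ...and applies a Hadamard

# For every possible measurement result (m0, m1) on qubits 0 and 1,
# Bob's correction Z^m0 X^m1 recovers the original message state.
for m0 in (0, 1):
    for m1 in (0, 1):
        proj = kron3(P1 if m0 else P0, P1 if m1 else P0, I2)
        branch = proj @ state
        branch = branch / np.linalg.norm(branch)
        bob = branch[4 * m0 + 2 * m1:4 * m0 + 2 * m1 + 2]
        fixed = np.linalg.matrix_power(Z, m0) @ np.linalg.matrix_power(X, m1) @ bob
        assert np.allclose(fixed, psi)
```

Note how the classical measurement results (the 'conventional information transfer' mentioned above) are essential: without knowing m0 and m1, Bob cannot pick the right correction, which is why entanglement alone can't send a message.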

Despite these provisos, if you could do this for all the significant properties of a particle - or a collection of particles - it would be as if the original particle had been teleported from its original position to the location of the modified particle. In effect, to use the inevitable simile, it would be like putting the particle through a Star Trek transporter. This mechanism in its simplest form is already valuable for applications like quantum computing, but inevitably there was interest in doing it for real - all the significant properties - and with something bigger than a single particle.

It ought to be stressed this is never going to produce a Star Trek transporter, whether as Amazon's latest way to deliver goods or to avoid the rigours of air travel. This is because of the sheer number of particles in a sizeable object, which would take thousands of years to scan and reassemble. If we're talking a product, you don't need an atomic level duplicate - you can just send the instructions for a factory to make it. If we're talking a person, even if you got over the fact that the original 'you' would be disintegrated and only a perfect copy was produced, that timescale is simply impractical.
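To put a rough number on 'thousands of years': a human body contains something like 7 × 10²⁷ atoms. The scan rate below is an invented figure purely for illustration (10¹⁷ atoms per second, far beyond anything demonstrated), but even at that generous pace the arithmetic is sobering:

```python
# Back-of-envelope transporter arithmetic. Both numbers are rough:
# the atom count is a standard estimate, the scan rate is invented
# purely for illustration.
atoms_in_human = 7e27        # approximate atoms in a human body
scan_rate = 1e17             # atoms per second (hypothetical)

seconds = atoms_in_human / scan_rate
years = seconds / (365.25 * 24 * 3600)
print(f"about {years:,.0f} years")   # a couple of thousand years
```

Slow the scan rate to anything remotely plausible and the figure balloons by orders of magnitude, which is why the person-sized transporter stays firmly in fiction.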

Over the years we've seen various properties of particles and simple molecules teleported. And it would be fascinating if it were possible to teleport a virus or bacterium. However, it should be stressed that this is not what has happened here. Firstly, nothing has actually happened. It's a proposed mechanism, not an experiment that has been carried out. And secondly we have to be clear what's meant by that 'teleport the memory' headline. In more detail, Tongcang Li at Purdue University and Zhang-qi Yin at Tsinghua University have suggested a way to use electromechanical oscillators and superconducting circuits to teleport the internal quantum state and center-of-mass motion state of a microorganism.

What it essentially means is that they may be able to transfer as a package some of the states of molecules in the bacterium to another organism. As these states are a form of information, they are described as teleporting memories. There are a few provisos, however. To make the system work, the organisms would have to be cryogenically frozen. They would not be alive. And what isn't made clear is how the setup would deal with the reality that any two bacteria are not identical in their molecular makeup. But the theoretical experiment is interesting in the way it accesses internal properties of the organism for teleportation, rather than expecting it to be stripped down, particle by particle.

You can, in principle, see more in the original paper, but unfortunately it is behind a paywall.

Tuesday, 19 January 2016

Whatever happened to Second Life?

I've just read Snow Crash, which features a virtual environment, the Metaverse, so like a super-version of Second Life that I'm almost certain the designers of SL must have been inspired by the book. And reading about it reminded me just how seriously people were taking this online virtual reality environment a few years ago - yet now Second Life appears to have dropped off the radar.

When I first started blogging in the Nature Network, that august publication was arranging seminars in Second Life, companies were holding meetings in it, and people were making fortunes selling Second Life wares. I thought the whole concept of meeting up in a tacky virtual environment was crazy - surely video was far better - yet the media and many big companies were convinced that the hip audience would flock to this kind of thing. But now it's all gone rather quiet.

I've never bothered with Second Life myself, and a straw poll on Facebook got me no response from anyone who uses it seriously. But from what I've read by those who do still frequent it, the main section of the Second Life world has become like a ghost town since the makers set up an adults-only continent. Apparently that continent is still very lively, but its dubious attractions are obviously not why all those big names of science, technology and business were setting up SL presences - and their idea has clearly fallen apart.

I can't say I'm sad - it always seemed more a collective delusion than a sensible way forward. I'd love to hear from anyone who was an SL fan or involved in corporate use of it, like that at Nature. Do let us know the whys and wherefores - and whether you still think it was a good idea.

Monday, 18 January 2016

Snow Crash - Review

I've enjoyed several of Neal Stephenson's books, but find many of them far too long, suffering from bestselling author bloatitis, so I thought it would be interesting to get hold of a copy of his classic, Snow Crash - and I'm very glad I did.

Although not a pastiche, it depends heavily on four classics of science fiction. The obvious one is William Gibson's Neuromancer, because of the net-based cyberpunk aspects that are central to Snow Crash. (The snow crash of the title is nothing to do with skiing and everything to do with computers crashing.) However, the pace and glitteriness owes a huge amount to Alfred Bester's Tiger! Tiger! (that's the UK title - it was originally The Stars My Destination), while the corporate-run world has a distinct feel of Pohl and Kornbluth's Gladiator-at-Law, though interestingly here it's a world without any laws whatsoever. And finally there's a touch of Samuel Delany's Babel-17, where a language is capable of doing more than simply describe things. In Delany's book, the language is so specific that if you name something, you can construct it given only that name - here, language is capable of re-programming the human brain.

These influences, though, are only for those who are interested. If you like the kind of science fiction that hits you between the eyes and flings you into a high-octane cyber-world, particularly if you have an IT background, this is a masterpiece. Once you get over the odd name of the hero/protagonist (he's called Hiro Protagonist. Really) it is a joy to read. And despite being over two decades old, the technology really doesn't grate. Okay, Stephenson set it too early for the level of virtual reality capability, and there are too many references to video tapes, but otherwise it could have been written yesterday. What's particularly remarkable is that it is all about the internet (if not named as such) at a time when the internet wasn't widely known. This was written in 1992, yet when Microsoft launched Windows 95, it wasn't considered necessary to give any thought to the internet. That's how quickly things have changed.

As you might expect from Stephenson, there are some dramatic set-piece fights and rather a lot of violence, virtual and actual, but it also features erudite and quite lengthy library exposition of the precursor myths to many modern religions and some mind-boggling (if far-fetched) ideas about language, the nature of the Babel event and of speaking in tongues. There's also a strong female character, though today's readers might raise an eyebrow about a relationship between a 15-year-old girl and a thirty-something mass murderer. Oh, and I love the rat things.

If you find some of Stephenson's more recent books overblown, this is the one to go back to. Nicely done indeed.

Snow Crash is available from and

Thursday, 14 January 2016

Sometimes doing this job gives you a warm glow

I'm the first to admit that, if you can make a living doing it*, being a writer is not a bad gig. You don't have to set the alarm in the morning. It's part of your job to read interesting stuff and mooch about being creative. Admittedly the writing part is a bit of a faff, but, hey, everyone has downsides to their job. But there is one aspect of it that can be particularly pleasant - when someone says one of your books has inspired them.

According to the incomparable Cumbria Crack (I never get my news anywhere else**) Keswick student James Firth, who has won a science bursary, wants to study astrophysics and has a long term aim of following Tim Peake into space. And, it seems, 'James has a fascination in astrophysics having poured through books by Stephen Hawking and Brian Clegg. He now aims to study astrophysics at university.'

I wish James every success and am genuinely delighted to have played a small part in helping inspire his fascination with science.

* This is not the easy part. Note that, according to an ALCS/Queen Mary University study, the median professional author income from writing in the UK in 2013 was around £10,000.

** Okay, not entirely true.

Wednesday, 13 January 2016

Does celebrity make you real?

This morning I spotted an email from my local theatre that was too good not to share, as it appeared to be selling off a comedian.

As you do, when I tweeted about this I included the Twitter address (handle? ID? none of them work well) of the comedian in question, Ross Noble. I noticed that, like a fair number of famousish people (presumably the ones that aren't rich enough to buy out whoever already has the obvious handle - in this case @rossnoble), he has resorted to putting 'real' in front of his name, making him @realrossnoble.

There are plenty of others - David Mitchell (the comedian, not the novelist) is @RealDMitchell, for instance. I assume this has happened because someone else called Ross Noble, David Mitchell (still not the novelist) etc. has already snapped up the simple form, like my @brianclegg.

It's fine, obviously, to modify your handle to be both memorable and still clearly linked to your name - much better, certainly, than @rossnoble99 or @nobleross. But I do wonder if stars of stage and screen are the best people to apply the word 'real' to themselves? It's not that I'm suggesting that they are fictional, but my suspicion is that they have a weaker grasp on reality than most of us (with the exception of politicians and royalty of course (I wonder who has @realqueen?)).

So perhaps celebrities should consider an alternative to the 'real' prefix. How about @unrealrossnoble, @famousrossnoble or just plain @rossnobleyouveheardof? That way, they wouldn't be making unnatural claims.

Tuesday, 12 January 2016

I don't really get music

Another group I enjoyed singing with in my youth -
Nonessence (clearly hip and fashion-conscious)
It may be a matter of having slightly different mental structures or something, but I struggle to understand the importance most people seem to place on music.

This might seem odd, as I've always loved performing with smallish groups of singers, most notably Selwyn College Chapel Choir, and I often play music when doing admin tasks (I can never write with music on). I've even enjoyed a few of the concerts I've attended, though if I'm honest, by about 2/3 of the way through a gig I've usually had enough and am getting a bit bored. But what I can't understand, as evidenced by the outpouring after the death of David Bowie, is the way so many people say that music changed their life or was central to it.

I'm not doing down Bowie - I think he was brilliant, creative and a one-off. But I don't understand how music can do anything to your life, or how a musician can be a hero or role model. I read, for instance, Suzanne Moore in the Guardian saying 'What he gave to me is forever mine because he formed me... He was my lodestar...' For me, music is just another type of entertainment, and if I have to give something my whole concentration as an audience member, as opposed to a performer, I'd rather it were a book or a film.

I ought to stress this isn't an attack on those who do put music at the centre of their lives, as so many seem to. But I honestly don't get it - I don't feel anything like they seem to. My loss, I'm sure, but just emphasising, I guess, that all brains are not wired the same.

Monday, 11 January 2016

Pilgrim's Progress - The Extra Mile review

There's something rather appealing about the concept of pilgrimage, whether or not you have a religious faith. It's a bit like a combination of the pleasures of walking and trainspotting (and I don't mean that as an insult) - there's both the exertion that is usually involved and the feeling of ticking off achievements, a medieval equivalent of a bucket list.

In The Extra Mile, Peter Stanford sets out to take in a number of centres of pilgrimage in the UK, without indulging in the actual act of being a pilgrim himself. Even though several times he is drawn into the experience, he undertakes this as an observer rather than a participant. These are mostly Christian sites, but he also takes in the pagan/Druidic possibilities of Stonehenge and Glastonbury.

I came on this book by accident - I think it was an Amazon recommendation when I was looking at something else - and I am pleased that I did. Stanford gives accounts of what he experiences, from the lively celebrations at Glastonbury to more contemplative island retreats like Iona and Lindisfarne. For me, the most interesting was probably Holywell in North Wales, the most complete of the medieval pilgrimage shrines and a fascinating piece of architecture whatever you think of the supposed properties of the water.

I'm not sure whether it helps or not that Stanford comes across as a cool, detached observer. I assumed from his slightly fussy writing style that he was retired, but he apparently had young children at the time of his trip (mid-2000s), so was probably younger than he sounds. He is often a little sceptical of what is going on, especially where the events clearly bear little connection to the origins of the site, but never mocks the participants and is not truly critical of anything. He also occasionally admits that the spirituality of his Catholic upbringing had crept in unbidden.

If, like me, you have an interest in medieval British culture, or you want to know more about British religious traditions, which certainly extend far beyond the typical modern establishments, it is genuinely interesting. If you're a Brother Cadfael fan, you'll even discover some of the real history behind the St Winifred story that appears in the series of books. I'm not sure that The Extra Mile works hugely well as a travel book, though, and it lacks the warmth and humour of writers like Bill Bryson. But it certainly highlights several locations that won't be as well known as Stonehenge and that might be worthy of a visit.

As a book, then, it could have been more engaging, but for an insight into both early British religious practices and how they have extended into the present and have been adapted to modern ways it is definitely worth a read.


Friday, 8 January 2016

Mobile printing for Apple-heads - review

I'm sure I'm not alone in finding that I'm increasingly doing stuff on mobile devices - in my case iPhone and iPad - that I used to do on a desktop computer. If, for instance, I just want to make a quick entry on a simple spreadsheet, it can often be easier to pull it up on a mobile device. And as long as the document is one where the screen size isn't too much of an issue, it generally works very well - unless I want to print something.

Of course, you can print from iPhones and iPads if your printers support Apple's AirPrint standard - but, inevitably, neither of mine do. (Printer aficionados might spot that my laser printer is around 15 years old.) However, I can now merrily print from the iPhone and iPad anywhere in the house, thanks to a nifty little package called Printopia.

This isn't an app - I use the standard 'send to printer' option from the mobile device. But Printopia is a cunning little add-on that sits on my Mac, which brings up my printers (and optionally various other destinations, like sending the print to Evernote or the computer) as if they were the real thing.

There's a free trial version, but it's one of the few add-ons I have willingly paid for, because it is solid and it just works. Once it's installed, it's controlled from the standard Mac settings panel (shown above) - but you rarely have to do this, because Printopia is transparent - it just lurks in the background and pretends to be AirPrint printers.

Of course, because the add-on runs on a Mac you have to have a Mac on the same network as your printers and it has to be powered up (though not active - Printopia works fine if the Mac's asleep).

I know 'people who use iPhones/iPads and have a Mac on the same network as their printers' is probably a relatively small cross-section of the world. But for those of us who fit in this niche, Printopia's well worth a try.

Thursday, 7 January 2016

You don't have to be a sadist, but...

I was recently doing an interview about science and science fiction to support my recent book Ten Billion Tomorrows and along the way I had to explain that it's not surprising that fiction (science or otherwise) usually has bad things happening in it, because without problems and challenges, there's not much of a story.

This is something that is well known in writing circles. Almost all fiction can be summarised as 'Obstacle arises or something terrible happens. Protagonist tries to deal with it.' He or she may or may not succeed, but without those problems, there really isn't a story. 'People have a nice time' simply doesn't work.

But what I haven't seen discussed before is what effect this constant need to make characters suffer has on the writer. As writers, we either want to, or have to, write. (For many writers it seems to be the latter. We don't seem to have much choice in the matter.) And to write fiction we have to put our characters in difficult positions and make them suffer. So does that make us virtual sadists, getting a kick out of the suffering of our invented personas?

To date, I have published two novels. The first, a young adult SF novel, Xenostorm Rising, has a protagonist whose parents go into hiding, leaving him to fend for himself. YA authors frequently have to find ways to rip teenagers from the safety of their families, to give them the kind of independence that the Famous Five achieved simply by having a scientist uncle who didn't give a damn. Without that, the YA protagonists can't really do their stuff - but it does put the author in the unenviable position of having to either kill off or remove parents (or make them truly horrible), always a delicate task.

In the second of my novels, the murder mystery A Lonely Height, my central character, Capel, gets off pretty lightly (don't worry, he will suffer significantly more angst in the second book in the series), but of course it's inherent in a murder mystery that a fair number of the characters will be having a horrible time, and it's rare that the detective gets away with Midsomer Murders style domestic bliss for long. (I really wish that simpering wife would do something truly evil.)

Realistically, whenever we venture into fiction, the chances are that one or more characters is going to suffer. So, while you don't have to be a sadist to be a fiction author, it arguably makes the process of writing that bit more enjoyable if you are.

Wednesday, 6 January 2016

Is objectification always a bad thing?

It's all too easy to take a widely accepted statement as a universal truth. But if we are to be thinking people, rather than knee-jerk puppets, we should always question anything presented as such a truth. So, for instance, we all know it's wrong to objectify people. But is it really? Is there any scientific basis for this assumption, or is it based on 'common sense' that objectification is inherently a bad thing?

The reason I bring it up is two recent discussions on Facebook. One was on the matter of Poldark, the BBC TV series. As usually seems to be the case with this programme, the main topic was the body of the actor playing the eponymous Mr P. This is surely just as much objectification as the old, thankfully departed Page 3 girls in the Sun, and my immediate reaction was to condemn it. But I really couldn't, because it was hard to see what harm was being done. If the women involved had been making the remarks directly to the actor, then it certainly would have been inappropriate, but is a touch of objectification really such a bad thing when it comes to images (still or moving)? After all, isn't that what we do whenever we produce a photograph or a painting with a person in it? Surely we shouldn't be banning all representation of people?

The second controversial discussion was one I started, having seen a couple of men haplessly attempting and failing to select a packet of nappies in a supermarket. I said 'It may be sexist, but still funny watching men in a supermarket, unable to decide which pack of nappies to buy.' I got the response below:

Leaving aside the fact that it was a joke, not a 'liberal trope' (what's liberal about it, anyway? Liberals think men make great parents), in effect when we engage a stereotype like this we are once again objectifying. And certainly some stereotypes can be misused. But they are also very useful shorthand communication tools, and personally I think, in this instance, totally justified.

Is all objectification acceptable? Probably not. But has the pendulum swung too far against it? I suspect so.

Tuesday, 5 January 2016

Documentary downer

It used to be ever so middle class to deny watching much television. About the only things it was acceptable to say that you viewed were the news, plays and documentaries. It was almost a mark of being educated that you liked documentaries. But, personally speaking, I have real problems with them. In general, documentaries bore me.

This can be a bit embarrassing when someone says 'Did you see Horizon on quantum physics?' or 'Did you see that latest David Attenborough?' Because I won't have done. I've never successfully watched a full episode of a David Attenborough documentary. Admittedly it's partly because wildlife films are rarely about science, but I think the main problem is that I'm too word-oriented. I enjoy good story-telling TV, but I find that factual programmes manage to take about two pages of text and stretch it into an hour's worth of documentary. I'd much rather read the two pages. (This is also why I can't be bothered with the YouTube videos people are always saying I should watch.)

So I wasn't the ideal person for someone near and dear to persuade to watch the Netflix-streamed documentary Cowspiracy. Apparently this anti-animal farming documentary is turning people into vegetarians in droves. I must admit, my immediate response to it was a strong urge to go and get a hamburger, but I'm perverse like that.

First the good news about it. It made a couple of decent points that would have made up a whole page in a book on the subject. It is ridiculous that swathes of the Amazon rainforest are being cut down to raise cattle. And American levels of beef consumption are ridiculous. And it's true that most green protest groups ignore the issue. The filmmaker showed lots of green movement representatives looking embarrassed when he brought the subject up. This was presented as being because they are funded by agribusiness (and that may be true with some US groups). But to me it came across more as embarrassment because they didn't want to admit their ignorance. Given green groups' knee-jerk response to nuclear power, it doesn't surprise me at all that they ignore farming as not having the right image for their campaigns.

But. A lot of the rest of the documentary had me shouting 'That's not true!' at the screen. (Sadly, given the subject, I didn't think at the time to mutter 'Bullshit.')

I think the biggest problem with Cowspiracy was that it was totally America-centric. I don't think they interviewed anyone who wasn't from America, and it was all done from the viewpoint of the pretty much unique American approach to agriculture, plus their vast meat consumption (nearly twice as much as Europeans) - but their statistics were then scaled up as if they represented how the rest of the world would become. Most hilarious in this respect were the comments on dairy, given that significant chunks of the world can't even consume it, as they don't have the appropriate gene.

So, for instance, we had pompous American 'experts' telling us that wherever you can raise animals, you could do better raising crops. I'd like to see what crops they would grow on the Welsh mountains instead of having sheep eat grass. There were also long (long) swathes of the film about water consumption, telling us how much water we use to raise a pound of beef (all in gallons, of course). But they didn't point out a) how much this varies (I don't think those sheep are given much water) and b) how almost all the water 'consumed' in raising animals is rapidly released back into the wild. If a cow contained all the water they said was used in raising it, it would be the size of a skyscraper.
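That skyscraper point stands up to a rough back-of-envelope check. The figures below are my own assumptions for illustration, not numbers from the film or this post - I've used the widely quoted figure of around 2,500 US gallons of water per pound of beef, roughly 500 lb of usable beef per animal, and about half a cubic metre for the volume of a live cow:

```python
# Back-of-envelope check: if a cow 'contained' all the water said to be
# used in raising it, how much bigger than a real cow would it be?
GALLONS_PER_LB = 2500      # assumed: widely quoted US gallons of water per lb of beef
BEEF_LB_PER_COW = 500      # assumed: usable beef per animal
LITRES_PER_GALLON = 3.785  # US gallon in litres
COW_VOLUME_M3 = 0.5        # assumed: rough volume of a live cow

water_litres = GALLONS_PER_LB * BEEF_LB_PER_COW * LITRES_PER_GALLON
water_m3 = water_litres / 1000  # 1 cubic metre = 1,000 litres

print(f"Water 'used': {water_m3:,.0f} cubic metres")
print(f"About {water_m3 / COW_VOLUME_M3:,.0f} times the cow's own volume")
```

On these assumptions the 'used' water comes to several thousand cubic metres, thousands of times the animal's actual volume - which is exactly why counting that water as consumed, rather than cycled through and returned, is so misleading.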

Of course America uses lots of water inappropriately for agriculture (strangely, the documentary didn't point out that the almond industry, used to make milk for the vegans the film praised, is one of the worst examples) - but it's misleading to suggest that somehow raising animals consumes vast amounts of water in some kind of permanent way. It leads to short term issues, particularly in regions of the US that aren't naturally water rich, but not to long-term global problems.

There were without doubt seriously dubious 'facts' in play. We were repeatedly told that animal agriculture produced 51% of greenhouse gases (as CO2 equivalent). This was stated as 'fact'. Yet the figure comes from a single, non-peer reviewed paper, where the scientific consensus is around 18% - cherry picking at its worst. And even that figure has been called into question, with 10% being a more likely amount. Similarly, the ratios of useful plant weight to useful meat weight in the film are wildly inaccurate. As George Monbiot points out 'If pigs are fed on residues and waste, and cattle on straw, stovers and grass from fallows and rangelands – food for which humans don't compete – meat becomes a very efficient means of food production. Even though it is tilted by the profligate use of grain in rich countries, the global average conversion ratio of useful plant food to useful meat is not the 5:1 or 10:1 cited by almost everyone, but less than 2:1. If we stopped feeding edible grain to animals, we could still produce around half the current global meat supply with no loss to human nutrition: in fact it's a significant net gain.'

However, the biggest problem was an either/or attitude. It's the same logical fallacy you often see used by Intelligent Design creationists. They say that if mechanism A can't explain a particular biological feature, then it must have been a designer. But showing that A is wrong doesn't show that B is right. Here it was 'the way Americans eat and raise meat is unsustainable.' True. So the world must stop eating meat. Which simply doesn't follow in any logical fashion.

Most hilarious were a set of images towards the end where the documentary maker 'saved' a chicken from being killed and fed assorted cows. Yet his entire argument up to this point was we shouldn't raise animals. So he should have been killing them, not 'saving' them. All emotion, no reasoning.

As you might gather, I wasn't convinced by Cowspiracy, though it did emphasise just how bad the American situation is. But more than that, it made me surer than ever that I am not the right audience for a documentary.

This has been a green heretic production.

Monday, 4 January 2016

Magical fantasy - The Watchmaker of Filigree Street - review

I'm not a great fan of the dominant 'swords and sorcery' arm of fantasy (think Game of Thrones), but I love real world fantasies, where a fantasy element creeps into an otherwise ordinary world - the kind of thing authors like Ray Bradbury, Gene Wolfe and Neil Gaiman excel at. The Watchmaker of Filigree Street promised to be just such a book, and I was not disappointed.

Set in Victorian England against a backdrop of terrorist activities by Irish nationalists, the book (presumably due to author Natasha Pulley's personal experience) unusually mixes English and Japanese cultures of the time. What is striking about the book, reminiscent of one of my favourite books, The Night Circus, is a sense of the fantastical and magic in the air. The very solid and steam-driven world of Victorian England is set against both the Japanese village (where Sullivan conducts the first performance of the Mikado) and the exotic clockwork creations of the eponymous watchmaker.

It's in these remarkable constructions, from a clockwork octopus to a watch with a form of clockwork GPS, that Pulley's imagination beautifully runs riot. While these creations are certainly fantasy, in the sense of being far beyond the realistic capabilities of clockwork, they are gorgeously conceived, and fit well with the enigmatic character of the Japanese nobleman-cum-watchmaker who has surprising mental abilities. Pulley also has a genuinely interesting central character in telegraph operator and failed pianist Thaniel Steepleton, and the first two acts of the book manage to combine this wonderful touch of fantasy with a very engaging storyline.

There are a few issues. The final act sags somewhat, partly because it is so complex, which means it takes a lot of untangling, and partly because of the disappointing approach to the main female character. This is a male-dominated book, so it was good to have a strong female character who was a physicist, especially one who is attempting the Michelson-Morley experiment three years before it actually took place (though several years after Michelson devised the interferometer she is using). However, she is not handled very well by the writer, especially in her bizarre activities during that final act. (And she's not a very good physicist, as she regards a null result from the experiment as a bad thing, rather than the fascinating result it really was, and that any real physicist would have recognised it to be.)

Yet despite not being up to The Night Circus on overall performance, this is an impressive first novel and well worth reading if you like this kind of fantasy - one of my favourite fantasy books of 2015. I'd certainly be queueing up to buy a sequel.
