Hey baby, get off my cloud

Sometimes the computing world comes up with a term that truly gets on my nerves, especially when it is used portentously - and few have succeeded as fully as 'the cloud' and 'cloud computing'.

Part of the problem I have with this is that I can't see why 'the cloud' is necessary at all. We've already got the terms 'network' and 'server', which seem to work quite well. By comparison, 'the cloud' is more than a little nebulous. (Smug smile.)

However, it seems we're stuck with it. But there have been interesting signs lately of a split in the cloud. We now have (my terms) clean cloud and dirty cloud. Clean cloud is Google's vision. This puts pretty well everything on the net. (Sorry, in the cloud.) Your device will have a bit of storage for temporary work, but all your data and all your programs are online. When you want to do something you call up a cloud-based application and access your cloud-based data.

It is, in many ways, a consummation devoutly to be wished (literary reference). Wherever you are, whatever device you have access to, you can simply pick up where you left off. Want to give a PowerPoint presentation in a village hall in Norfolk using their computer? No worries. Need to do a quick edit of a key document using only the seatback video screen of a train? You've got it. Or on your phone, or your tablet, or grannie's TV... you never carry anything, never lose anything, always have everything available.

Leaving aside the trust issues of having all your essential data and programs in someone else's hands, this is a dream scenario with one proviso. A big, fat, juicy proviso. For Google's clean cloud to work you need access to the internet anywhere and everywhere, 24/7 without interruption. The moment you don't have high speed access to the net you are screwed.

Now in some rosy picture of the future where we have high speed wireless access anywhere, this is great, but realistically it is not the world we inhabit. Even the best internet connection is down occasionally, and away from wi-fi hotspots, 3G access is slowish at best - and fails regularly. Just try it on a train. Or in Aldbourne (a village near us where mobile phones rarely work at all).

So traditionally we've fallen back all the way to having everything on the PC with all the limitations that entails. But now, Apple is tempting us with a dirty cloud. Here, the software (an app) resides on your device, but new apps are easily downloadable whenever you have internet access, giving you flexibility. Your data is backed up in the cloud, and can be accessed directly from it when you have a connection, but key data is also held locally so you can work offline. Wherever and whenever you like. As soon as you get a connection, everything is synchronized.
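Just to make the idea concrete - and purely as a sketch, not how Apple or anyone else actually builds it - the offline-first pattern behind the dirty cloud looks something like this: edits always land on the device, get queued while you're offline, and are pushed to the remote store the moment a connection turns up. All the names here (DirtyCloudNotes, the in-memory 'cloud' dictionary, the is_online flag) are hypothetical stand-ins.

```python
# A minimal sketch of the 'dirty cloud' idea: work on a local copy,
# queue changes while offline, and sync when a connection appears.
# The 'cloud' here is just an in-memory dict standing in for a remote store.

class DirtyCloudNotes:
    def __init__(self):
        self.local = {}          # key data held on the device
        self.pending = []        # edits made while offline, waiting to upload
        self.cloud = {}          # hypothetical stand-in for the remote store
        self.is_online = False   # toggled when a connection is available

    def edit(self, doc_id, text):
        """Edits always succeed locally, connection or not."""
        self.local[doc_id] = text
        self.pending.append((doc_id, text))
        if self.is_online:
            self.sync()

    def sync(self):
        """Push queued edits, then pull down anything new from the cloud."""
        if not self.is_online:
            return
        while self.pending:
            doc_id, text = self.pending.pop(0)
            self.cloud[doc_id] = text
        for doc_id, text in self.cloud.items():
            self.local.setdefault(doc_id, text)  # documents edited elsewhere


notes = DirtyCloudNotes()
notes.edit("talk", "Draft of the village hall talk")  # works in the tunnel
notes.is_online = True
notes.sync()                                          # everything catches up
print(notes.cloud["talk"])
```

The point of the design is that the local copy is always good enough to work from; the cloud is a backup and a relay between devices, not a prerequisite.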

The dirty cloud isn't perfect. There is less flexibility over swapping between devices, particularly if you want to work on someone else's hardware, though it's still entirely possible to swap between your PC, laptop, tablet and phone. Yet it seems the most practical compromise until we do have universal reliable wireless internet - something that seems a good number of years off. What's more, perhaps it's my age, but I'm more comfortable with apps on my device than programs floating in someone else's cloud.

We're already part way to the dirty cloud. A combination of apps and facilities like Evernote and Dropbox, which allow wireless synchronization, makes online/offline working feasible. Similarly, for example, an app like the Times newspaper requires a connection to download today's paper, but after that you can use it in the darkest, wirelessless (sic) tunnel. And Apple intends to fill out the dirty cloud later this year with its iCloud service, which will provide much more automatic wireless sharing of data.

Now much though I love my portable Apple devices, I don't like the Apple closed shop - and with other innovations it has been a case of 'Anything Apple can do, Android can do soon after.' It will be interesting to see if this is the case with the dirty cloud. Because, of course, Android is Google's baby. Will Google throw the baby out with the bathwater by insisting on staying with its pristine but often unusable clean cloud, or will it join in and play dirty? Only time will tell.
