Are modern gadgets denying us the geeks of the future?

According to the assistant director of research at Cambridge University's computing lab, modern gadgets are endangering the survival of computing geekdom.

When I was a lad, my first computing experience was punching cards by hand, which we sent off by post to a computer in London; about 10 days later we got back a piece of paper saying we had made a mistake. Like many of my generation who were quite heavily into computing, I started off with the Basic programming language, which came built into any self-respecting home computer. This was certainly true of my first home computer, the Commodore 64. Although I did professional work in NELIAC (don't ask), APL and C, my DIY computing moved on to Visual Basic - again something usable without a lot of expertise, but producing real Windows programs. (In my opinion this peaked with VB 3, which had enough usability without getting too technical for an amateur.)
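For anyone who never met it, this is the sort of thing that got beginners hooked: the classic two-liner, given here as a minimal sketch in Commodore 64 BASIC (the line numbers are part of the language, and typing RUN sets it going until you press RUN/STOP):

    10 PRINT "HELLO WORLD" : REM PRINT THE MESSAGE
    20 GOTO 10             : REM JUMP BACK TO LINE 10, FOREVER

Two lines typed straight into the machine, and you had instant, visible feedback.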

But now, the argument from Cambridge goes, equipped with their iPhones and impenetrable Windows or Mac OS computers, the ordinary teenager hasn't a hope. To make matters worse, this 'sealed' technology makes it hard to understand what's going on inside, so we've got technically illiterate people using the technology.

I can see Dr Harle's point (he is the Cambridge researcher in question), and it would be good if people had a better understanding of technology, but I'm not sure his argument provides the whole picture. Even back when I was programming professionally I didn't really have much of a grasp of hardware, and didn't much care how it worked as long as I could get it to do what I wanted. And those modern environments make computers (and in the end that's what an iPhone is) much more usable for the actual customers. So I'm not sure I can argue against them.

What's more, the non-techie person who wants to produce something of their own has a whole new development world in blogs and websites, a world that to me is more wonderful than the ability to program something to play Space Invaders or print "Hello World" repeatedly on the screen.

I don't think things are so bad after all.

Comments

  1. A lot of kids didn't care about programming. They just wanted to play games. Programmable computers like the C64 and the Spectrum went out of fashion when the Sega Megadrive arrived.

    Your hardware comment is quite true. What gets abstracted away from the user or programmer shifts ever upwards. You often don't even have to install a printer driver now: you just plug the printer into the USB port, and the computer detects it, loads the driver and is up and running.

  2. I doubt this comment will be coherent, but I'll try.

    A few things strike me:

    1) There will always be people who want to know how things work, people who ask "but how?" instinctively, and for them there will always be the draw of digging deeper. I don't think the level of abstraction at which people interact with devices will stop them wanting to dig deeper.

    2) Abstraction gives you power and this, in some ways, may lead MORE people into discovering how things work. As an example, I recently showed someone how they could create a Wordpress blog post that contained an embedded video and which automatically announced the existence of the post on Twitter and Facebook.

    This is power! And it gets people asking "What else can I do?", and the curious might take that as an excuse to learn how to code a simple plug-in or a theme. Further down the line, that person (in exceptional circumstances) may get as far as contributing to the WordPress source!

    Being able to print "Hello World!" and then goto 10 doesn't achieve much (other than annoying the staff in Dixons!), but these days with a relatively small amount of work you can create something amazing and useful, and this could lead people into development.

    3) And yes, I completely agree about how much help there is for non-techies. With so much reference material and so many tutorials online, it's much easier to learn.

    4) But there is an issue with complexity too. For my A-level Computing I wrote a simple database with a graphical front end in Pascal. The only tools I had to learn were the Pascal language and its compiler.

    I don't know today's curriculum, but I would probably be expected to do this as a web application, and I would have to learn how to use a database, a scripting language, a web server, HTML, and possibly some JavaScript and XML too.

    In some ways things are much easier. In other ways, they are far more complicated.

