According to the assistant director of research at Cambridge University's computing lab, modern gadgets are endangering the survival of computing geekdom.
When I was a lad, my first computing experience was punching cards by hand, which we sent off by post to a computer in London; about 10 days later you got back a piece of paper saying you had made a mistake. Like many of my generation who were quite heavily into computing, I started off with the Basic programming language, which came built in with any self-respecting home computer. This was certainly true of my first home computer, the Commodore 64 (illustrated). Although I did professional work in NELIAC (don't ask), APL and C, my DIY computing moved on to Visual Basic - again something usable without a lot of expertise, but producing real Windows programs. (In my opinion this peaked with VB 3, in terms of having enough usability without getting too technical for an amateur.)
But now, the argument from Cambridge goes, equipped with their iPhones and impenetrable Windows or Mac OS computers, the ordinary teenager hasn't a hope. To make matters worse, this 'sealed' technology makes it hard to understand what's going on inside, so we've got technically illiterate people using the technology.
I can see Dr Harle's point, and it would be good if people had a better understanding of technology, but I'm not sure his argument gives the whole picture. Even back when I was programming professionally, I didn't have much of a grasp of the hardware, and didn't much care how it worked as long as I could get it to do what I wanted. And those modern environments make computers (and in the end that's what an iPhone is) much more usable for the actual customers. So I'm not sure I can argue against them.
What's more, the non-techie person who wants to produce something of their own has a whole new development world in blogs and websites, a world that to me is more wonderful than the ability to program something to play Space Invaders or print "Hello World" repeatedly on the screen.
I don't think things are so bad after all.
A lot of kids didn't care about programming. They just wanted to play games. Programmable computers like the C64 and the Spectrum went out of fashion when the Sega Megadrive arrived.
Your hardware comment is quite true. What gets abstracted away from the user or programmer shifts ever upwards. Often you don't even have to install a printer driver now: you just plug the printer into a USB port, the computer detects it, loads the driver, and it's up and running.
I doubt this comment will be coherent, but I'll try.
A couple of things strike me:
1) There will always be people who want to know how things work - people who instinctively ask "but how?" - and for them there will always be the draw of digging deeper. I don't think the level of abstraction at which people interact with devices will change that.
2) Abstraction gives you power, and this may, in some ways, lead MORE people into discovering how things work. As an example, I recently showed someone how to create a WordPress blog post that contained an embedded video and which automatically announced the post's existence on Twitter and Facebook.
This is power! And it gets people asking "What else can I do?", and the curious might take that as an excuse to learn how to code a simple plug-in or a theme. Further down the line, that person (in exceptional circumstances) may get as far as contributing to the WordPress source!
Being able to print "Hello World!" and then goto 10 doesn't achieve much (other than annoying the staff in Dixons!), but these days with a relatively small amount of work you can create something amazing and useful, and this could lead people into development.
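For anyone who never typed it into a shop-floor machine, the program I mean was roughly this (a sketch in Commodore-style Basic, written from memory rather than taken from the original post):

10 PRINT "HELLO WORLD!"
20 GOTO 10

Two lines, an endlessly scrolling screen, and not much else - which is rather the point.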
3) And yes, I completely agree about how much help there is for non-techies. With so much reference material, tutorials, and help online, it's much easier to learn.
4) But there is an issue with complexity too. For my A-level Computing I wrote a simple database with a graphical front-end in Pascal. The only things I had to learn were Pascal and its compiler.
I don't know the modern curriculum, but today I would probably be expected to build this as a web application, and I would have to learn how to use a database, a scripting language, a web server, HTML, and possibly some JavaScript and XML too.
In some ways things are much easier. In other ways, they are far more complicated.