When I was growing up in the 1970s, the common depiction of a computer, at least by Hollywood film-makers, was a room full of wardrobe-sized cabinets with innumerable flashing lights, banks of switches and spinning tape drives. Frequently, the plot of the movie was centred on the computer malfunctioning in some malevolent way, and chaos breaking out. The Italian Job, Westworld and 2001: A Space Odyssey all relied on computer failures to spice up the plot.
The reality was that throughout the 1960s, computer hardware really was large and cumbersome, and covered in lights and switches: you only had to point a camera at any mainframe of that era to get the perfect cheap shot for a sci-fi movie.
By the early 1980s, computers had entered homes and schools, and teenagers such as myself were spending countless hours typing BASIC program listings from computer magazines into low-spec home computers with dubious keyboards and scungy TV displays. In just five years, the first micros of the mid-1970s had evolved from homebrew rack systems with lights and switches into mass-produced machines on sale in every high street. The classic micro of that era was the MITS Altair 8800; its close clone, the IMSAI 8080, even had a role in the 1983 film "WarGames".
At about the same time that Matthew Broderick was pretending to program that IMSAI, I had invested in an 8-bit input/output card for my ZX81. At about £20 in solder-it-yourself kit form, it had cost me several weeks' pocket money. Here, for the first time, was the means to connect the computer to the real world and make it do something under program control. The first thing I did was connect eight LEDs to the pins of the output port and, with some simple BASIC programs, create a whole load of different "Knight Rider" LED chaser display effects.
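Those BASIC programs boiled down to shifting a single lit bit left and right across a byte written to the output port, where each bit drove one LED. Here is a minimal sketch of that scanner logic in modern C++, with terminal output standing in for the port latch (I can no longer reconstruct the actual port address or write routine of that ZX81 card, so this is purely an illustration of the logic):

#include <cstdio>
#include <cstdint>

// Simulate the classic "Knight Rider" scanner: one lit bit bounces
// back and forth across an 8-bit output port. On the real ZX81 I/O
// card the byte would be latched onto the LEDs; here we just print it.
int main() {
    uint8_t pattern = 0x01;    // start with the rightmost LED lit
    bool going_left = true;    // current scan direction

    for (int step = 0; step < 32; step++) {
        for (int bit = 7; bit >= 0; bit--)
            putchar(pattern & (1u << bit) ? '*' : '.');
        putchar('\n');

        if (going_left) pattern <<= 1; else pattern >>= 1;
        if (pattern == 0x80 || pattern == 0x01)
            going_left = !going_left;   // bounce at either end
    }
    return 0;
}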
Fast forward 30 years, and technology has advanced by several orders of magnitude. The average smartphone now has a million times as much RAM as my first ZX81 and is clocked at a thousand times the speed. Not to mention that it has four 32-bit ARM cores in a quad-core arrangement, plus a specialist graphics processing unit. This sort of processing power and memory capacity in your pocket was inconceivable just a few years ago.
However, just because millions carry around this sort of processing capability in their pockets or bags does not mean that we have become a population of computer scientists. In fact, the average citizen has no more inkling today of how a computer works than they had 30 or 40 years ago, watching computers malfunction in Hollywood movies.
Fortunately, in recent years there has been an active and vocal campaign to educate at least a small percentage of the population in the workings of computer technology. The $20 Arduino and its spin-offs have done more for teaching computing science, hardware, firmware and programming skills than any other device in the last 30 years.
So, in homage to those early homebrew computers with their flashing LED displays, I have connected a dozen LEDs to the pins of an Arduino and recreated some of my light chaser memories from 30 years ago.
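For reference, here is a minimal Arduino sketch along those lines. It assumes the twelve LEDs are wired, via series resistors, to digital pins 2 to 13; adjust the constants to match your own wiring:

const int FIRST_PIN = 2;   // first LED on digital pin 2
const int NUM_LEDS  = 12;  // twelve LEDs, on pins 2 through 13

void setup() {
  for (int i = 0; i < NUM_LEDS; i++)
    pinMode(FIRST_PIN + i, OUTPUT);
}

void loop() {
  // Sweep the lit LED up the row...
  for (int i = 0; i < NUM_LEDS; i++) {
    digitalWrite(FIRST_PIN + i, HIGH);
    delay(60);
    digitalWrite(FIRST_PIN + i, LOW);
  }
  // ...and back down, skipping the ends so they don't flash twice.
  for (int i = NUM_LEDS - 2; i > 0; i--) {
    digitalWrite(FIRST_PIN + i, HIGH);
    delay(60);
    digitalWrite(FIRST_PIN + i, LOW);
  }
}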
I now have a new toy to play with, and a programming language, SIMPL, that lends itself to simple interactive physical computing. A few lines of SIMPL and I have an LED chaser display that demonstrates the benefits of an interactive programming environment.