Wednesday, May 04, 2011

Reinventing the Wheel

It's a case of what goes around, comes around.

This post takes a retrospective look at some minimalist computing solutions which have been historically significant over the last 45 years - effectively, my lifetime.

It's a quick look back at the last four decades of computer development, highlighting a few significant events, and how they have become intricately woven together to influence the modern world.

Can we take a lesson from earlier times to make interaction with modern systems more intuitive, and easier for newcomers learning "real" computer programming?

Mid 1960's

Around about the same time as I was conceived (late '64), the MIT Instrumentation Laboratory and Raytheon were contracted to come up with a digital control and guidance computer that would be used throughout the Apollo moon-shot program. Thus was born the Apollo Guidance Computer, or AGC.

The AGC was assembled from approximately 5600 individual logic gates - a brief specification was given as follows:

Word Length: 15 bits plus parity.
Fixed Memory Registers: 36,864 Words.
Erasable Memory Registers: 2,048 Words.
Number of Normal Instructions: 34.
Number of Involuntary Instructions (Increment, interrupt, etc.): 10.
Number of Interface Circuits: 227.
Memory Cycle Time: 11.7 microseconds.
Addition Time: 23.4 microseconds.
Multiplication Time: 46.8 microseconds.
Number of Logic Gates: 5,600 (2,800 packages).
Volume: 0.97 cubic feet.
Weight: 70 pounds.
Power Consumption: 55 watts.
The AGC was used in conjunction with a display/keyboard unit or DSKY which weighed a further 17.5 lbs.

Despite its basic specification, the AGC was used on Apollo missions right up until 1972, including the first successful moon landing of 20th July 1969. If you want to get an impression of the AGC, it is well documented on Wikipedia, including this rather superb replica.


The 1970's

Approximately 6 years after the first moon landing, a young programmer, Paul Allen, was on his own important mission. On a flight to Albuquerque, New Mexico, he was to meet and deliver a roll of paper tape to MITS, makers of the new Altair 8800 microcomputer system.

The tape contained a BASIC interpreter program, written by Allen and his business partner, William Gates, and it was the first real job for their fledgling company Micro Soft.

Allen even had to write a paper tape loader routine in 8080 machine code whilst on the plane, as, in the rush to get the job done, he had forgotten to include the routines needed to get the code into the new Altair machine. The original implementation fitted into just 4K of program memory, but was later expanded to 8K to include extra functions such as trigonometric maths. This was the first implementation of a usable high-level language on a hobbyist microcomputer, and the outcome of that meeting was to change the computing world forever. Paul Allen and Bill Gates went on to deliver variants of BASIC for most of the early 8-bit microprocessor families, and their company grew into the global giant Microsoft.

Paul Allen had vast experience of mainframes, and he devised an 8080 emulator to run on a PDP-10. It was with this software tool that the young Bill Gates wrote much of the original 8080 BASIC interpreter and delivered it, working, to the client at MITS without ever having seen an 8080 system! Gates was certainly a smart kid. The same mainframe emulator was used to produce BASIC interpreters for most of the popular 8-bit micros.


A few years later - as far as I recall, in the summer of '79 - microcomputers eventually arrived in the UK. My school had an early Research Machines 380Z (a CP/M Z80 machine) with a cassette interface and very little else. Software was lacking, but there was a BASIC interpreter and a few games written in BASIC. One was the numerical game "lunar lander", which had been around on programmable calculators for a couple of years, where you had to try to land the lunar module or crash and burn on the moon's surface through lack of fuel. Later a copy of Space Invaders written in Z80 machine code turned up from somewhere. It was just like the arcade game and very popular. A particularly geeky mate of mine taught himself Z80 machine code, aged 14, so that he could manipulate and hack the variables within the Space Invaders program to his advantage. However, I hadn't a clue what machine code was, and so I wasn't to write my first bytes of raw Z80 hex for another 2 or 3 years.

The 1980's

In 1981 a far-sighted teacher brought a ZX81 into the classroom, and I learnt to program (badly) in BASIC. As a sixth-former I ran an after school computer club for a couple of years in my spare time, and generally found many similar excuses to flunk my A levels.

The photo on the right shows the corroded remains of my ZX81 built from a kit in the spring of '83. I fitted a 2K RAM and a much bigger heatsink. It was put in a custom case and used to control a Turtle robot. It ran for about 20 minutes at a time from C-size rechargeable NiCads. Sadly it has been in a damp trunk for the best part of 25 years - so perhaps not looking in the best of health. I dug this out tonight, dusted it down, and perhaps one day I will clean it up and get it running - though I notice the ROM chip is long since gone.


The ZX81 was a fairly minimalist machine, consisting of the Z80A clocked at 3.25MHz, a custom ULA (uncommitted logic array) from Ferranti, the 8K BASIC ROM, and either a pair of 2102 512 byte RAM chips or a single 1K x 8. I fitted a newly affordable 6116 2K x 8 and gave myself a machine with 5 times the previous codespace (768 bytes of the original 1024 were used by the display, leaving a barely adequate 256). At university, a friend and I made an FM wireless link between two ZX81 machines from an FM "bug", and were able to transfer programs from one machine to the other over the FM link using the quirky audio cassette interface.


During Live Aid in July '85, I remember I spent the day in front of a small black and white portable TV set, watching the bands whilst hand wiring a prototype Z80 control card - a 4MHz Z80 maxed out with ROM, RAM, 8255 ports, PIO, SIO, CTC and a keyboard interface.

With 6 chips on the board it was about 4" x 5" and a mass of bus wiring - my own personal homage to early computer construction! A picture of the Z80 control card is below. It was built to run an electronic synthesizer project, so it has five 8-bit ports, a keyboard scanner, and a Z80-SIO and Z80-CTC to provide MIDI serial and counter-timer functions. It is striking to think that this is a very much slower equivalent of the likes of the Arduino MEGA.

Hardware certainly was hard 25 years ago - I remember it took me most of a week to design, build, wire and debug that board. I hand-coded a monitor ROM from a version I had cribbed from another Z80 machine; it allowed hexadecimal program entry and text output to a terminal.

The hand-wired "verowire" prototype is shown below, with the pink wiring held in place by nylon combs. Just wiring 3 x 40 pin and 3 x 28 pin chip sockets took a lot of connecting up. I guess it might still run code if I apply power! The white socket on the bottom allowed me to plug it into an early development system, a Multitech "Micro Professor" MPF-1B - a Taiwanese-built Z80 micro trainer that I bought in Edinburgh in the spring of 1984. Multitech Industrial Corporation later became Acer - another historical link. The MPF-1B is actually still available from Flite Electronics, 27 years after I bought mine!


The 1990's

What began in the mid-70's with the introduction of the 8080 - the first really usable microprocessor IC - heralded a succession of cheaper and more powerful micros from all the main IC manufacturers. Fuelled by the PC industry, hardware, clock speeds and memory increased yearly with every new generation of Intel x86 processor. It's hard to believe that we now take clock speeds in the low gigahertz, RAM in gigabytes and hard drive capacities in terabytes for granted.

At the same time that PCs were developing, there were rapid developments in embedded microcontrollers. In the 90's I worked with 68000, 8051, Z8, PIC and TMS320 DSPs. We had entered the era of the system on a chip, where the microprocessor core, program memory, RAM and I/O peripherals were reduced down to a single IC. The rather clunky Z80 board I built in 1985 could be replaced with a single 40-pin IC running at about 10 times the clock speed of the earlier design. Microcontrollers were given a massive boost by reprogrammable Flash memory for program storage, and on-chip analogue-to-digital converters and simple PWM allowed a certain amount of analogue processing to be done. I recall in the late 1990's developing a single-chip telephone modem device based on a low cost PIC, where all DTMF and modem tones were synthesized and decoded in firmware.
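The "tones synthesized in firmware" trick is easier to picture with a little code. Purely as an illustration - this is my own C++ sketch of the underlying maths, not the original PIC assembler - each DTMF key is just the sum of one low-group and one high-group tone from the standard frequency grid, sampled fast enough to be pushed out through a PWM pin:

// Illustrative sketch of dual-tone (DTMF) synthesis in plain C++ -
// not the original PIC firmware, just the underlying maths.
#include <cmath>
#include <cstdint>
#include <cstdio>

// Standard DTMF frequency grid (Hz): one low-group and one high-group tone per key.
const float kLowGroup[4]  = {697.0f, 770.0f, 852.0f, 941.0f};
const float kHighGroup[4] = {1209.0f, 1336.0f, 1477.0f, 1633.0f};
const char  kKeypad[4][4] = {{'1','2','3','A'},
                             {'4','5','6','B'},
                             {'7','8','9','C'},
                             {'*','0','#','D'}};
const float kTwoPi = 6.2831853f;

// Fill buf with 8-bit samples of the key's dual tone, ready to feed out of a PWM pin.
void synthesizeKey(char key, uint8_t *buf, int count, float sampleRate) {
    float fLow = 0.0f, fHigh = 0.0f;
    for (int r = 0; r < 4; ++r)
        for (int c = 0; c < 4; ++c)
            if (kKeypad[r][c] == key) { fLow = kLowGroup[r]; fHigh = kHighGroup[c]; }
    for (int i = 0; i < count; ++i) {
        float t = i / sampleRate;
        float s = 0.5f * (std::sin(kTwoPi * fLow * t) + std::sin(kTwoPi * fHigh * t));
        buf[i] = (uint8_t)(127.5f + 127.0f * s);   // centre on mid-rail for PWM output
    }
}

int main() {
    uint8_t buffer[160];                            // 20 ms of audio at 8 kHz
    synthesizeKey('5', buffer, 160, 8000.0f);
    std::printf("first few samples: %d %d %d %d\n",
                buffer[0], buffer[1], buffer[2], buffer[3]);
    return 0;
}

On a real part the sine sums would be pre-computed or done in fixed point, but the principle is exactly this simple.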

Into the 2000's

Some 30 years after Paul Allen's world-changing business deal, some clever guys, Massimo Banzi and David Cuartielles at the Interaction Design Institute in Ivrea, Italy, put together a simple microcontroller board and introduced the world to the Arduino - and to open source hardware and software. The Arduino core development team now consists of Massimo Banzi, David Cuartielles, Tom Igoe, Gianluca Martino, David Mellis and Nicholas Zambetti.

Ironically, the ATmega AVR microcontroller which runs the Arduino is almost identical in specification to the AGC, the Altair and the ZX81. A small board costing around $15 has the same potential processing power as the AGC which cost $15 million back in 1965. The ATmega328 which runs the Arduino is a mere $2 to $3. How far we have come in 40 years!

2010 and Beyond

So this retrospective post is by way of asking: can we re-purpose some of the methods and techniques devised by early computer engineers to make their machines useful with a minimum of resources?

The reason that these particular machines and events have special meaning is not only that they have been milestones in the personal computing industry, but that they also mark major milestones in my own life. I recall being woken up at 4am on July 21st 1969 to watch grainy pictures, on a black and white TV set, of Neil Armstrong's descent from the LEM onto the lunar surface.

The other connection between them is that the specification of the computers involved was roughly the same in terms of memory and speed of operation - there really is not a lot of difference in processing power between the AGC, the Altair 8800 and the ZX81 - just 15 years and about 70lbs in weight.

So this got me thinking about these early machines, how they were programmed and what could be done with them. The Altair and the ZX81 each ran a BASIC interpreter; the numerous BASIC dialects all trace back to the language originally developed at Dartmouth College in 1964 and written for a GE-225 mainframe.

If Paul Allen managed to shoehorn a workable BASIC interpreter into 4K of code back in '75, how about doing the same thing for the 32K program space of the ATmega328? Back in 2008, an AVR enthusiast, Klaxon44, wrote a BASIC interpreter in C and successfully ported it to the ATmega128. Capable of some 50,000 empty loops per second, it wasn't lightning fast, but it was good enough to run interpreted commands from the serial link. Had the interpreter been coded directly in AVR machine code, rather than in a roundabout way via compiled C, it would have been many times faster.
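To give a flavour of how little code an "immediate mode" interpreter needs, here is a rough Arduino-style sketch. It is only an illustration of the general idea - it is not Klaxon44's code, and a real Tiny BASIC adds variables, line numbers, GOTO and so on - but the skeleton of read a line, recognise a keyword, evaluate an expression is much the same:

// A minimal "type a command, see the result" interpreter skeleton.
// Illustration only - not Klaxon44's interpreter. Try:  PRINT 2+3*4
#include <Arduino.h>
#include <ctype.h>
#include <string.h>

char line[64];
const char *p;                                   // cursor into the input line

long expr();                                     // forward declaration

void skipSpaces() { while (*p == ' ') p++; }

// factor := number | '(' expr ')'
long factor() {
    skipSpaces();
    long v = 0;
    if (*p == '(') { p++; v = expr(); skipSpaces(); if (*p == ')') p++; }
    else while (isdigit(*p)) v = v * 10 + (*p++ - '0');
    skipSpaces();
    return v;
}

// term := factor (('*' | '/') factor)*
long term() {
    long v = factor();
    while (*p == '*' || *p == '/') {
        char op = *p++;
        long r = factor();
        v = (op == '*') ? v * r : (r ? v / r : 0);
    }
    return v;
}

// expr := term (('+' | '-') term)*
long expr() {
    long v = term();
    while (*p == '+' || *p == '-') {
        char op = *p++;
        long r = term();
        v = (op == '+') ? v + r : v - r;
    }
    return v;
}

// Block until a full line arrives on the serial port.
size_t readLine(char *buf, size_t maxLen) {
    size_t n = 0;
    for (;;) {
        while (!Serial.available()) ;            // wait for the next character
        char c = Serial.read();
        if (c == '\n' || c == '\r') break;
        if (n < maxLen - 1) buf[n++] = c;
    }
    buf[n] = '\0';
    return n;
}

void setup() { Serial.begin(9600); }

void loop() {
    if (readLine(line, sizeof(line)) == 0) return;
    if (strncmp(line, "PRINT", 5) == 0) {        // dispatch on the leading keyword
        p = line + 5;
        Serial.println(expr());
    } else {
        Serial.println("?SYNTAX ERROR");
    }
}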

To anyone who grew up with the early home micros, the ability to edit programs on the fly and to type commands at the keyboard that were executed immediately was a neat way of interacting with the machine. Klaxon44's version allows direct manipulation of the ports, so you can write to the I/O to test hardware such as LEDs and sensors.
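Adding that kind of direct I/O is just a couple more keywords in the dispatcher. Again this is only a sketch of the idea - the OUT and IN keywords below are my own invention for illustration, not Klaxon44's actual syntax - but it shows how little is needed to drive an LED or read a pin straight from the serial monitor:

// Immediate-mode I/O commands - a hypothetical illustration, not Klaxon44's syntax.
// Try:  OUT 13 1   (turn on the LED on pin 13),   IN 2   (read digital pin 2).
#include <Arduino.h>
#include <stdlib.h>
#include <string.h>

char line[32];

// Block until a full line arrives on the serial port.
size_t readLine(char *buf, size_t maxLen) {
    size_t n = 0;
    for (;;) {
        while (!Serial.available()) ;
        char c = Serial.read();
        if (c == '\n' || c == '\r') break;
        if (n < maxLen - 1) buf[n++] = c;
    }
    buf[n] = '\0';
    return n;
}

void setup() { Serial.begin(9600); }

void loop() {
    if (readLine(line, sizeof(line)) == 0) return;

    char *rest;
    if (strncmp(line, "OUT", 3) == 0) {
        long pin   = strtol(line + 3, &rest, 10);   // pin number
        long value = strtol(rest, NULL, 10);        // 0 or 1
        pinMode(pin, OUTPUT);
        digitalWrite(pin, value ? HIGH : LOW);
        Serial.println("OK");
    } else if (strncmp(line, "IN", 2) == 0) {
        long pin = strtol(line + 2, NULL, 10);
        pinMode(pin, INPUT);
        Serial.println(digitalRead(pin));           // prints 0 or 1
    } else {
        Serial.println("?SYNTAX ERROR");
    }
}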

We've come a long way, even since my early coding days on the ZX81. We now have self-programming flash memory, and SPI devices which need only four signal connections (a short sketch of a typical SPI transaction follows below). SD cards and wireless modules allow easy storage of large quantities of data and simple communication between processors. Imagine a simple BASIC interpreter running on an Arduino that could be used to introduce school kids to the basics of programming and controlling electronics with simple keywords - without having to first learn embedded C!
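The "only four signal connections" point is worth illustrating: with the Arduino SPI library, MOSI, MISO and SCK are handled for you, and only the chip-select pin is managed by hand. The register address and "read" command byte below are made up for the example - a real device's datasheet would supply them:

// Reading one byte back from a generic SPI peripheral over the four-wire bus.
// The register address 0x0F and the read bit are hypothetical examples.
#include <Arduino.h>
#include <SPI.h>

const int CS_PIN = 10;                   // chip select - the only pin we drive by hand

void setup() {
    Serial.begin(9600);
    pinMode(CS_PIN, OUTPUT);
    digitalWrite(CS_PIN, HIGH);          // deselect the device
    SPI.begin();                         // takes care of MOSI, MISO and SCK
}

void loop() {
    digitalWrite(CS_PIN, LOW);           // select the device
    SPI.transfer(0x80 | 0x0F);           // hypothetical "read register 0x0F" command
    byte value = SPI.transfer(0x00);     // clock out a dummy byte, clock the reply in
    digitalWrite(CS_PIN, HIGH);          // deselect
    Serial.println(value);
    delay(1000);                         // once a second is plenty for a demo
}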

BASIC is no substitute for C, but it is certainly easier to learn and debug, and it is more forgiving than C with its finicky syntax and over-zealous use of punctuation marks and various types of brackets.

I don't want to re-invent the wheel, but is there not a case for an easier way to interact with and program embedded microcontrollers?

As I write this, it seems that there is a new kid on the block - the Raspberry Pi, a tiny Linux ARM-based computer expected to sell for under $25.

A video of the Raspberry Pi and founder David Braben appears on the BBC on May 5th 2011.

Could this be the future of minimalist, easy to program, real computers?



