(This post originally appeared in February 2008.)
An illustration in last Sunday's New York Times evoked the pixelated pics of not so long ago, all jagged little squares, lines that looked more like staircases. This was intentional, of course. One can assume that any illustrator employed by the Times has up-to-date graphics hardware and software. The idea, I suppose, was to look retro. How quickly what we once thought was cutting edge became antique.
I sit here working at my laptop and if I stop to think about what's in front of me I can only shake my head in wonder. I was there in the 1950s and 1960s when the first commercial computers were coming online. Not so many years later I was dragooned into teaching a two-semester course on how computers work because I was the only one on the faculty who had the knowledge. It was fun. We started with Boolean logic, then went on to the electronic expression of logic functions, flip-flops, edge-triggering, registers, ALUs, and so on. The culmination of the course was to break up into groups of four and build a working computer out of 7400 series integrated circuits -- clock, registers, instruction decoders, ALU, the works. They were simple machines, with an extremely limited instruction set, but they embodied all the elements of timing and control of a real machine. The students could step their way through a program -- front edge, back edge, front edge, back edge -- and watch data move around with their logic probes. I wish I had a photo of one of those machines, spread out on a board about a meter square. When the students finally debugged their creations and got them to work, they were inordinately proud -- and understood how computers work.
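For readers who never clocked a breadboard by hand, here is a minimal sketch in Python of the kind of machine those students built: an accumulator CPU whose registers update only on clock edges. The opcode names, the two-edge split, and the tiny memory are my own illustration of the general idea, not the actual course design.

```python
# A toy edge-triggered accumulator machine. On the clock's front (rising)
# edge it fetches, decodes, and computes the next state; on the back
# (falling) edge it latches that state into the registers -- front edge,
# back edge, one full cycle per instruction.

LOAD, ADD, STORE, HALT = range(4)   # illustrative instruction set

class TinyCPU:
    def __init__(self, program, data):
        self.prog = program          # list of (opcode, operand) pairs
        self.mem = dict(data)        # tiny data memory
        self.pc = 0                  # program counter register
        self.acc = 0                 # accumulator register
        self.halted = False
        self._pending = None         # next state, computed on the front edge

    def front_edge(self):
        """Rising edge: fetch, decode, compute the next register values."""
        if self.halted:
            return
        op, arg = self.prog[self.pc]
        if op == LOAD:
            self._pending = (self.mem[arg], self.pc + 1, False)
        elif op == ADD:
            self._pending = (self.acc + self.mem[arg], self.pc + 1, False)
        elif op == STORE:
            self.mem[arg] = self.acc          # memory write, kept simple here
            self._pending = (self.acc, self.pc + 1, False)
        elif op == HALT:
            self._pending = (self.acc, self.pc, True)

    def back_edge(self):
        """Falling edge: latch the pending state into the registers."""
        if self._pending is not None:
            self.acc, self.pc, self.halted = self._pending
            self._pending = None

    def run(self):
        while not self.halted:
            self.front_edge()
            self.back_edge()

# Program: acc = mem[0] + mem[1]; mem[2] = acc
prog = [(LOAD, 0), (ADD, 1), (STORE, 2), (HALT, 0)]
cpu = TinyCPU(prog, {0: 2, 1: 3})
cpu.run()
print(cpu.mem[2])  # → 5
```

Calling `front_edge()` and `back_edge()` by hand, one at a time, is the software analogue of stepping the clock with a probe and watching the LEDs change.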
Time passed, and computer theory and practice raced far ahead of me. The college developed a department of computer science, and I retired from the front lines with my 7400 chips. No one builds a hands-on machine anymore; it's all theory. The current students know vastly more about computers than I will ever know, but I wonder if any of them have a clue about what's actually happening down there in the guts of the CPU with each beat of the clock.
I sit here thinking about what's going on inside this sweet little MacBook, 1.83 billion times a second, for hours on end, with never a missed beat, never a dropped bit, and it makes my head spin. I belong to the generation of the jagged pixels, dot-matrix printers, and 8-bit CPUs. And I remember with an aching fondness those long afternoons in the electronics lab when we huddled around a bench with a logic probe in our hands, watching the red and green LEDs flicker on and off as our handful of machine-code instructions were executed step -- by step -- by step -- by step.