There was a time when I was on the cutting edge of computer technology. I ran my first programs, written in an assembly-like language, on an IBM 1620. By the time I finished graduate school, Notre Dame had built a computer center to house its mammoth Univac. We fed it Fortran punch cards -- when it worked. My mobile phone now has more computing power than was housed in that building.
But we knew how the damn thing worked. Program counters, accumulators, ALUs, ANDs, ORs, NOTs, flip-flops, front-edge triggering, back-edge triggering, machine language, assemblers, compilers.
Then along came my Mac 128 and the guts of computers began to disappear behind a user-friendly interface. Still, we did a lot of our own programming, in BASIC.
Progress was breathtaking, with hardware and software advancing together at a speed that finally made what goes on inside the box irrelevant. And my brain got older. Computers got easier and easier to use, but there were more and more things I couldn't remember. And more and more of my life was evaporating into digital bits. I have piles and piles of floppy disks around the house, years of writing, now virtually unreadable. In the past three years I have written 156 Sunday essays on this blog, and a thousand daily posts. I know they exist out there somewhere, because we have access to them, but I haven't a clue where.
Meanwhile, without Tom of the young and nimble brain to keep me up and running, I might as well return to pencil and the old black-bound journals I kept for years, read by no one but myself. They are sitting on the shelf and will be there when I die. God knows what will happen to all my words that exist only in ASCII code.