My brief history with computers: Part two

If your idea of editing text is vim and Microsoft Word, then boy am I going to have a tough time trying to explain WordStar to you. And yet, such were the times that it was a fairly popular piece of software (I imagine twenty years from now people will refer to today in the same way, which makes me anxious about what exactly is equally dumb right now)! Nearly universally used by “people without Macs”, which was most people.

In a sense I suppose typing Ctrl-K-S to save isn’t all that different from typing Ctrl-X Ctrl-S to save in Emacs, but this was all that regular people had to use, and I suspect most people were quite relieved to get a more “WYSIWYG” application like Microsoft Word — which is of course the norm now.

The other thing that no one does anymore is have their own database software (indeed, you would be crazy to do so today). But I remember getting painfully acquainted with the minutiae of dBASE (“in its day the most successful database management system for microcomputers”) — and later FoxPro and then Microsoft Access — though I never built anything more than a simple library application with it.

The only one that has survived in some recognizable form today is the venerable spreadsheet. Now Lotus 1-2-3 didn’t have any of the bells and whistles of today’s software (and the version I had was keyboard-only), but at least the notion of sheets with rows, columns and cells is still conceptually current.

After a couple of years, my dad got the “next big thing” — Windows!

This was version 3.11, if you’re interested, and it came on a huge bundle of floppy disks (about thirty of them!) which had to be patiently inserted one after the other as the system was copied over, bit by bit. Explaining what a floppy disk is will just make me feel old; go look it up.

So this was great. We now booted up, got the prompt, and then typed “win” which loaded up this fancy GUI. Mind you, I still didn’t have a mouse, which meant learning all the keyboard shortcuts for minimizing, maximizing, moving a window, and so on.

Agh this is already too long, but a small digression before I go: this is something that you just cannot be aware of today, when so many layers blend together so seamlessly. In the old days, the fact that you were running programs inside a shell was very obvious, and you were aware that “the real computer was underneath”, and so on, which is unfortunately impossible with, say, a smartphone.

My brief history with computers: Part one

You see, you use these things to compute … stuff

I first encountered computers in books. Glossy books these were, the kind you would expect to find twenty years ago in the “reference books” section of a library. Which was how I came across them; my mother was a teacher and was able to get me books even though I wasn’t at her school.

These books were dated, of course, and so my initial impression was anachronistic to begin with. The story ended at these magical microprocessors, though at the time the first modern pipelined processors were coming into existence.

The first glimpse

The very first computer I saw was probably an 80486 with a color display, sometime around 1994 or so. It belonged to a guy I knew at school, whose father had bought it for his elder brother, and at his birthday party all the assembled guests crowded around this curious device as he let some sort of demo program run, showing images, wireframes, and so on, while people oohed and aahed.

The very first computer I had was an 80386 my father bought in 1995. I remember it very clearly, having waited in eager anticipation of it for weeks since I came to know it was coming. I read the MS-DOS 6.22 manual cover to cover (yes, I know, sick) before it arrived.

My fitness tracker has more memory than …

Its specs were formidable. It had a 14” black-and-white monitor, a 256 MB hard disk and 4 MB of memory (yes, that’s four megabytes). All it had was MS-DOS and QBasic.

Now it’s fair to say I probably picked up bad programming habits that I’m not even aware of; or at least that’s the standard impression people have about BASIC. Either way, it was a blast. Because all I did on that machine was programming!

The lonely prompt

Hang on, I don’t want to skip ahead. Let it sink in for a while. The only program I used was QBasic. No phones, no internet, no Windows either! You booted up the computer, and you stared at a “C:\>” prompt! And you typed “qbasic” or whatever, and you were in this notepad-like environment where you wrote stuff line by line, and your program was interpreted.

This state of affairs lasted about a year, after which it was supplemented by a few rudimentary rasterized graphics games, and then the trio of “Lotus 1-2-3”, “WordStar” and “dBASE IV”. But more on that later …

Steve Yegge is always awesome.

I recently came across this excerpt:

Desperate or not, those people aren’t going to work for me. I demand excellence from my co-workers. The disease, nay, the virus of programming-language religion has a simple cure: you just write a compiler. Or an interpreter. One for any language other than the one you know best. It’s as easy as that. After you write a compiler (which, to be sure, is a nontrivial task, but if there’s some valid program out there that you couldn’t ever write, then you’re not justified in calling yourself a programmer), the disease simply vanishes. In fact, for weeks afterwards, you can’t look at your code without seeing right through it, with exactly the same sensation you get when you stare long enough at a random-dot stereogram: you see your code unfold into a beautiful parse tree, with scopes winding like vines through its branches, the leaves flowering into assembly language or bytecode.

When you write a compiler, you lose your innocence. It’s fun to be a shaman, knowing that typing the right conjuration will invoke the gods of the machine and produce what you hope is the right computation. Writing a compiler slays the deities, after which you can no longer work true magic. But what you lose in excitement, you gain in power: power over languages and over language-related tools. You’ll be able to master new languages rapidly and fearlessly. You may lose your blind faith, but you gain insight into the hauntingly beautiful machinery of your programs. For many, it deepens their real faith. Regardless, it lets them sit at the table with their peers as equals.

From here
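
If you’ve never done it, here is the flavor of the thing: a toy expression interpreter, in Python rather than anything serious, just to make “your code unfolds into a parse tree” concrete. (My sketch, not Yegge’s; the tuple encoding is an arbitrary choice.)

    import operator

    # A parse tree made literal: a leaf is a number, an interior node is
    # a tuple ("op", left, right), e.g. ("+", 1, ("*", 2, 3)).
    OPS = {"+": operator.add, "-": operator.sub,
           "*": operator.mul, "/": operator.truediv}

    def evaluate(expr):
        if isinstance(expr, (int, float)):
            return expr          # a leaf evaluates to itself
        op, left, right = expr   # an interior node: recurse on the branches
        return OPS[op](evaluate(left), evaluate(right))

    print(evaluate(("+", 1, ("*", 2, 3))))  # prints 7

The rest (parsing, scopes, code generation) is elaboration on that recursive walk.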

Arguments over syntax aside, there is something to be said for recognizing that a loop that steps only one variable is pretty useless, in any programming language.

It is almost always the case that one variable is used to generate successive values while another is used to accumulate a result. If the loop syntax steps only the generating variable, then the accumulating variable must be stepped “manually” by using assignment statements … or some other side effect.

The multiple-variable do loop reflects an essential symmetry between generation and accumulation, allowing iteration to be expressed without explicit side effects.

Steele and Gabriel, “The Evolution of Lisp”
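
The do loop they are describing steps several variables in parallel, so generation and accumulation sit side by side. A rough Python rendering of the contrast (mine, not the paper’s):

    # Stepping only the generating variable: the accumulator must be
    # updated "manually", by a side-effecting assignment in the body.
    total = 0
    for i in range(1, 11):
        total += i * i

    # Stepping generation and accumulation together, closer in spirit
    # to a multiple-variable do loop: no explicit side effects.
    from functools import reduce
    total = reduce(lambda acc, i: acc + i * i, range(1, 11), 0)

Both compute the sum of squares from 1 to 10; the difference is whether the accumulator is threaded through the loop machinery or mutated by hand.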

By studying a number of examples, we have come to the conclusion that programs in less expressive languages exhibit repeated occurrences of programming patterns, and that this pattern-oriented style is detrimental to the programming process.

Felleisen, “On the expressive power of programming languages”

But then I had a whimsical thought: the sort of thing that seems at once not-impossible and yet such a long shot that one can just relax and enjoy exploring it without feeling under pressure to produce a result in any particular timeframe (and yet, I have moved my thinking forward on this over the years, which keeps it interesting).

What if we could find a way to take advantage of the fact that our logic is embedded in a computational system, by somehow bleeding off the paradoxes into mere nontermination? So that they produce the anticlimax of functions that don’t terminate instead of the existential angst of inconsistent mathematical foundations?
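
To make that concrete with a toy illustration of my own (a sketch of the idea, nothing more): render Russell’s paradoxical set as a self-applying predicate, and the contradiction degrades into a computation that never finishes.

    # Russell's "set of all sets that don't contain themselves", as a
    # predicate applied to itself. russell(russell) would have to equal
    # its own negation; instead of a contradictory truth value, the
    # evaluation simply never terminates (CPython cuts it off with a
    # RecursionError, its finite stand-in for an infinite loop).
    def russell(f):
        return not f(f)

That anticlimax, a loop where the contradiction used to be, is exactly the trade I mean.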