I think one of the main consequences of the inventions of personal computing and the world wide Internet is that everyone gets to be a potential participant, and this means that we now have the entire bell curve of humanity trying to be part of the action. This will dilute good design (almost stamp it out) until mass education can help most people get more savvy about what the new medium is all about. (This is not a fast process). What we have now is not dissimilar to the pop music scene vs the developed music culture (the former has almost driven out the latter – and this is only possible where there is too little knowledge and taste to prevent it). Not a pretty sight.

– Alan Kay

If we know enough about the problem to prove its specification and solution correct, there is no longer any reason to work on it, and the solution to the problem should simply be published and taught. Actum ne agas: do not do what is already done. Move on to the next problem.

Today’s programmers, whose ‘hello world’ programs written in Java require the memory of millions of early-80s Sears Department Stores’ Electronics sections full of VCSs, have heard stories of the amazing programming feats in the days of old. The Atari 2600 (code name: Stella) featured a whopping 128 bytes of RAM. Not 128M. Not 128K. 128 bytes. You can’t even fit a whole Twitter Tweet in there.
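
For concreteness, a back-of-the-envelope check of that claim in Common Lisp, assuming the classic 140-character tweet limit and one byte per character (both assumptions for illustration, not from the quote):

    ;; Sketch: would a maximum-length tweet fit in the Atari 2600's RAM?
    ;; Assumes the classic 140-character limit and one byte per character.
    (let ((stella-ram-bytes 128)
          (tweet-max-chars 140))
      (format t "Spare room after one tweet: ~d bytes~%"
              (- stella-ram-bytes tweet-max-chars)))
    ;; prints: Spare room after one tweet: -12 bytes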

Lisp is like a religion. It was created by superstitious primitives long before I was born, it asks people to have faith that all will be revealed in time, and it is still waiting for the prophet who will lead everyone to paradise.

… the curse of macros: You cannot in general expect to understand fully what some code would compile to without being the compiler. In inferior languages, the code you write is probably the code the machine will run. In Common Lisp with lots of macros, the code you write is only data for lots of other programs before it becomes code for the compiler to arrange for the machine to run. It takes some getting used to.
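
A minimal Common Lisp sketch of what the quote describes, using the textbook WHILE macro (not part of standard Common Lisp; assumed here purely for illustration): the loop we write is only data until macroexpansion rewrites it into the code the compiler actually sees.

    ;; WHILE does not exist in standard Common Lisp, so we define it by
    ;; rewriting our "code" (really just a list) into a DO form.
    (defmacro while (test &body body)
      `(do ()
           ((not ,test))
         ,@body))

    ;; Ask the system what our source actually compiles as:
    (macroexpand-1 '(while (< i 10) (incf i)))
    ;; => (DO () ((NOT (< I 10))) (INCF I))

MACROEXPAND-1 performs a single rewriting step, which is exactly the sense in which the code you write is first data for other programs before it ever reaches the compiler.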

I think anthropomorphism is the worst of all. I have now seen programs “trying to do things”, “wanting to do things”, “believing things to be true”, “knowing things” etc. Don’t be so naive as to believe that this use of language is harmless. It invites the programmer to identify himself with the execution of the program and almost forces upon him the use of operational semantics.

– Edsger W. Dijkstra, EWD #854

The computer “user” isn’t a real person of flesh and blood, with passions and brains. No, he is a mythical figure, and not a very pleasant one either. A kind of mongrel with money but without taste, an ugly caricature that is very uninspiring to work for. He is, as a matter of fact, such an uninspiring idiot that his stupidity alone is a sufficient explanation for the ugliness of most computer systems. And oh! Is he uneducated! That is perhaps his most depressing characteristic. He is equally education-resistant as another equally mythical bore, “the average programmer”, whose solid stupidity is the greatest barrier to progress in programming. It is a sad thought that large sections of computing science are effectively paralyzed by the narrow-mindedness and other grotesque limitations with which a poor literature has endowed these influential mythical figures. (Computing science is not unique in inventing such paralyzing caricatures: universities all over the world are threatened by the invention of “the average student”, scientific publishing is severely hampered by the invention of “the innocent reader” and even “the poor reader”!)

– Edsger W. Dijkstra, EWD #618