- On “the programming language conundrum”
- The old-ish (as of 1984) documentation on Xanadu
- Note: Really trippy! Nomenclature includes words like “poomfilade” etc 🙂
- Some thoughts on the “death of Prolog”
- A fascinating look at Eta, which bills itself as “Haskell for the JVM”
- A peek into the Sony PlayStation 5
- A course in systems using Rust and WebAssembly
- A great “war story” involving subtle tweaks to the Go runtime
- With the caveat that … this has been seen before: “Unix considered harmful”
Original source here
There is another school which has not been very well represented in the literature over the years, but which has undoubtedly produced a greater positive impact on the economy. Call it the “Fortran school of programming,” which I think is well summarized by Dr. Adam Rosenberg, the self-described last buffalo of industrial mathematics. Rather than viewing mathematics as an advanced tool reserved for extremely specialized computer applications, Fortran-school programmers view the computer as an advanced tool for doing mathematics. Historically, Fortran-school programmers have tended to work in industry or in the more technically inclined parts of the government (NASA, Defense, etc.). They are often mocked by Lisp programmers for their ignorance of recursion. (For a satirical essay, see Real Programmers Don’t Use PASCAL; if you think this satire is unfair, see Dr. Rosenberg’s Style Guide.)
The mockery, I think, ought to run with at least as much force in the opposite direction. In contrast to the Fortran tradition (which proudly “sent a man to the moon” and implemented critical infrastructure in banking, communications, and so on), the culture of Lisp is almost willfully ignorant of mathematics. This ignorance is disguised by all the talk of formalism and the instinctive genuflection before the Lambda Calculus, which, not unlike the Summa Theologica, is a closed computational universe that sheds little light on the observed world.
The trouble with the Lisp-hacker tradition is that it is overly focused on the problem of programming — compilers, abstraction, editors, and so forth — rather than the problems outside the programmer’s cubicle. I conjecture that the Lisp-school essayists — Raymond, Graham, and Yegge — have not “needed mathematics” because they spend their time worrying about how to make code more abstract. This kind of thinking may lead to compact, powerful code bases, but in the language of economics, there is an opportunity cost. If you’re thinking about your code, you’re not thinking about the world outside, and the equations that might best describe it.
Although the early years in the twenty-first century seem to be favoring the Lisp-school philosophy, I predict the balance of the century will belong to the Fortran-school programmers who are able to successfully apply mathematics to practical problems. It is tempting to declare that most programming problems “don’t need math”, but this is only true in the same sense that manufacturing, or supply-chain management, or baseball, “doesn’t need math”: advanced mathematics seems completely unnecessary to existing practitioners, but only until someone figures out that a particular mathematical concept is the right way to think about the problem at hand. After that, it is vital.
There are two reasons I am optimistic about the future of mathematics in computer programming. The first has to do with the growth in the amount of data generated by web companies (“Big Data”). With more types of data at hand, there are more equations that might be applied with utility. There is a lot of interest in advanced machine-learning techniques for this reason, but even simple statistical techniques might prove to have at least as many applications. Mathematics applied to business data will yield better business insights, more efficient operations, better products (e.g. recommendations), and new products (e.g. prediction services).
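To make the “simple statistical techniques” point concrete, here is a minimal sketch of a closed-form least-squares fit on invented business data — the numbers and variable names are hypothetical, and a real pipeline would reach for a statistics library rather than hand-rolling the formula:

```python
# A minimal sketch: ordinary least squares on hypothetical business data,
# using only built-ins. Data and names are invented for illustration.

def least_squares(xs, ys):
    """Fit y = a + b*x by the closed-form ordinary-least-squares solution."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    b = cov / var            # slope
    a = mean_y - b * mean_x  # intercept
    return a, b

# Hypothetical weekly ad spend (thousands of dollars) vs. sign-ups.
ad_spend = [1.0, 2.0, 3.0, 4.0, 5.0]
sign_ups = [12, 19, 31, 42, 48]

intercept, slope = least_squares(ad_spend, sign_ups)
print(f"each extra $1k of spend is worth roughly {slope:.1f} sign-ups")
```

Nothing here is beyond a first statistics course, which is exactly the point: the insight comes from knowing the equation exists, not from the code.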
The second reason I am optimistic about the place of mathematics in computer programming is related: the average consumer has more data than ever before, and mathematics can help to make sense of it, or at least make it more beautiful. Application areas that were traditionally considered to be “scientific computing” (for example, Geographic Information Systems or image-processing) are now of interest to regular people who own (say) a collection of geotagged digital photographs. Instagram, for example, was built on a few equations that operated on an image’s color channels. An understanding of mathematics can help the programmer solve practical problems for users as well as provide a more pleasing experience. (To this end, you might enjoy my previous essay, Winkel Tripel Warping Trouble.)
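Instagram’s actual filter math is proprietary, but the general idea — a filter as per-channel arithmetic — can be sketched in a few lines. The sepia coefficients below are the commonly cited ones, used here purely as an illustration:

```python
# A minimal sketch of an image "filter" as per-channel arithmetic.
# The sepia coefficients are the commonly cited ones; any real app's
# filters (Instagram's included) are their own and likely more involved.

def sepia(pixel):
    """Apply a sepia tone to one (r, g, b) pixel, 0-255 per channel."""
    r, g, b = pixel
    new_r = min(255, int(0.393 * r + 0.769 * g + 0.189 * b))
    new_g = min(255, int(0.349 * r + 0.686 * g + 0.168 * b))
    new_b = min(255, int(0.272 * r + 0.534 * g + 0.131 * b))
    return (new_r, new_g, new_b)

# "Filtering" an image is just mapping the transform over its pixels.
image = [(120, 100, 80), (200, 180, 160), (30, 30, 30)]
filtered = [sepia(p) for p in image]
```

A linear transform of the color channels is ordinary matrix arithmetic — one more case where a little mathematics, applied at the right moment, becomes a product.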
Mathematics, in the end, does not help you understand computer programming. It is not about finding metaphors, or understanding “fundamentals” that will never be applied. Rather, mathematics is a tool for understanding phenomena in the world: the motion of the planets, the patterns in data, the perception of color, or any of a myriad things in the world that might be understood better by manipulating equations. It is the hacker’s job to figure out how to encode the insight into a piece of code that will be used over and over.
Should we return to the good old days when men programmed Fortran and everything was an array? Hardly. What we need is an infusion of applied mathematics into hacker education. Hackers raised on Lisp-school essays have grown up with only one parent. (The other parent has apparently been too busy at work.) We need examples, tutorials, and war stories wherein non-trivial mathematics are applied with success in computer programs. Although braggadocio doesn’t come naturally to most computer programmers, we need hackers to toot the horn of triumph whenever a new and interesting application of mathematics is found. We need to celebrate the spirit of scientific curiosity.
Lastly, we need the next generation of aspiring hackers to incorporate mathematics into their program of self-study. We need college students to take classes in physics, engineering, linear algebra, statistics, calculus, and numerical computing, and we need them to educate their elders who grew up ignorant of these things. With the relentless proliferation of data, and the impending extinction of the Fortran-slinging old guard, there are vast opportunities for budding mathematical hackers to make a difference in the world simply by thinking about it in a more rigorous way.
From a comment on Slashdot:
I switched jobs from being a computer programmer to being an ESL teacher in Japan. Japan is somewhat famous for churning out students who know a lot about English, but can’t order a drink at McDonald’s. We used to have a name for those kinds of people with regard to programming languages: language lawyers. They can answer any question you put to them about a programming language, but couldn’t program to save their life. These people often make it past job interviews easily, but then turn out to be huge disappointments when they actually get down to work. I’ve read a lot about this problem, but the more I look at it, the more I realise that these disabled programmers are just like my students. They have a vocabulary of 5000 words, know every grammar rule in the book but just can’t speak.
My current theory is that programming is quite literally writing. The vast majority of programming is not conceptually difficult (contrary to what a lot of people would have you believe). We only make it difficult because we suck at writing. The vast majority of programmers aren’t fluent, and don’t even have a desire to be fluent. They don’t read other people’s code. They don’t recognise or use idioms. They don’t think in the programming language. Most code sucks because we have the fluency equivalent of 3 year olds trying to write a novel. And so our programs are needlessly complex.
Those programmers with a “spark” are programmers who have an innate talent for the language. Or they are people who have read and read and read code. Or both. We teach programming wrong. We teach it the way Japanese teachers have been teaching English. We teach about programming and expect that students will spontaneously learn to write from this collection of facts.
In language acquisition there is a hypothesis called the “Input Hypothesis”. It states that all language acquisition comes from “comprehensible input”. That is, if you hear or read language that you can understand based on what you already know and from context, you will acquire it. Explanation does not help you acquire language. I believe the same is true of programming. We should be immersing students in good code. We should be burying them in idiom after idiom after idiom, allowing them to acquire the ability to program without explanation.
This release of Is Parallel Programming Hard, And, If So, What Can You Do About It? features filled-out data-structures, debugging, and advanced-synchronization chapters; the addition of hazard pointers to the deferred-processing chapter; two new cartoons; and random updates to a number of other chapters. It also includes contributions from Angela Demke Brown, Darren Hart, Ralf Moeller, Charles-François Natali, Borislav Petkov, Anatol Pomozov, Alexey Roytman, and Frederic Weisbecker.
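For readers who haven’t met the hazard pointers mentioned above: the core protocol is that a reader publishes the pointer it is about to use, and a writer defers reclaiming any node covered by a published hazard. Here is a deliberately simplified, single-threaded sketch of that idea — real implementations (as the book covers) need atomic operations and memory barriers, and all the names below are invented:

```python
# A simplified, single-threaded sketch of the hazard-pointer protocol.
# Real implementations require atomics and memory ordering; this only
# shows the bookkeeping: publish, retire, scan, reclaim.

hazards = set()   # pointers currently published by readers
retired = []      # nodes unlinked by a writer, awaiting reclamation
freed = []        # stand-in for actual deallocation

def acquire(node):
    """Reader announces that it may dereference `node`."""
    hazards.add(id(node))

def release(node):
    """Reader is done with `node`."""
    hazards.discard(id(node))

def retire(node):
    """Writer has unlinked `node`; reclaim it only when no hazard covers it."""
    retired.append(node)
    scan()

def scan():
    """Reclaim every retired node not covered by a hazard pointer."""
    global retired
    remaining = []
    for n in retired:
        if id(n) in hazards:
            remaining.append(n)   # still in use; keep deferring
        else:
            freed.append(n)       # safe to reclaim
    retired = remaining

node = object()
acquire(node)   # reader is using the node
retire(node)    # writer unlinks it; reclamation is deferred
release(node)   # reader finishes
scan()          # now the node can finally be reclaimed
```

The punchline is that no reader ever dereferences freed memory, at the cost of an extra publish step on the read path.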
Our industry, the global programming community, is fashion-driven to a degree that would embarrass haute couture designers from New York to Paris. We’re slaves to fashion. Fashion dictates the programming languages people study in school, the languages employers hire for, the languages that get to be in books on shelves. A naive outsider might wonder if the quality of a language matters a little, just a teeny bit at least, but in the real world fashion trumps all.