No general procedure for bug checks will do.

Now, I won’t just assert that, I’ll prove it to you.

I will prove that although you might work till you drop,

you cannot tell if computation will stop.

You can never find general mechanical means

for predicting the acts of computing machines;

it’s something that cannot be done. So we users

must find our own bugs. Our computers are losers!

Scooping the Loop Snooper (Geoffrey K. Pullum)
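The poem’s claim is the undecidability of the halting problem, and the argument behind it is the standard diagonal one. Here is a minimal sketch of that argument in C; halts() is a hypothetical oracle (its body below is only a placeholder so the file compiles), troublemaker() is the diagonal construction, and the names are mine, not Pullum’s.

    #include <stdio.h>

    /* Hypothetical oracle: returns 1 if running `program` on `input`
       would eventually halt, 0 if it would run forever.  The body is a
       placeholder so the sketch compiles; the whole point is that no
       correct body can exist. */
    static int halts(const char *program, const char *input)
    {
        (void)program;
        (void)input;
        return 0;  /* placeholder guess */
    }

    /* Diagonal construction: hand a program its own text, then do the
       opposite of whatever the oracle predicted. */
    static void troublemaker(const char *program)
    {
        if (halts(program, program))
            for (;;) { }  /* oracle said "halts", so run forever */
        /* oracle said "runs forever", so halt immediately */
    }

    int main(void)
    {
        /* Ask about troublemaker applied to its own (stand-in) source.
           Whichever answer halts() gives, troublemaker() does the
           opposite, so a correct halts() cannot be written. */
        troublemaker("troublemaker.c");
        puts("halted, contradicting an oracle that answered \"runs forever\"");
        return 0;
    }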

It’s natural for programmers to view the executable binary generated from their programs through the prism of their source code. In that view, functions do not get jumped into sideways, nor are they called from locations other than their explicit call sites; variables retain their values unless assigned to by name or by reference; assembly instructions cannot spring into existence unless somehow implied by the code’s semantics; and so on.
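A small C example makes the gap concrete; this is a sketch under stated assumptions (a little-endian machine, no padding between the two members, and no optimization), and the out-of-bounds store is undefined behavior at the source level, which is exactly why the source-level view cannot account for it.

    #include <stdio.h>

    /* At the source level, `gate` is initialized to 0 and never assigned
       again, by name or through a pointer to it.  At the binary level it
       is just the bytes that happen to sit next to `buf`. */
    struct frame {
        char buf[8];
        int  gate;
    };

    int main(void)
    {
        struct frame f = { "", 0 };

        /* One element past the end of buf: illegal in the source-level
           view, but with the assumed layout it lands on the first byte
           of f.gate. */
        f.buf[8] = 1;

        /* Typically prints 1 when built without optimization on a
           little-endian machine; an optimizer may instead assume gate is
           still 0, which only underlines that the binary, not the
           source-level picture, decides what actually happens. */
        printf("gate = %d\n", f.gate);
        return 0;
    }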

(From HN) “What code is worth studying?”

My personal list (mostly imperative languages)

C++: (Complex software with elegance + performance)

  • Dart source code

  • V8 source code (Same people as Dart)

  • LevelDB

  • Chrome (the only downside: too much virtual dispatch -> “javism”)

C:

  • SQLite

  • Redis

  • Nginx

  • Solaris and FreeBSD

Java:

  • Rich Hickey’s implementation of the Clojure runtime in Java
    (it was there in 2009... maybe by now this is in Clojure itself?)

Go:

  • The Go standard libraries

I think the craziest idea I have heard in the last few years is that everyone should learn to code. That is the most bizarre and regressive idea. There are good reasons why we don’t want everyone to learn nuclear physics, medicine or how financial markets work. Our entire modern project has been about delegating power over us to skilled people who want to do the work and be rewarded accordingly. I’m all for making us aware of how various technological infrastructures work. But the idea everyone should learn how to code is as plausible as saying that everyone should learn how to plumb. To me it just makes no sense.

… They are ripe for ridiculing because they are ridiculous in many cases, and the only reason they are advancing is because they plug in the conceptual and theoretical holes in their theories with buzzwords that have no meaning – “openness” or “the sharing economy” – what on earth is the sharing economy?

What I’ve tried to do in my reviews is engage seriously with these bullshit concepts, as if they were serious – to see whether an idea such as “cognitive surplus”, of which Clay Shirky is very fond, has any meaning at all. I do close readings of things that aren’t meant to be read very closely. That is how our technology discourse works: there are lots of great bloggers, soundbites and memes, but once you start putting them together you realise that they don’t add up. And making people aware that they don’t add up is a useful public function.