Well, someone also has to not die from small-cell lung cancer to give the disease its 6 percent survival rate, but would you smoke four packs a day with the specific intention of being in that 6 percent? No, because that’s stupid. Well, tenure-track positions in my field have about 150 applicants each. Multiply that 0.6 percent chance of getting any given job by the 10 or so appropriate positions in the entire world, and you have about that same 6 percent chance of “success.” If you wouldn’t bet your life on such ludicrous odds, then why would you bet your livelihood?

An old article on the original Sims game from around 2000. Scroll down to where he talks about what attributes people chose for themselves when given the freedom to do so.

The most popular zodiac sign was Cancer, the most common family size was 2, the most common gender was male (by a small margin), and the most common occupation was “military”, followed closely by “medicine” and “politics”.

A long but relevant comment on language popularity

Original link

Complexity of implementation? Slowness? Lack of marketing? Unfamiliar syntax? No, none of these explain the lack of popularity.
I think you dismiss these factors too quickly. We need to remember that even seemingly little things can prevent adoption.

C and C++ have been running fast on cheap hardware longer than Lisp has. Lisp has caught up in performance mainly because of (relatively) better CPU architecture.

Marketing played a part in Java becoming popular. I can point to a few reasons:

  • The language designers modeled it after C++, a language many developers were familiar with.
  • Sun sold it as being more multi-platform than C++.
  • It was “internet ready”, unlike C++. This was a HUGE factor. No other language that people knew of made this claim. If developers used C++ on the web it was in combination with Perl, and only on the server.
  • They touted its garbage collection capabilities. No more having to free memory–something C and C++ developers were concerned about. It looked and kind of felt like C++, but without the hassles. That’s how it was billed.
  • It ran in browsers as an applet, and was securely sandboxed. This was something no other language could do (initially). They caught the “internet fever” at just the right time with this message. Lots of people deployed applets. Eventually Microsoft came out with its own solution, ActiveX, which, along with Flash, effectively killed Java applets. Why? Java applets were problematic. They didn’t run well across browsers (partly by design, thanks to Microsoft and IE), and Flash integrated multimedia more easily than Java did.

Nevertheless, people found that Java was more cross-platform than C or C++ on servers (though you had to be careful about your brand of JVM).
On the minus side, Java had its detractors regarding performance. I think this became less of a problem on the server: people “just threw hardware at it” until it ran fast enough.

I listened to the Java evangelists for years. So I know what their developers were excited about. The main thing for them was the slogan “Write once, run anywhere”. They loved it. What sold Java to the enterprise was J2EE, an enterprise framework specification, which Sun implemented for Java. They built upon their installed base over time.
Sun showed corporations how using Java allowed customers to choose their hardware based on capability and price more easily, since “you don’t have to rewrite the app.” That was a key phrase: keep your software investment no matter what hardware and operating system you choose. They differentiated this against the Microsoft ecosystem. Never mind that for many customers this didn’t always work out. There were JVMs that had incompatibilities with each other, and despite Sun’s claims that Java programs don’t crash the machine, I saw it do just that in its early days.
What mattered was the idea, never mind the reality: perfect portability, and internet-ready. Corporate customers liked that. Programmers liked that it was “a better C++”: it took a language they were already familiar with and made it “better”. That’s what customers were after, and they thought only Java could deliver it.

One might ask: if this is the case, why didn’t Objective-C take off? That’s a bit of a hard one for me. It was a favorite among NeXT developers, and ever since the early to mid-1990s it’s been implemented in GCC. My guess is there was no corporate support behind it by then. Objective-C was initially marketed by a single company in the mid-1980s. It was never recognized as a standard; you had to buy it from the company that created it. Eventually that company went out of business, I think, but the language did effectively become open source. Today it comes on the Apple Mac, so it has corporate sponsorship now. The Mac’s popularity is growing, so perhaps this will translate into success for Objective-C someday.

Corporate sponsorship of languages carries risks, but it can also encourage adoption. If the company that owns the language implodes, it takes the whole community down with it; I refer you to what happened to Smalltalk. VB was just as corporately owned, but Microsoft was simply more successful with it. I could tell you a story about the beginnings of BASIC’s popularity, but that would be getting off-topic.

I haven’t been keeping track of market share numbers, but if it’s the case that Java has overtaken VB, I can give a couple of plausible reasons for it, the gist being that Microsoft screwed VB developers. When MS initially came out with VB.Net, it was a major departure from VB 6, the previous version. For most people VB.Net could not translate or run older VB programs. It would try to translate them, but major parts of applications had to be rewritten, because MS took out support for some things the old VB API used to do and implemented things differently. They also made programming in VB more complicated, so there was a significant learning curve to get back up to speed.

C# initially got all of the advantages. Most of the early .Net documentation was written only for C#, not VB.Net. Most of the early .Net code demonstrations were done in C#, not VB.Net. C# had some cool features VB.Net didn’t get initially. If you were a VB developer you felt left out in the cold by MS. Some VB developers became C# developers, just to make the transition easier. There were endless debates on MS forums: “Should I learn VB.Net or C#?” There was uncertainty in the VB community about its future. Some migrated over to Java, figuring it was a skill they could invest in and not have to change like they did with VB.Net. So MS in effect “anti-marketed” VB.
You may remember there was a lot of buzz about XML some years ago, the main reason being that it was an “internet ready” meta-language for data. Are you noticing a theme here? “Internet ready” is key. If it’s not “designed for the internet”, people will turn their noses up at it. Note that this is a marketing message.
Re: XML
Microsoft actually did something slightly innovative: they invented the technology that enabled AJAX. They put the XMLHttpRequest() call into IE ten years ago, and added built-in support for XSLT as well. OWA (Outlook Web Access) was one of the first major AJAX apps, before Google Maps. As you may remember, anything MS pushed got adopted by a lot of people back then.
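For anyone who never touched that API, here is a minimal sketch of the kind of call that made AJAX possible. The “/data.xml” URL is a placeholder, and the ActiveX branch reflects how the earliest versions of IE exposed the object before the constructor became standard:

    // Minimal AJAX sketch. "/data.xml" is a placeholder URL.
    var xhr;
    if (window.XMLHttpRequest) {
      xhr = new XMLHttpRequest();                   // standard constructor
    } else {
      xhr = new ActiveXObject("Microsoft.XMLHTTP"); // early IE
    }

    xhr.open("GET", "/data.xml", true);             // true = asynchronous
    xhr.onreadystatechange = function () {
      // readyState 4 means done; status 200 means HTTP OK
      if (xhr.readyState === 4 && xhr.status === 200) {
        // The page can now update in place, with no full reload
        console.log(xhr.responseText);
      }
    };
    xhr.send();

The point is not the boilerplate but the capability: data could be fetched and the page updated in place, which is what made OWA and later Google Maps feel like desktop applications.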

Why did XML work? It was initially sold strictly as a data description format. I think, like Java, it kind of looked familiar to developers. Getting people to adopt HTML in the 1990s involved a learning curve. What was the “killer app” for that? The web. You couldn’t be on the web unless you dealt with HTML. That was enough motivation for people to learn it. Once they learned HTML, they only had to learn one new design construct to get XML, because the same tagging syntax was used (see the comparison below).
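To make the “same tagging syntax” point concrete, compare an HTML fragment with an XML fragment. The open-tag/close-tag structure is identical; only the vocabulary changes (the XML tag names below are invented for illustration):

    <!-- HTML: a fixed vocabulary of presentation tags -->
    <ul>
      <li>Coffee</li>
      <li>Tea</li>
    </ul>

    <!-- XML: the same open/close tagging, but with your own
         descriptive tag names (invented here for illustration) -->
    <beverages>
      <beverage>Coffee</beverage>
      <beverage>Tea</beverage>
    </beverages>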

Using XML to do actual programming didn’t come along until later. People looked for a way to end the tedious repetition of code, and some found a way to get XML to act as a meta-language. Used that way it stayed descriptive: everything was labeled, and it was “just text”, so it was “internet ready” as well. It didn’t require a lot of thought either. None of the XML tools I’ve heard of have a sense of recursion, which is hard for a lot of people to understand.
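XSLT, mentioned above in connection with IE, is one example of XML acting as a meta-language: the “program” is itself a labeled XML document, just text all the way down. A minimal sketch, with an invented <book>/<title> input vocabulary:

    <?xml version="1.0"?>
    <!-- An XSLT stylesheet is a program written in XML: every
         instruction is a labeled element. The <book>/<title>
         input vocabulary is invented for illustration. -->
    <xsl:stylesheet version="1.0"
                    xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
      <!-- For each <book> element in the input, emit its <title>
           as an HTML list item -->
      <xsl:template match="book">
        <li><xsl:value-of select="title"/></li>
      </xsl:template>
    </xsl:stylesheet>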

Why has Ruby become popular? Two answers: Rails, and DHH, who knows how to market a product. Before Rails came along, Ruby was as obscure as APL. Rails not only made Ruby “internet ready”, it added an extra dimension that web developers have been craving: instant gratification. The idea that you could design your web app from your database, basically data-driven development, was like a dream come true. The same way that the internet drove people to learn HTML (an unfamiliar syntax), Rails has driven some developers to learn Ruby.
In order for a technology to coax people to learn it, despite it being unfamiliar, it has to have a “killer app”, which is something that’s compelling (answers an immediate desire or need), and is the exclusive province of that particular technology, at least for a little while. We’ve seen this again and again.

Perception is a big part of whether a technology takes off as well. How has Lisp been “marketed”? For generations, it seems, Lisp has been billed as an AI language: if you’re interested in exploring how people think, or how a machine could think, this language is for you. How are people supposed to go from that to thinking, “Oh yeah, I could use that for my next web app”? In order for marketing to work you have to show people how the tool or system answers a need they have. It needs to be sold to them as something they can relate to. One of the problems with the Lisp community, so I hear, is that this is a real challenge for them. Making Lisp exciting and relevant seems to be as hard for them as trying to explain market economics or politics to a 4-year-old. The truth is it’s not too hard. As you point out, syntactically XML is like Lisp: an element like <add><x/><y/></add> has the same labeled-expression shape as (add x y). I tried pointing this out in a blog post once, and I got a bunch of negative responses: “I HATE XML!! Why would I like Lisp any better if it’s like what I hate?!” Oh well… 😦

Something RMS talked about: years ago, when Emacs was first being developed, someone tried introducing Lisp to non-programmers by making it part of Emacs, and not using the word “programming” at all. They merely said, “If you’d like to add convenient features to the editor, you need to describe your feature using these constructs and this syntax.” It was successful. The key was: “Teach programming while people are doing something else.”

Don’t teach programming as an end in itself. Teach it as a means to an end that they want.

Here’s a hint if anyone wants to take it up. There’s an open source project out there called “Vista Smalltalk” that works with WPF, WPF/E, and Flash. It’s a way to create RIAs (Rich Internet Applications). It runs both Smalltalk and Lisp code. If anyone’s interested in evangelizing Lisp in a way that people can relate to, this is an opportunity to do it.

PG: How far will this flattening of data structures go? I can think of possibilities that shock even me, with my conscientiously broadened mind. Will we get rid of arrays, for example? After all, they’re just a subset of hash tables where the keys are vectors of integers.

SV: Get ready for a shock: Javascript already does this. Even worse: the integer keys are first converted to strings!

From this commentary on the hundred-year language
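SV’s claim is easy to verify; the following runs in Node or any browser console:

    // JavaScript arrays are objects whose indices are property keys,
    // and property keys are strings.
    const arr = ["a", "b", "c"];

    console.log(Object.keys(arr));           // ["0", "1", "2"]
    console.log(typeof Object.keys(arr)[0]); // "string"

    // Indexing with the number 1 and the string "1" hits the same slot
    console.log(arr[1] === arr["1"]);        // true

Modern engines store integer-indexed elements efficiently under the hood, but the language semantics are exactly what SV describes.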

The good news about Erlang can be summed up as this: Erlang is the culmination of twenty-five years of correct design decisions in the language and platform. Whenever I’ve wondered about how something in Erlang works, I have never been disappointed in the answer. I almost always leave with the impression that the designers did the “right thing.”

I suppose this is in contrast to Java, which does the pedantic thing, Perl, which does the kludgy thing, Ruby, which has two independent implementations of the wrong thing, and C, which doesn’t do anything.

Evan Miller