By Matthew Adams, Co-Founder
Which programming languages should I learn?

We were having a discussion about languages, prompted by a piece of technology strategy work we are doing around Node.js, and I came away with the impression that there is a general sense that we are moving from a period in which the C-family of languages held hegemony (from the mid-nineties onwards) into a period of fragmentation and diversity.

First, I'm going to challenge that notion with a bit of history, and then see where that perspective leaves us in the "language wars" of today.

When I was a baby developer, every single engineer was reasonably proficient in one of the popular assembly language families - typically Motorola's 680x0 or Intel's x86, but it could be Z80, 6502, ARM or something more esoteric (IBM mainframes, anyone?).

Here's some nostalgic Z80 code. Ah, little-endian architectures. How I miss you.

; 99 Bottles of Beer program in Zilog Z80 assembly language.
;  Assembles for ZX Spectrum/Timex - change Spectrum: lines 
;  if porting.  Help from Marko!  Compiled and tested with SPIN 
; 
; Adapted from Alan deLespinasse's Intel 8086 version 
; 
; Author: Damien Guard ; damien@envytech.co.uk ; www.damieng.com

 org 32768

start:
 ld      a, 2                  ; Spectrum: channel 2 = "S" for screen
 call    $1601                 ; Spectrum: Select print channel using ROM

 ld c,99                       ; Number of bottles to start with


loopstart:
 call printc                   ; Print the number of bottles
 ld hl,line1                   ; Print the rest of the first line
 call printline

 call printc                   ; Print the number of bottles
 ld hl,line2_3                 ; Print rest of the 2nd and 3rd lines
 call printline

 dec c                         ; Take one bottle away
 call printc                   ; Print the number of bottles
 ld hl,line4                   ; Print the rest of the fourth line
 call printline

 ld a,c
 cp 0                          ; Out of beer bottles?
 jp nz,loopstart               ; If not, loop round again

 ret                           ; Return to BASIC


printc:                        ; Routine to print C register as ASCII decimal
 ld a,c
 call dtoa2d                   ; Split A register into D and E

 ld a,d                        ; Print first digit in D
 cp '0'                        ; Don't bother printing leading 0
 jr z,printc2
 rst 16                        ; Spectrum: Print the character in 'A'

printc2:
 ld a,e                        ; Print second digit in E
 rst 16                        ; Spectrum: Print the character in 'A'
 ret


printline:                     ; Routine to print out a line
 ld a,(hl)                     ; Get character to print
 cp '$'                        ; See if it's the '$' terminator
 jp z,printend                 ; We're done if it is
 rst 16                        ; Spectrum: Print the character in 'A'
 inc hl                        ; Move onto the next character
 jp printline                  ; Loop round

printend:
 ret


dtoa2d:                        ; Decimal to ASCII (2 digits only), in: A, out: DE
 ld d,'0'                      ; Starting from ASCII '0' 
 dec d                         ; Because we are inc'ing in the loop
 ld e,10                       ; Want base 10 please
 and a                         ; Clear carry flag

dtoa2dloop:
 inc d                         ; Increase the number of tens
 sub e                         ; Take away one unit of ten from A
 jr nc,dtoa2dloop              ; If A still hasn't gone negative, do another
 add a,e                       ; Decreased it too much, put it back
 add a,'0'                     ; Convert to ASCII
 ld e,a                        ; Stick remainder in E
 ret

; Data
line1:    defb ' bottles of beer on the wall,',13,'$'
line2_3:  defb ' bottles of beer,',13,'Take one down, pass it around,',13,'$'
line4:    defb ' bottles of beer on the wall.',13,13,'$'

Even then, C was starting to take a firm hold everywhere, while scientists were using FORTRAN, line-of-business devs were still knocking out COBOL (and various mainframe languages like APL and PL/I), and CompSci academics were using languages like ML, LISP and Haskell.

We lived in a world of profound language diversity, each language specialized to a particular use case. It is often perceived that people used "the right tool for the right job" - but I think the reality was somewhat different. As I said, everyone knew a bit of assembler. You had to if you wanted to be able to debug things at the lowest level on your platforms of choice. But LOB developers knew COBOL, not ML. Scientists knew FORTRAN, not LISP. Language diversity was really programmer diversity.

A few years later, and C/C++ are becoming dominant, along with the amazingly successful and long-lived Visual Basic (nearly as ubiquitous as Excel as a populist programming tool). Then along comes Java, and the rise of Perl and PHP, Python, C# (and VB.NET - a totally different language). More recently, JavaScript moves from being a poorly-understood SFX tool for web designers to a mainstream language; and ML gives birth to a whole family of languages including F#, Erlang and Scala.


So where are we today? Well, judging by the TIOBE index, the C-family languages - C, C++, Java, C# and Objective-C - between them account for more than half of the ratings, so most developers know at least one of them.

Many developers are also (we are told) learning JavaScript - driven by the demand for richly interactive web applications (and more fancy SFX in standard websites), and the rise in interest in Node.js. It is interesting to note, though, that on the TIOBE measure it has seen a year-on-year decline and has fallen out of the top 10 languages (usurped by a resurgent Perl, and by Ruby).


Given the importance of the concepts embodied in Node.js on the one hand, and the apparently insatiable industry demand for ever-more elaborate web pages on the other, why might this be so?

I think the answer is probably influenced by a risk/reward calculation. I mentioned this in the context of HTML5 a while ago: the tooling is poor, developer education is poor, the language is deceptive (it looks like C, but has much more in common with LISP¹ and ML), the debugging experience is extremely poor (even, perhaps especially, in the world of Node.js), and although there are many third-party libraries (just look at the 28,000-odd packages on NPM), they are riddled with incompatibilities, and even the base libraries supported across all implementations are barely fit for purpose.
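To make that "looks like C, but behaves like LISP and ML" point concrete, here is a minimal sketch - written in TypeScript for the sake of the type annotations, though plain JavaScript behaves identically. The braces and semicolons are pure C, but the functions are first-class values that close over their environment, which is far nearer to the lambda calculus than to anything in the C family.

// Looks like C: braces, semicolons, familiar operators.
// Behaves like LISP/ML: functions are values, and closures capture state.

// A function that returns a function - a closure over `count`.
function makeCounter(): () => number {
  let count = 0;
  return () => {
    count += 1;
    return count;
  };
}

// A higher-order function: behaviour passed around as data.
function twice<T>(f: (x: T) => T): (x: T) => T {
  return (x) => f(f(x));
}

const next = makeCounter();
console.log(next(), next(), next()); // 1 2 3

const addTwo = twice((n: number) => n + 1);
console.log(addTwo(40));             // 42

Anyone expecting C-style semantics from that syntax is in for a surprise - which is exactly the deception I mean.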

As a CTO, I certainly wouldn't bet the farm on that kind of technology at this stage if I didn't have to. Of course, it is really interesting to work with, and if people don't work with it, it will never improve, but it is (clearly) not yet ready for mainstream adoption - Mort and Elvis are not in the building. If it is really the way forward, then how is this technology going to evolve to meet the constraints of broad adoption?


One way in which this evolution is happening is signposted by Node.js: the use of the JavaScript engine as (part of) a platform. A case in point is Microsoft's own support for (perhaps I'd go so far as to say partial adoption of) Node.js with IIS/Azure as first-class hosts. Thinking of the JavaScript engine as a platform frees us from JavaScript as a language, and we can start to look at CoffeeScript, TypeScript and others as a partial solution to the language complexity issue. But are such marginal languages (even such esoterica as F# and Erlang top them in the adoption stakes) really going to save JavaScript in the long run?
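As a rough illustration of the "engine as a platform" idea, consider a hypothetical TypeScript fragment like the one below: the interface and the type annotations exist only for the developer and the tooling, and by the time the code reaches the JavaScript engine they have been erased entirely - the engine is just a compile target.

// TypeScript source: types and interfaces for the developer and the tooling...
interface Bottle {
  count: number;
  container: string;
}

function describe(b: Bottle): string {
  return `${b.count} ${b.container}s of beer on the wall`;
}

console.log(describe({ count: 99, container: "bottle" }));

// ...but after compilation the engine only ever sees plain JavaScript,
// roughly:
//
//   function describe(b) {
//       return b.count + " " + b.container + "s of beer on the wall";
//   }
//   console.log(describe({ count: 99, container: "bottle" }));

CoffeeScript and the other languages mentioned above take the same approach: treat the JavaScript engine as the output target, and put the developer experience somewhere else.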

I'll push the boat out and say 'no'. In 10 years' time, the principle behind Node will still be with us, but the .JS bit will be, to all intents and purposes, gone. We may well still call that part of the browser runtime JavaScript, and there will be plenty of classic JavaScript out in the wild, but the languages and tooling it supports must evolve beyond recognition for it to become the top-5 language its proponents seem to think it already is.

Matthew Adams on Twitter


  1. Eric Lippert said on his blog in 2003: "Those of you who are familiar with more traditional functional languages, such as Lisp or Scheme, will recognize that functions in JScript are fundamentally the Lambda Calculus in fancy dress. (The august Waldemar Horwat -- who was at one time the lead Javascript developer at AOL-Time-Warner-Netscape -- once told me that he considered Javascript to be just another syntax for Common Lisp. I'm pretty sure he was being serious; Waldemar's a hard core language guy and a heck of a square dancer to boot.)"

Matthew Adams

Co-Founder

Matthew was CTO of a venture-backed technology start-up in the UK & US for 10 years, and is now the co-founder of endjin, which provides technology strategy, experience and development services to its customers who are seeking to take advantage of Microsoft Azure and the Cloud.