EW@60: A Tower of Babel

Alun Williams tries to make sense of the Babel of programming languages that have emerged in the past 60 years and more.

What is the collective noun for programming languages? A babel, surely. Collectively, it’s a ‘Tower of Babel’ built over more than 60 years.

The code that got the manned Apollo 11 mission to the Moon in 1969 is arguably the most important program ever written

In the beginning – we’re actually in the late 1940s, early 1950s – scientists had created the computer and saw that it was good. But it was bad to program. We’re talking binary numbers and front‑panel switches and punched cards, the stuff of legend. You had the massive, mysterious hardware in front of you and only a slice of time to control its details directly, manage the memory and pray for your program. The machine was the master, but this was to change.


Assemble

Enter assembly language, and a shift in the balance of power, with the evolution of so-called Second Generation languages. You were still intimately dependent on the underlying hardware, but more symbolic codes were used – mnemonics that stood in for raw machine instructions, gathering low-level operations into more readable, ‘higher-level’ steps.


Before being run on the computer, this almost-readable code had to be assembled down into the ‘object’ code that would actually control the hardware – hence the name. Each assembly language, however, was still specific to the underlying processor architecture and instruction set. It still required detailed knowledge of the central processing unit, was difficult to write and almost as difficult to debug.
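To make the idea concrete, here is a minimal sketch of what an assembler does, written in Python. The three-instruction machine, its mnemonics and its opcodes are invented purely for illustration; a real assembler must also handle labels, addressing modes and relocation.

```python
# Toy assembler: translate mnemonics into the bytes of an 'object' program.
# This instruction set is entirely made up for the sake of the example.
OPCODES = {
    "LDA": 0x01,  # load the accumulator with a value
    "ADD": 0x02,  # add a value to the accumulator
    "STA": 0x03,  # store the accumulator at an address
}

def assemble(source: str) -> bytes:
    """Turn lines like 'LDA 7' into opcode/operand byte pairs."""
    object_code = bytearray()
    for line in source.strip().splitlines():
        mnemonic, operand = line.split()
        object_code.append(OPCODES[mnemonic])
        object_code.append(int(operand))
    return bytes(object_code)

program = """
LDA 7
ADD 5
STA 16
"""
print(assemble(program).hex(" "))  # -> 01 07 02 05 03 10
```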

There were no prayers but a lot of hard work.

Symbolism

Over time, languages developed and became even more symbolic – with Lisp, Fortran and Cobol leading the way – abstracting the programmer further from the details of their hardware.

The best known of these languages are commonly called Third Generation languages: Basic, arriving in the mid 1960s in the early days of Electronics Weekly, then Pascal and C in the early 1970s. Power had shifted to the programmer. The computer was now the servant as code became more machine-independent. The programmer was becoming the all-important creator and sculptor of computing intent.

So it was – until some time around the mid 1970s when languages began to proliferate, some attempting to be a ‘one size fits all’ solution, others addressing niches of particular domains.

Here comes Prolog, a logic language for knowledge-based systems, for example. And there’s Occam, a language for programming parallel processors (think the INMOS Transputer), and Ada, aimed at military and safety-critical applications, and MATLAB (matrix manipulation, plotting of functions and data), and yet more esoteric developments, such as Z, a specification language whose schemas could be formally verified for correctness, and so on. The Tower of Babel is already dizzyingly high.

And once upon a time, in the mere mid-rungs of the Tower, something called Object-Oriented programming used to be the future, with the likes of Smalltalk and even C++, which incremented C. Whereas C could get down and dirty with low-level efficiency, C++ put on a shirt and tie. It developed and grew to encapsulate classes, inheritance, delegation and other strange object-based derivations.
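As a minimal sketch of those ideas – rendered here in Python rather than Smalltalk or C++, with class names invented for illustration – encapsulation bundles data together with the behaviour that acts on it, and inheritance lets one class specialise another:

```python
# Encapsulation: data (make, _speed) and behaviour (accelerate) live together.
class Vehicle:
    def __init__(self, make: str):
        self.make = make
        self._speed = 0  # 'private' by convention: internal state stays hidden

    def accelerate(self, delta: int) -> None:
        self._speed += delta

# Inheritance: a Car is-a Vehicle, reusing its state and methods...
class Car(Vehicle):
    # ...while overriding behaviour where it needs to differ.
    def accelerate(self, delta: int) -> None:
        super().accelerate(min(delta, 30))  # a speed-limited specialisation

car = Car("Austin")
car.accelerate(50)
print(car.make, car._speed)  # -> Austin 30
```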

4GL

Figure 1: The very model of a modern major IDE – layout inspection with Google’s Android Studio

By the early 1990s, however, the so-called Fourth Generation languages (4GLs) had appeared on the scene, built around even higher levels of abstraction, often with an underlying database to store the data.

I remember names such as Brock, Bam and others, none of which could truly be considered successful. You may remember PowerBuilder, or Clarion, perhaps. (Wikipedia classifies NI’s LabVIEW as a 4GL, surprisingly.)

Instead, with Microsoft Windows now established on every desktop computer, languages such as Borland’s Delphi came along, incorporating libraries that helped developers deliver the graphical interfaces to which users were accustomed.

More visual code editors were now the thing, along with debugging support for more rapid, prototype-driven development.

Figure 2: Side-by-side analysis of thread activity in Android Studio

Indeed, C# followed later in this period, Microsoft’s attempt to bend the now-dominant C++ to the True Windows Way.

And so, as more and more was expected of the professional programmer, attention began to shift from the code itself to the context of its creation (its design, testing, profiling, emulation and so on). Enter the all-singing, all-dancing integrated development environment (IDE). Syntax highlighting, auto-completion and code-folding, auto-indenting editors became mainstream.

Web

But by the late 1990s another, even more disruptive technology came along to change the landscape: the World Wide Web. Enter, from this time, a new floor in the tower, with a plethora of new languages and associated frameworks – Perl, Visual Basic .NET, Ruby and so on – as well as the omnipresent Java (we won’t talk about JavaScript).

Figure 3: AGC assembler powered the Apollo 11 mission to the Moon in 1969

One language was to emerge pre‑eminent: Python – not a version of C++, but a friendlier, higher-level alternative to it. With its dynamic (yet strong) typing, garbage collection and support for both object-oriented and procedural programming, it’s quite the entry. Something to be thankful for each time you program a Raspberry Pi, or run MicroPython on a microcontroller.
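A few lines are enough to show the features just listed – dynamic typing, automatic garbage collection and a free mix of procedural and object-oriented styles. The sensor example is invented for illustration:

```python
# Dynamic typing: a name can be rebound to values of different types.
reading = 42         # an int...
reading = "42 mV"    # ...now a str, with no declarations required

# Procedural style: a plain function.
def average(values):
    return sum(values) / len(values)

# Object-oriented style: a class in the very same program.
class Sensor:
    def __init__(self, name):
        self.name = name
        self.samples = []  # garbage-collected once no longer referenced

    def record(self, value):
        self.samples.append(value)

s = Sensor("thermocouple")
for v in (20.5, 21.0, 21.5):
    s.record(v)
print(s.name, average(s.samples))  # -> thermocouple 21.0
```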

But time doesn’t stop, and programming languages didn’t cease to be created, offering new solutions to new problems as well as old ones. Would you now like to meet Dart, for example, and try its extension methods, or the parcelising shortcuts of Kotlin, perhaps? And so it goes, bringing us ever closer to the present day. With SystemML and TensorFlow we are in the realm of machine-learning libraries and frameworks, their inner workings removed even further from the programmer and accessed via the familiar interface of Python.

But we should pause, to note a change in atmosphere at this height in the tower. Is power now shifting beyond the programmer, somehow back to the machine, at a higher level?

Figure 4: Some example Java code from our Build Your Own App series

Programmers still instruct and direct, they believe, but they can no longer know the answer in advance. The machine will sometimes learn it for them. The code is still machine-independent, but the programmer is becoming the servant of the computer again.
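The contrast is easy to sketch. In classical code the programmer states the rule; in a machine-learning framework the programmer supplies examples and the machine infers the rule. A minimal, hypothetical illustration in Python, assuming TensorFlow’s Keras API is available (the Celsius-to-Fahrenheit data is just a toy):

```python
import numpy as np
import tensorflow as tf

# Classical programming: the programmer writes the rule down explicitly.
def fahrenheit(celsius):
    return celsius * 9 / 5 + 32

# Machine learning: only examples are given; the rule is inferred.
c = np.array([[-40.0], [0.0], [20.0], [37.0], [100.0]])
f = np.array([[-40.0], [32.0], [68.0], [98.6], [212.0]])

# A single linear neuron is enough to learn y = 1.8x + 32.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(1,)),
    tf.keras.layers.Dense(units=1),
])
model.compile(optimizer=tf.keras.optimizers.Adam(0.1), loss="mean_squared_error")
model.fit(c, f, epochs=500, verbose=0)

print(fahrenheit(25.0))                              # 77.0, by rule
print(model.predict(np.array([[25.0]]), verbose=0))  # roughly 77, by learned weights
```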

Machine learning

Arguably, artificial intelligence (AI)- or machine learning (ML)-based languages could be classed as the Fifth Generation (even though the term 5GL had already been allocated to earlier logic-based attempts, which replaced algorithms with declarations of the problem domain).

Is AI really the same? Or should these new AI inflections be classed as the Sixth Generation? It’s hard to tell when you are so high up. And we’ve now reached the misty clouds of the present day, where it’s hard to see further and things are confused.

This history has yet to be written. Of one thing you can be sure, however: ever more entries will be added to this Tower of Babel, each seeking new solutions to the problems of the day. Always seeking and never quite finding. After all, no one language can ever be all things to all processors, let alone all problem domains.

 

