Learn the Old Languages
20 September 2024 · 15 minutes · History, Languages, Learning

New languages are hip, old languages are erudite. Don't neglect these languages as you round out your skills.
There's a lot of interest in programming languages these days. I think there always was, but it's become easier over the years to make your own language, and it seems like everyone has. Some of these languages address real-world (if often niche) problems; others are more frivolous. Most probably don't have a great reason for existing other than as a fun project for those involved. Which ones are worth learning? Well, probably not a lot of them. Most are interesting projects for small groups of people but impractical outside academic interest. A lot of them feel very samey.
Some of them, to be sure, are worth learning. Even so, few of the new languages in this category have much staying power for me. Rust is probably the most useful new language to learn, not just because a fair amount is built with it now but because it requires you to express your ideas in a different way. Rust is maturing well and has a solid community, but it might be one of only two or three from this crop of languages that sticks.
I think that learning and using a variety of languages is an important practice. Knowledge for its own sake is good, but in this case it offers the very tangible benefit of improving your thinking and illuminating new ways to consider problems. Knowing multiple (useful) languages makes you more valuable to more companies too - given two otherwise similar candidates, if I'm hiring for a firm with legacy code in an esoteric language, I'm inclined to hire the candidate with the longer list of languages under their belt, even if the particular language in question isn't on it - it shows a deeper engagement with programming.
So if you haven't yet, I'd encourage looking back at some of the older languages. Sure, you might have opened the Wikipedia article for Smalltalk and gawked at its syntax a couple of times, but have you built any OO-suited systems with it? Maybe you had to submit a few Discrete Structures assignments in Prolog in college, but did you ever try to implement real-world algorithms with it? These older, half-forgotten languages offer the same benefit of expanding your programming skills while having the distinct advantage of practicality. To be clear, Delphi is nowhere near as practical as C#, but there's certainly still legacy code written in it, and as it happens those languages were created by the same guy - just maybe there are some insights to be gained there.
Here's a small set of languages I'd encourage you to learn. Not just to gloss over the syntax, but to sit down and build with. Personal side projects, of course - though there are still some proper professional uses for these, for most folks it's probably best to leave them for fun. The skills you gain by going deep with these languages translate in real, practical, and sometimes surprising ways into the languages we typically use professionally.
Smalltalk #
Smalltalk is a neat language, though if you've spoken with anyone who has actually worked in Smalltalk you might get the sense that these folks are just about inclined to split the calendar into "Before Smalltalk" and "After Smalltalk". So many of them love this language. It's a language firmly situated in its own time, so you might not gain quite that affinity for it today, but it nonetheless has a surprising ability to win over its programmers.
Smalltalk was, as you might well know, the language that defined object-oriented programming (Simula technically got to objects first, but Smalltalk made everything an object and gave the paradigm its name). Everything in the language is an object, and you pass messages instead of making function calls. Since Smalltalk established the paradigm, that's technically how we're supposed to think about the Java or C# world too, but in those languages it ends up being a distinction without a difference - particularly in languages which blend functional and object-oriented styles, like Scala or C#. Not so in Smalltalk: everything reinforces and builds on the paradigm.
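To make the distinction concrete, here's a very rough sketch of message sending - in Python, which I'll use for every sketch in this post to keep to one language. The `Account` object and its messages are invented for illustration, and real Smalltalk syntax is far terser, but the shape of the idea is this: you never call a function on an object, you send it a message and let it alone decide how to respond, including to messages it doesn't recognize.

```python
# A rough sketch of message passing; Account and its messages are invented.
class Account:
    def __init__(self):
        self._total = 0

    def receive(self, message, *args):
        # Dispatch on the message itself; the object alone decides its response.
        handler = getattr(self, "_" + message, None)
        if handler is None:
            return self._does_not_understand(message, *args)
        return handler(*args)

    def _deposit(self, amount):
        self._total += amount
        return self._total

    def _balance(self):
        return self._total

    def _does_not_understand(self, message, *args):
        # Smalltalk reifies exactly this as the doesNotUnderstand: hook.
        raise AttributeError(f"Account does not understand {message!r}")

acct = Account()
acct.receive("deposit", 100)
print(acct.receive("balance"))  # -> 100
```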
A number of those folks with an affinity for Smalltalk insist that to this day it is still the only object-oriented language. Whereas Smalltalk has an extremely light syntax laser-focused on its paradigm, the other languages have more robust - maybe "bloated" - syntaxes which, if not outright encouraging you to program against the paradigm, at least make it more difficult to stay focused. If you spend a reasonable amount of time with the language, you might run into the same feeling I did. It takes a little effort to get to the point where you "click" with the style of program Smalltalk encourages, and when I got there I remember thinking "golly, this is so different from object-oriented programming!" The kicker, of course, is that this paradigm is (arguably) the One True object-oriented one, and the divergences I make in other languages are maybe something a bit different.
So I do think it will give you a great foundation in OOP by teaching you how to craft a "pure" OO implementation, and that goes a long way toward showing what the fundamental goals of the paradigm are. Sure, we all had to read some paragraph about the paradigms in college, but that doesn't mean a whole lot next to finger-on-keyboard experience.
The other envy you'll come away with is for the live programming environment - you can change programs as they're running and see the effects in real time. If that's news to you, you might be surprised that it's true of an object-oriented language - isn't that supposed to be the thing functional languages are good at? This is another brick in the "Smalltalk is the only object-oriented language" wall - its message-sending focus decouples the components of the code enough to allow it. It's a great programming experience, though seeing as Smalltalk is hardly used anymore, I'll confess it served to bias me towards the functional languages which offer this capability today.
Speaking of its use today: it's not used! At least, I can't really find much evidence of industry use. In fact, for how loved and useful the language was, it's surprising it didn't last longer. I don't have firsthand experience with this, but my understanding is that a fair amount of the legacy Smalltalk has already been rewritten. If you have knowledge about this, please do leave a comment! If you're pining for a high-paying job at a bank rewriting a legacy system (as every child dreams of when they grow up), you're probably still better off elsewhere. However, I put this to you: if you go into an interview, there's a nonzero chance you'll find yourself across the table from an experienced colleague with an affinity for Smalltalk, and I guarantee that will benefit you!
Prolog #
Prolog is one of my favorites! I've not hidden that one of my two degrees is in Philosophy, and as a student I focused quite a bit on logic. In fact, I was able to do a lot of my philosophy with Prolog, which might be a bit of a head-scratcher if you haven't encountered logic programming before. It's a separate paradigm in which you write programs by declaring facts and rules about your problem; the computation is then done by an interpreter applying a resolution algorithm to answer queries against those facts and rules.
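To give a loose flavor of that (sketched in Python again, with invented family facts): the comments hold what the actual Prolog would look like, and the function plays the role the interpreter plays for you.

```python
# A loose sketch of the logic-programming idea; the facts are invented.
# In Prolog you'd write only the facts plus the two rules in the comments,
# and the interpreter would do the searching itself.

parent_facts = {("tom", "bob"), ("bob", "ann"), ("ann", "joe")}

def ancestor(x, y):
    # ancestor(X, Y) :- parent(X, Y).
    if (x, y) in parent_facts:
        return True
    # ancestor(X, Y) :- parent(X, Z), ancestor(Z, Y).
    return any(ancestor(z, y) for (p, z) in parent_facts if p == x)

print(ancestor("tom", "joe"))  # -> True, derived rather than computed
```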
This is a decidedly different way of thinking about a program - one you're unlikely to use in most professional software. However, the skill of writing programs by description translates directly to a lot of tasks. Database work maps onto it almost 1-to-1, and it's the same part of your brain you engage when writing tests to reason about a system. The pattern matching in modern languages is essentially doing the same thing as a Prolog interpreter, just at a smaller scale, so if nothing else you might pick up some slick pattern matching moves!
This style of programming was quite influential in the field of AI, where it was widely employed back in the AI boom of the eighties. It was also used, and to an extent still is, in natural language processing, data querying, and theorem proving. Is there a lot of that sort of work being done at our normal 9-to-5 jobs? Unlikely. Is it the cool thing to be programming though? Absolutely, as it has been for some time. The style of programming Prolog supports cuts through a lot of busy work and can get you going quite fast if you're looking for projects in these areas.
One of the coolest ways to engage with Prolog, though, is to write your own interpreter for it. Heck, if you're so inclined, it's fairly trivial to write a parser for it too. I know I introduced this article suggesting something other than making your own language, but I've had to sit through enough so-called experts telling me that Prolog isn't a real language that arguably this doesn't count. Being serious though, a parser and interpreter took me about two weeks of on-my-own-time effort, and that's really not bad. It was even fun ... towards the end!
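If you want a taste of what that project involves: the heart of any Prolog interpreter is unification, deciding whether two terms can be made equal by binding variables. Here's a minimal Python sketch of it - the representation choices are my own (variables are capitalized strings, compound terms are tuples), and the occurs check is omitted for brevity.

```python
# Minimal unification sketch. Variables are capitalized strings; compound
# terms like parent(tom, bob) are tuples ("parent", "tom", "bob").

def is_var(t):
    return isinstance(t, str) and t[:1].isupper()

def walk(t, subst):
    # Follow variable bindings until we reach a non-variable or an unbound one.
    while is_var(t) and t in subst:
        t = subst[t]
    return t

def unify(a, b, subst):
    a, b = walk(a, subst), walk(b, subst)
    if a == b:
        return subst
    if is_var(a):
        return {**subst, a: b}
    if is_var(b):
        return {**subst, b: a}
    if isinstance(a, tuple) and isinstance(b, tuple) and len(a) == len(b):
        for x, y in zip(a, b):
            subst = unify(x, y, subst)
            if subst is None:
                return None
        return subst
    return None  # the terms clash; no binding can make them equal

# parent(X, bob) unifies with parent(tom, bob) by binding X to tom:
print(unify(("parent", "X", "bob"), ("parent", "tom", "bob"), {}))
# -> {'X': 'tom'}
```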
Now, while that project might teach you the finer details of SLD resolution, my primary emphasis is on how beneficial logic programming itself is, and for that you'll need to roll up your sleeves and dive into Prolog. There is an alternative, however: if you're more database- or just data-inclined, Datalog might be more your speed. It's a subset of Prolog that acts as a query language for databases. That use is a distinct paradigm for working with data in its own right, and its applications to more traditional SQL work are directly obvious.
Prolog is one of the primary recommendations I make when I'm asked for "different" languages to learn, not just because it's so different from our normal bread and butter, but because the skills you gain from it apply in a lot of smaller but consequential ways. I think this gives Prologgers (I have no clue if that's what we're supposed to call ourselves) a leg up all around. Maybe it's the single best "old" language to learn to really round out a programmer. That is, in my humble opinion.
Fortran #
Yes, I do mean the Fortran created in 1957 that originally used punch cards, though you needn't use that old a version! Would it surprise you to learn that its most recent standard was released in 2023? Would it surprise you even more to learn that it still has a roadmap? You can load it up today in VS Code, install dependencies from a package manager, and develop some really serious programs with it. Indeed, it's not just a _usable_ language, it is still _used_ (I think more than the previous two).
The applications for this language are sciency and mathy, to use the technical terms. Being the low-level language it is, its niche is the sort of computing researchers run on supercomputers and the like. To support this, Fortran has strong support for whole-array operations, efficient memory usage, and parallel computing. Today it's running practical computations in weather prediction, fluid dynamics, biology and chemistry simulations, and engineering modeling.
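A quick flavor of what "strong support for array operations" means in practice - sketched with Python and NumPy (which inherited the style) rather than Fortran itself, to keep to one example language; it assumes NumPy is installed. In Fortran 90 and later, arithmetic over whole arrays is a single expression, no loop required.

```python
# Whole-array expressions, the style Fortran pioneered and NumPy inherited:
# one expression operates on every element, with no explicit loop to get wrong.
import numpy as np

a = np.linspace(0.0, 1.0, 1_000_000)
b = np.sin(a)

c = a * b + 2.0  # in Fortran this is literally:  c = a * b + 2.0

# The element-by-element version this replaces:
# for i in range(len(a)): c[i] = a[i] * b[i] + 2.0
print(c[:3])
```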
Being one of the first (maybe the first) science-focused languages, a lot of its concepts laid the foundation for newer languages in the same field, such as Python or R. As for Fortran's practicality (i.e. "can I build a simple web API with it?"), I genuinely get the sense that it's a lower-level R: both have packages and fair documentation for modern application development. I'm unsure whether either can easily be used for mobile development. However, there are frameworks for each supporting desktop and web applications, which goes a long way toward eliminating the gap between the research work and the productizing (for lack of a better term) work such projects might need.
The skills you're most likely to pick up with this language are varied, and all at a low level. High-performance computing (HPC) is the obvious answer, but unless you have easy access to a supercomputer, that part might remain theoretical. Practically, then, I'd suggest memory management and parallel computing. Sure, you can get a really good foundation in memory management with C, but I'd argue Fortran's straightforward syntax makes it the better instructor on the subject. Parallel computing tends to become a more abstract concept the higher-level your language is, whereas Fortran gives you a relatively easy tool to explore the subject at the lowest level.
I started toying with Fortran a year ago thinking it would be funny to suggest creating microservices in it at work, and I was surprised by how much I enjoyed the language. Today it's quite fashionable to use functional languages for the more mathy services we need to write at our jobs. That used to be the domain of C or C++, but those have fallen somewhat out of favor, due both to the popularity of functional languages and to concerns about their abundance of footguns. I'd suggest Fortran is a viable candidate for these tasks as well - some computation is very well served by the functional paradigm, but plenty of algorithms are best expressed imperatively.
COBOL #
Do not learn this.
Forth #
Forth seems to have a perpetually ebbing and flowing popularity - I see interest in it rise maybe every five years. That could just be me, but I think there's a good casual appetite for the language out there. Like each of the previous languages, Forth operates in a paradigm separate from the rest. Forth programs run on a stack which holds all of the program's data; data is manipulated by popping operands off the stack and pushing results back on, chaining stack operations together into a program.
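If you haven't worked in a stack language, a minimal sketch helps. Here's Forth's `3 4 + 2 *` run through a toy evaluator in Python - the word set is just what I chose to include for illustration; real Forth ships with hundreds.

```python
# A toy evaluator showing the stack discipline: Forth's "3 4 + 2 *" becomes
# a chain of pushes and pops, with no named variables in sight.

def run(program, stack=None):
    stack = [] if stack is None else stack
    binops = {"+": lambda a, b: a + b,
              "-": lambda a, b: a - b,
              "*": lambda a, b: a * b}
    for word in program.split():
        if word == "dup":          # duplicate the top of the stack
            stack.append(stack[-1])
        elif word in binops:       # pop two operands, push the result
            b, a = stack.pop(), stack.pop()
            stack.append(binops[word](a, b))
        else:                      # anything else is a number to push
            stack.append(float(word))
    return stack

print(run("3 4 + 2 *"))  # (3 + 4) * 2 -> [14.0]
```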
This has a couple of distinct advantages beyond solving the problem of fighting over what to name a variable. The paradigm necessarily restricts programs to concepts that map quite directly onto CPU operations, while still allowing a higher level of expression than other low-level languages. As such, it naturally runs quite fast with efficient memory use. The stack paradigm is particularly well suited to state machines and expresses mathematical operations neatly and composably. All of this makes it a great choice for firmware, robotics, and other sorts of embedded systems.
Unsurprisingly, it is still used for these tasks but quite rarely. There are plenty of newer stack-based languages as well, and those are maybe a bit more common in the toolbelt of the engineers working on these systems. That said, I have found (with admittedly limited experience) that all of these stack-based languages have the same core feeling, and Forth has excellent extensibility and pretty fair documentation, so I've stuck with this one.
Like the logic programming paradigm of Prolog, this stack paradigm is one that can be replicated in any other programming environment. If your application has discrete, one-off requirements that can be served best with a stack-based system, you can surely find a good number of libraries in your language providing these capabilities without needing to include a Forth executable alongside your application. The best way to learn how to use this paradigm though, I suggest, is to properly learn a stack language.
Being able to think primarily in terms of the stack has a lot of benefits on the side as well. Notably, I think most engineers use or have encountered Java or C# before, and the virtual machines which run each of these languages operate on a stack-based instruction set. There are plenty of ways that stacks are used creatively to boot, so I think it's safe to suggest that a robust understanding of them tends to pay off in the long term regardless.
As a final note, I'll mention that Forth is particularly well regarded for its extensibility - it's very easy to add new words to the language and develop domain-specific languages with them. There was a major push in favor of DSLs in the last decade which doesn't seem to have taken off, but in the abstract the concept applies to all the programming we do. When we create a library or implement a standard for a layer in any application, we're defining a DSL of sorts, bounded by the constraints and conventions we develop for that codebase. A few languages are particularly biased towards this DSL-style thinking, and Forth is one of them. This is a completely separate sort of thinking from stack thinking, though no less beneficial for any programmer. Perhaps that's a separate article for the future!
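To close with a sketch of that extensibility: in Forth, `: square dup * ;` defines a new word in terms of existing ones, and new words are indistinguishable from built-ins. Extending the toy evaluator from earlier in the same spirit (the `square` and `quadruple` words are my own examples):

```python
# New "words" are defined in terms of old ones and live in one dictionary,
# so the language grows toward your domain. A toy sketch of the idea.

dictionary = {}

def define(name, body):
    """Define a new word as a sequence of existing words."""
    dictionary[name] = body.split()

def evaluate(words, stack):
    for word in words:
        if word in dictionary:        # user-defined word: expand and run it
            evaluate(dictionary[word], stack)
        elif word == "dup":
            stack.append(stack[-1])
        elif word == "*":
            b, a = stack.pop(), stack.pop()
            stack.append(a * b)
        else:                         # anything else is a number to push
            stack.append(float(word))
    return stack

define("square", "dup *")        # Forth:  : square dup * ;
define("quadruple", "2 * 2 *")   # words compose into a little vocabulary

print(evaluate("5 square".split(), []))     # -> [25.0]
print(evaluate("5 quadruple".split(), []))  # -> [20.0]
```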