The Magic of Lisp


Cast your mind back to 1958. The world was still reeling from the impacts of World War II, television had just surpassed radio as the number one entertainment platform, and NASA had only just been founded that year. This was the environment in which LISP, an abbreviation for List Processor, was first conceived by John McCarthy. Originally developed as a language primarily for AI computation (which at the time largely consisted of search algorithms rather than the neural networks we know today), Lisp's influence grew in the 70s and 80s, being used for anything from workstation operating systems (see the Lisp Machine) to text editor configuration (see Emacs).

Its aptitude for complex computational tasks was largely down to its advanced representation of data structures, pioneering the use of directed acyclic graphs (DAGs) in computer programs. This came in the form of the cons cell, a data structure with two components: a CAR (Contents of the Address part of Register) and a CDR (Contents of the Decrement part of Register). Crucially, a cons cell could hold not just atomic values such as integers and booleans but also other cons cells, allowing any DAG to be represented with this system.

(cons 1 2) ; a pair

(cons 1 (cons 2 (cons 3 nil))) ; a linked list

; a binary tree – just trust me on this one

(cons 1 (cons (cons 2 (cons 4 5))
              (cons 3 (cons 6 7))))

This simple abstraction made it far easier for Lisp to express a wide range of tasks, including search, optimisation, and scheduling algorithms. By comparison, Fortran, a competitor of the time, only allowed programmers to manipulate statically allocated memory, making these tasks much more difficult.

Additionally, Lisp allowed procedures to call themselves recursively, and used recursion as the language's primary looping construct, in place of other constructs used for looping such as the GOTO or the DO loop. Some may argue that this is an inherent design flaw within Lisp, citing the stack space that procedure calls consume. However, later developments such as tail call optimisation meant that, under certain conditions, a procedure call can act just as a GOTO would. This relies on the fact that, when a procedure call is the last action a procedure performs (a tail call), the current procedure's stack frame can be reused for the next call, ensuring that no additional stack space is consumed and avoiding the dreaded stack overflow characteristic of naive recursion.

Here is a function to compute factorials in Fortran 77, using the latest 1977 programming technology: functions that return values.
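A minimal sketch (fixed-form source, so the leading columns matter; `FACT` is our own name for the function):

```fortran
C     Iterative factorial in Fortran 77, fixed-form source.
      INTEGER FUNCTION FACT(N)
      INTEGER N, I
      FACT = 1
      DO 10 I = 1, N
         FACT = FACT * I
   10 CONTINUE
      RETURN
      END
```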

This is its Scheme equivalent. Scheme is a dialect of Lisp developed at MIT and released in 1975, and it was the language in which tail call optimisation was first implemented. Note that Scheme (and Lisps in general) uses Polish notation: 1 + 1 is written as (+ 1 1). This will be explained later.
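One way to write it (a sketch using a named let with an accumulator, so the recursive call sits in tail position):

```scheme
;; Tail-recursive factorial: the call to LOOP is the last thing the
;; procedure does, so its stack frame can be reused on each iteration.
(define (fact n)
  (let loop ((n n) (acc 1))
    (if (= n 0)
        acc
        (loop (- n 1) (* n acc)))))

(fact 5) ; => 120
```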

As shown, iteration can be expressed as tail-call-optimised recursion: initial variables, a condition, and some way to update those variables. Any iterative function can be written recursively with little loss in readability, but the reverse does not necessarily hold: recursive traversal algorithms in particular are much more readable than their iterative counterparts.
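As a sketch of that readability claim, here is how a recursive traversal might look in Scheme, assuming a tree node is represented as (cons value (cons left right)) with bare numbers as leaves:

```scheme
;; Sum every value in the tree. A node is (cons value (cons left right));
;; a leaf is a bare number, which is its own sum.
(define (tree-sum t)
  (if (pair? t)
      (+ (car t)               ; this node's value
         (tree-sum (cadr t))   ; left subtree
         (tree-sum (cddr t)))  ; right subtree
      t))

(tree-sum (cons 1 (cons (cons 2 (cons 4 5))
                        (cons 3 (cons 6 7))))) ; => 28
```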

Lisp also had one more element of abstraction that made it a powerful language for working with data structures: higher-order functions. In Lisp, right from its earliest implementations, functions could be passed as arguments to other functions and returned as values. This makes it possible to parameterise computation by behaviour as well as by plain atomic values, and leads to very terse but readable code.

This is a Lisp 1 program to initialise a list of elements, add 1 to each and print the result.
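A sketch in modern Scheme notation (Lisp 1's original syntax differed; `map` here stands in for the idea):

```scheme
;; Build a list, add 1 to each element using a function passed as an
;; argument, and print the result.
(define xs (list 1 2 3 4))
(display (map (lambda (x) (+ x 1)) xs)) ; prints (2 3 4 5)
```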

Despite being introduced in 1958, higher-order functions would be largely forgotten by the programming world for decades, resurfacing only relatively recently as language updates such as Java 8 brought the feature to modern mainstream languages. Now it's difficult to imagine a programming language without them. If nothing else, it prompts thought about which other brilliant language features have been lost to time, among them many from Lisp dialects, such as Scheme's continuations or Common Lisp's condition/restart system.

But perhaps Lisp's most distinctive feature is its syntax. It draws ire from programming students, who lament Lisp's indulgence in parentheses and a prefix notation that seems as though it was dreamt up by aliens. However, it is Lisp's syntax, and in particular its minimalism, that gives Lisp such power. The fundamental premise is this: if Lisp is written in terms of lists, then all of its powerful features for manipulating lists can be used on Lisp programs themselves. That's the eureka moment: each of Lisp's expressions is just a list that you can manipulate like any other. Because the function always comes first in the list, we know the structure of any expression with little parsing left for the programmer to do. The facility we have for manipulating expressions, the macro, is just a function that takes arguments and returns a new list.

Imagine if Lisp didn't have any boolean operators. While other languages would suffer badly from such a fundamental omission, Lisp programmers could trivially define their own in terms of the if expression.
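As a sketch, here is how a Scheme programmer might define their own `and` as a macro expanding into nested `if`s (`my-and` is a hypothetical name chosen to avoid shadowing the built-in):

```scheme
;; MY-AND expands at compile time into nested IFs; no argument is
;; evaluated unless the expansion reaches it.
(define-syntax my-and
  (syntax-rules ()
    ((_) #t)                                  ; no arguments: true
    ((_ e) e)                                 ; one argument: its value
    ((_ e rest ...) (if e (my-and rest ...) #f))))

(my-and (= 1 1) (= 2 2))              ; => #t
(my-and #f (error "never evaluated")) ; => #f, the error form never runs
```

Note that the short-circuiting behaviour falls out for free: because the later arguments only appear inside the untaken branch of the expansion, they are never evaluated.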

The benefit of macros over traditional functions is twofold: a macro's arguments are not evaluated unless the macro explicitly evaluates them, and macro expansion happens at compile time rather than at runtime. This saves runtime cost and allows errors generated by macros to be reported at compile time, which is valuable for macros implementing features such as type systems.

And so, from the 1950s we return to 2024. That was a nice talk about Lisp, but programmers like getting things done. Back we go to JavaScript. But, enchanted by our little history lesson, we decide to do a bit of research into JavaScript's history. It was invented by language designer Brendan Eich, who was hired by Netscape to add a scripting language to their Netscape Navigator browser. And the early iterations of JavaScript…

Scheme. 

It was Scheme.

Lisp has influenced programming more than you think.