Lisp Machine - Possible? Desirable?

Due to the shocking weather outside, I recently watched an old FOSDEM presentation on what we can learn from lost operating systems. It suggested that one way to reform (intentional pun) the state of today’s bloated software would be to bring back environments such as Smalltalk and the Lisp machines, in which the whole operating system and environment was written in its respective high-level language, which gave the power to edit anything on the fly and allowed for tiny operating systems.

Naturally this sent me down a rabbit hole of reading about Lisp. As idle speculation to indulge somebody with no knowledge in this area: would a Lisp Reform require specialist hardware to interpret the language effectively, and how might one envision this being achieved? Some sort of Field-Programmable Gate Array?

I was particularly interested to read that one of the later developments was a European dialect of Lisp, EuLisp, that - in terms of sovereignty - would seem ideal for a European computer like Reform.


I know @minute hasn’t touched this in a long time, but it would be really cool to see something in the spirit of Interim OS brought up on the Reform. Maybe in my copious spare time™ I can start working on something like that.

1 Like

A Lisp machine is certainly possible. There was a bootleg of the original Lisp Machine software, ported to some ancient Ubuntu version, circulating at some point.

There is no need for an FPGA to run Lisp; you can run Emacs on pretty much any system.

As to whether it is desirable, the short answer is, in my opinion, no.

The Lisp machine was not wonderful because it was running Lisp; the wonderful features had nothing to do with Lisp at all.

One feature that is often cited is that the machine would never ‘crash’ in the sense of throwing everything away and starting from zero, as is common with most OSes today. The Lisp interpreter would throw some sort of error or exception, and if it was not handled you would get back to the debugger/shell/IDE, where you could inspect the machine state and continue execution of the program. This is common behavior for Lisp interpreters but is in no way a feature of the language itself. While it is not something that can be easily achieved with assembly or C programs, pretty much any managed language runtime could do it.
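The resume-after-error workflow described above can be sketched in Common Lisp with its condition system and restarts; the function and restart wording below is just illustrative:

```lisp
;; A sketch of resumable error handling in Common Lisp.
;; RESTART-CASE registers a way to continue; a handler (or a human
;; in the debugger) can pick it instead of unwinding to top level.
(defun safe-divide (a b)
  (restart-case
      (if (zerop b)
          (error "Division by zero")
          (/ a b))
    (use-value (v)
      :report "Supply a value to use instead."
      v)))

;; Without a handler, (safe-divide 1 0) drops you into the debugger,
;; where USE-VALUE appears as a restart and execution can continue.
;; A handler can also pick the restart programmatically:
(handler-bind ((error (lambda (c)
                        (declare (ignore c))
                        (invoke-restart 'use-value 0))))
  (safe-divide 1 0))   ; evaluates to 0, without unwinding to top level
```

The key point is that the error handler runs *before* the stack is unwound, which is what makes "inspect and continue" possible.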

Another feature is the language uniformity of the system. While some low-level parts of the Lisp machine were written in assembly or C, the vast majority of the system was written in Lisp - applications, libraries, everything. The shell would be the IDE/debugger, where you could evaluate Lisp expressions or run applications, which would really just be more complex Lisp expressions. If you installed a library, it would be available to applications as well as to the shell.

The reason why Lisp is not the best language is that language theory has advanced since Lisp was invented.

To understand the problem, an excursion into history is needed. First we start with the Story of Mel, to understand what ‘separate constants’ are. Assembly language, using symbolic names for instructions, is a higher-level language than machine code. Using this higher-level language imposes some structure: because the value used to encode an instruction is obscured by a symbolic name, it is no longer possible to, say, multiply by an instruction that happens to be encoded as the value you want to multiply by - a ‘separate constant’ for the multiplier is needed.

And the Bourne shell and other similar shells are the machine code of Unix. The data they operate on is completely unstructured: it’s all text. It can be interpreted as the name of the function (command) to call, as a completely untyped argument, as whatever.

There are many libraries that provide ‘tools’ that are conversions of their meticulously crafted APIs into this shapeless alphabet soup on which the shell operates. Of course, the experience is horrible for everyone involved: for the library writer, who has to find ways to squeeze at least basic functionality into this alphabet-soup interface, and for the users, who are left with a resulting API that is inherently awkward and error-prone.

There is this bash-completion thing that tries to add types to the alphabet-soup API, to help users pass the right thing to the ‘tool’/shell function. Of course, because the completion is completely separated from the program itself, it is always out of sync, glitchy, missing some values in enumerations, and so on.

And Lisp is to the shell what assembly is to machine code. It is slightly more structured: you no longer get shapeless alphabet soup; everything is an S-expression. Sexps are not typed in any way - you can pass any sexp to any function - but they are no longer shapeless: you can pass your data structured in some way. Also, unlike the Unix shell, a Lisp program is observable; good introspection features are the norm for Lisp. The types are, however, missing.

There is another thing that is missing in Lisp. Critical practitioners observe that while Lisp is the language of choice for the famous SICP book, Lisp programs themselves lack structure. While it may be theoretically possible to write nicely structured programs in Lisp, in practice any moderately complex project written in Lisp turns into an eldritch blob oozing parentheses on all sides, with no structure to speak of. There is probably some lack of tools for modularity.

An example of a decent Lisp IDE is Emacs. It provides text editing, interpretation of the text as e-lisp code, and extension of the IDE with code written in e-lisp. The fatal flaw is that e-lisp is not a general-purpose language; it’s a language written specifically to run Emacs in and to extend Emacs with. You do not get general-purpose libraries written in e-lisp: when you install ImageMagick, you do not get to work with bitmap images in Emacs. Another fatal flaw is the horrible UX.

So rather than a Lisp machine, I would like to see something like Emacs built on top of a general-purpose dialect of something like Haskell or ML, where it’s possible to take a string like ‘README’ and say that from now on it’s a filename, which can be passed only to functions that want a filename, and not to functions that want a string. And hopefully get at least some libraries for free, because people wrote them for their existing applications - because it’s a general-purpose language, and not a language specifically made to write an IDE in.


Honestly, it exists; it’s a userspace OS of sorts. If you embrace GNU Emacs, you can do a ton of stuff without ever really leaving it, and because of its portability, it’s fairly easy to have a setup that works on pretty much any device and any operating system. The Lisp dialect is really nice as well, and the experience of working with it is really pleasant.

1 Like

As a big fan of Common Lisp, I’m in.

I actually just decided today (or re-decided) that this is the only logical thing to do with my Reform, and I “just” need to figure out how to make Mezzano work on the LS1028 SOM. No FPGA needed; modern implementations compile to machine code, and SBCL already works on the ARM architecture.

It won’t be a re-implementation of the Lisp Machine, but something a little more modern.

I didn’t know about Interim OS, but might just see what I can plunder from that, to make Mezzano work on bare metal.

Need some pointers to help you get acquainted with a modern Lisp? :slight_smile:


I think you may be conflating Lisp the language with Lisp the concept, and also confusingly treating Emacs Lisp as some kind of de facto standard.

Emacs Lisp is weird.

Racket, Scheme, Common Lisp, and Clojure are all Lisps. You could make a machine based on any one of them, call it a Lisp machine, and you’d be right.

Lisp has structure. It is capable of loops, functions, you name it. You get used to the parentheses, I promise.

Lisp has types: symbols, strings, numbers, reals, ratios. These are all different object types in Lisp. You can introspect the type of a given expression, and you can throw an error if you don’t get the type you want.
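For instance, in Common Lisp (a minimal sketch; the exact output of TYPE-OF is implementation-dependent):

```lisp
;; Values carry their types at runtime and can be inspected:
(type-of 42)        ; e.g. FIXNUM or (INTEGER 0 ...), implementation-dependent
(type-of "hello")   ; a string type, e.g. (SIMPLE-ARRAY CHARACTER (5)) in SBCL
(type-of 1/3)       ; RATIO

;; And you can signal an error when a value has the wrong type:
(defun shout (s)
  (check-type s string)
  (string-upcase s))

(shout "hi")   ; => "HI"
;; (shout 42)  ; signals a correctable TYPE-ERROR
```

CHECK-TYPE even establishes a restart, so a wrong argument can be replaced interactively instead of killing the program.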


And that’s the difference between Lisp and Haskell/ML.

In Lisp you can throw an error. In Haskell/ML, an expression with a wrong type does not get executed at all - it is rejected before the program ever runs.

And because the type information is part of the language syntax (as opposed to being determined at runtime), it can be used for things like command completion, or whatever the equivalent is called in an IDE.

What about modularity?

What kind of modularity would you like?

Just scratching the surface of what Common Lisp offers:

  • There are packages, which are effectively namespaces.
  • Then there are ASDF systems, which are similar to packages in other languages. ASDF is Another System Definition Facility, which has been the de facto standard for a long time now.
    The terminology is odd, but it’s from another era. They work well together, because one system can import another, and then add things to its namespace - this can be more useful than it sounds.
  • Then there’s CLOS, the Common Lisp Object System, which you can use to create an API via generic functions. I use it (among other things) to provide modularity and abstraction, so I can swap out things like database backends with no impact to the client code.
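As a hypothetical sketch of that last point - the class and function names below are made up for illustration, not from any real library:

```lisp
;; Backend-swapping via CLOS generic functions.
(defclass sqlite-backend () ())
(defclass postgres-backend () ())

(defgeneric store (backend key value)
  (:documentation "Persist VALUE under KEY using BACKEND."))

(defmethod store ((backend sqlite-backend) key value)
  (format t "sqlite: ~a <- ~a~%" key value))

(defmethod store ((backend postgres-backend) key value)
  (format t "postgres: ~a <- ~a~%" key value))

;; Client code only ever calls STORE; swapping the backend is just
;; a matter of constructing a different object.
(store (make-instance 'sqlite-backend) "user:1" "alice")
```

The client code depends only on the generic function, so the concrete backend class can change without touching it.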

You can do the big-ball-of-mud thing if you really want to make life hard for yourself. However, it’s so easy to progressively refactor your code, it probably won’t stay in that form if it turns out to be genuinely useful. And you’ll most likely use other libraries while you’re writing it, anyway.

As for IDEs: Emacs is the canonical one, but I’m one of the heretics that prefers Vim. Lispworks and Allegro both provide more “modern” IDEs, but they’re commercial implementations and I don’t have a commercial budget. I’m pretty sure at least one of the Java IDEs has a CL module, too. “Horrible UX” is subjective; some people genuinely love Emacs, even if I regard its keybindings as an ergonomic disaster-zone.

At least one implementation of Scheme also has a packaging system. I assume others do, but I haven’t visited that side of things for a long time.

Now that I look back at your post, I see your desire to “take a string like ‘README’ and say that from now on it’s a filename, and can be passed only to functions that want a filename, and not to functions that want a string.” With CL, you can do that: create a pathname object from the string, and either specialise a method on it (if you’re doing OO) or declare the type of that parameter in a function definition. If you want to define a kind of object that doesn’t already exist, well, defclass should have you covered. And the types are most definitely not missing - you can even define your own, should you need to.
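A minimal sketch of that, with illustrative names:

```lisp
;; Turn the string "README" into a pathname object, then dispatch on it
;; so strings and pathnames are no longer interchangeable.
(defvar *readme* (pathname "README"))

(defgeneric open-it (thing))

(defmethod open-it ((p pathname))
  (format t "opening file ~a~%" (namestring p)))

(defmethod open-it ((s string))
  (error "Got a bare string ~s; pass a pathname instead." s))

(open-it *readme*)     ; dispatches on the PATHNAME method
;; (open-it "README")  ; would signal an error

;; Or, without CLOS, a runtime-checked type constraint:
(defun file-size-of (p)
  (check-type p pathname)
  (with-open-file (in p :element-type '(unsigned-byte 8))
    (file-length in)))
```

The check happens at runtime rather than compile time, which is exactly the Lisp-vs-ML distinction discussed above, but the string/pathname separation itself is enforced.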

General-purpose libraries? Oh yes, we have those. Also web-app servers, database backends, music-related libraries… there’s a bit more to Lisp than writing a text-editor :slight_smile: But don’t take my word for it - start with Practical Common Lisp and then check out Quicklisp as one excellent means of downloading libraries.

In fact, if you dig back to its origins, John McCarthy was an AI researcher. Emacs turned up later, quite separately, and its dialect is now regarded as primitive (or long past due for a quiet disposal).


Having too many ways to do it may be as bad as no way to do it.

The big ball of mud is an observation of what most Lisp code turns into in practice. What would prevent that is an open question.

It’s for e-lisp, not for the general-purpose Lisp.

Some parts are, some aren’t.

Again, having too many ways to do a thing may be as bad as no way.

How ‘if you want’? It should be the first thing you do when writing code.

Not with the canonical IDE, Emacs.

Well, I’m carrying on with making it my Lisp workstation :woman_shrugging:

On slightly calmer reflection, I won’t have the time or expertise to develop the necessary drivers for Mezzano any time soon, and there are deployment/production concerns to consider, so I’ll have to leave that as a happy daydream. Instead, I’ll use Linux as the base, and build on top of that. Might document the process as I go, in case others find it useful, entertaining, or both.

@Squizzler I’m happy to be a sounding-board, for whatever I’m worth, and to share any knowledge and pointers that’d be helpful.

1 Like

Note that an important part of a language is also the culture established around it.

For example, in Python significant whitespace is, theoretically, completely optional - you can write code that does not use it. However, all tutorials, examples, library code, etc. do use significant whitespace, which makes it practically impossible to avoid when using Python.

So what is theoretically possible in a language, and what the language’s features and surrounding culture guide users towards, are separate things.

1 Like

I’m afraid I’m not that good with actual programming or anything practical, really - more of an armchair scholar - but I remain fascinated by what people come up with. I am delighted the thread has generated so much positive discussion.

I must say the purist in me preferred the first option, but I can only guess at the work required :grinning:. The spirit of the presentation linked in the OP was to explore ways to eliminate the layers of “stuff” that make modern graphical operating systems so massive, and Linux is very much part of that trend. Whilst awaiting Mezzano or Interim OS, could there be other ways to advance the state of the art? The same speaker returned again this year to suggest Plan 9 might be better. The same spirit of advancing the state of the art partly underpinned my name-check of EuLisp - not to mention patriotic feelings as an Erasmus Socrates alumnus - as more recently specified than Common Lisp.


There have been attempts to fix that!

Anyway, I have read a more recent article based on the original talk and think this thread misses the point; really we ought to be discussing systems built on non-volatile memory. I will open a new thread!

It’s good that there are people even in the Lisp community aware of the problem. Getting used to the parentheses does not make the problem go away: using the parentheses correctly creates a mental overhead. Whatever mental capacity you spend on counting parentheses cannot be spent on program logic, overall design, or whatever else might be relevant to your project.

Given that a picture is worth a thousand words, let’s look at some possible ways to write expressions:

(+ a (+ b (+ c (+ d e))))

+(a, +(b, +(c, +(d, e))))


a + b + c + d + e

The first two are much easier to parse, and save time, code, and whatnot when implementing the language. They are, however, non-local, because the parentheses need to span the whole expression, and they result in expressions that are hard to write, read, and modify. Consider eliminating the ‘c’ variable from the program.

The latter two require more effort in the language implementation, but pay off with code that is much easier to write, read, and modify when used reasonably.

Of course, this is a contrived example, and you would most likely get a standard library function for sum, or write one. However, the same applies to more complex expressions with less regularity in structure, for which a generic function is unlikely to be available.

This is a large part of the UX a language provides, and most Lisp dialects get it wrong.

I have seen a person writing in some Lisp (probably Guile) with an s-expression taking up most of the screen. It looked amazing: there was this blob of parentheses taking up about a third of the screen, with the preceding and following blobs poking out at the screen edges. It had colourful syntax highlighting. It had multiple rows of parentheses on both sides, and was somewhat unevenly indented, as linebreaks were inserted at different nesting levels. No annoying commas to break up the roundness of the parentheses. It looks great as visual art, but it’s not a practical way to write code, unfortunately.

And this gets me thinking that perhaps looking at these droplets and blotches of mud that are s-expressions all the time when writing code may be what primes people for ending up with a big ball of mud. When the most basic syntactic building blocks of the language are anti-modular, modular is likely not how you will think about your code.

1 Like

I don’t think so.

Plan9 suffers from making the unix shell the OS interface.

Everything is a file, except that reading or writing some files might do something special, and you should ‘just know’ which. Also, a file has only a filename and a blob of data. If two applications share the same file - either because they access it as a plain file, or because they are server and client in P9 - they have to ‘just know’ what the other side meant. There is no option for multiple arguments, argument names, argument types, nothing like that. Passing multiple pieces of data atomically over such an interface is a nightmare.

Making a language with introspection capabilities or at least some way to specify the interface would be much better.

The per-application namespaces are imperfectly emulated by containers, and are implemented in a much more powerful and robust way by some modern microkernel-based OSes.

The network transparency, while unparalleled so far, is also its weakness. While computing power and storage capacity have increased significantly since Plan 9, network throughput has not kept up to the same degree.

Today many systems are ‘terminals’ in the sense that they contain only cached data; everything is stored in a cloud, held hostage by the cloud provider. While some law may mandate that the service provider must provide a way to get the data, it does not mandate that the export be intelligible or free.

There is this concept of ‘data mobility’, which suggests it would be great if users could get their data in realtime, in some intelligible way, and forward it between different service providers to compose the services they provide. Multiple attempts at specifying a platform that would make this feasible have failed to get any traction.

On the other hand, with robust application isolation through namespacing and other techniques, such as provided by modern microkernel systems, the services could be moved towards the data - that is, the user could safely execute random code from the internet provided by other users or service providers, and service providers could safely execute code provided by the user to query their databases. Composing code from multiple origins would also be possible by passing the results around. This probably has more potential, because the data is often much larger than the code, and because individual users can provide the code themselves, there is no need for service providers to adopt the solution for it to be useful.

If the Internet is seen as one large NUMA system, then Plan 9 is the approach of ‘execute wherever, the NUMA interconnect is fast enough not to care’, while data mobility and service mobility are attempts at NUMA optimization by improving locality, albeit so far only in design.


You left out the way a Lisp programmer would write it:

(+ a b c d e)

Even if you write it this way:

(+ a (+ b (+ c (+ d e))))

It’s trivial to remove the ‘c’ variable in Emacs with Paredit (which operates on S-expressions structurally rather than textually). Editing code with Paredit feels amazing actually! I miss it when working with languages with non-sexp syntax. Such languages can still be practical for writing code though. :slightly_smiling_face:

1 Like

This is fascinating, since the usual narrative goes that Plan 9 was technically superlative but fell victim to “path dependency” because UNIX was already well entrenched. Before the many P9 users on Reform systems offer their rebuttal, maybe it is worth separating this into a different thread?


I suppose that also works only on expressions with a uniform structure, such as a sum.

That’s interesting. Still, it does not save you from looking at all those parentheses, which make the code much less readable.

Basically all OS development since Windows NT (which is a slight improvement on UNIX, at least as far as the initial kernel design that Microsoft bought in goes) has fallen victim to industry inertia.

The ‘everything is a file’ paradigm caters to the UNIX shell as the OS interface; other programming languages offer support for objects with saner semantics. It is no surprise that Plan 9 is an evolution of UNIX - it was made by the same team that made one of the unices before it. While it brings in some new paradigms, a lot stays unchanged from UNIX, to the detriment of the result.

1 Like