On UI as Code

Consequences of Undecidability in Physics on the Theory of Everything (PDF) from Journal of Holography Applications in Physics - June 2025.

General relativity treats spacetime as dynamical and exhibits its breakdown at singularities. This failure is interpreted as evidence that quantum gravity is not a theory formulated within spacetime; instead, it must explain the very emergence of spacetime from deeper quantum degrees of freedom, thereby resolving singularities.

Quantum gravity is therefore envisaged as an axiomatic structure, and algorithmic calculations acting on these axioms are expected to generate spacetime. However, Gödel’s incompleteness theorems, Tarski’s undefinability theorem, and Chaitin’s information-theoretic incompleteness establish intrinsic limits on any such algorithmic program. Together, these results imply that a wholly algorithmic “Theory of Everything” is impossible: certain facets of reality will remain computationally undecidable and can be accessed only through non-algorithmic understanding.

We formalize this by constructing a “Meta-Theory of Everything” grounded in non-algorithmic understanding, showing how it can account for undecidable phenomena and demonstrating that the breakdown of computational descriptions of nature does not entail a breakdown of science. Because any putative simulation of the universe would itself be algorithmic, this framework also implies that the universe cannot be a simulation.

Setting aside the question of whether any of this is provably true (or relevant to the subject of this thread), it seems there’s an existential anxiety in philosophy and science confronting the raw living reality that refuses to be reduced to logic and algorithms. The subject and user are not code, at least not entirely. The world can never be fully modelled, computed, and made legible by the state seeking total control.

(Note to self: Start dystopian sci-fi music/video album, Tentacles of the Mammon Matrix. First song “Escapology”.)


This undecidability in physics may be related to chaos theory and fractals in natural systems. Feedback loops, self-organization, replication and recursion approaching infinity.

[Image: stochastic tree]

(From Nature of Code by Daniel Shiffman - Chapter 7: Fractals)

An interface or language serves to reduce the complexity of the system, and of the user’s thoughts and intentions, to a limited vocabulary with enough expressivity for communication and computation. (But not too verbose - beware the Turing tarpit.) Hm right, that’s what it’s for - the purpose of abstraction, “the process of generalizing rules and concepts”. We can only think by abstracting the world and any system we interact with.

Anatol Rapoport wrote “Abstracting is a mechanism by which an infinite variety of experiences can be mapped on short noises (words).”

An abstraction can be constructed by filtering the information content of a concept or an observable phenomenon, selecting only those aspects which are relevant for a particular purpose.

[Image: one-dimensional cellular automaton, Rule 110]

What’s perplexing is that a single button or a couple of instructions is a sufficient interface for a Turing-complete system. The minimalism of lambda calculus and combinatory logic, as in Programming with Less than Nothing, feels like watching a magician conjure an entire computation stack from a handful of symbols and rules.
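
As a toy demonstration of how little machinery is needed, here is Rule 110 - the one-dimensional cellular automaton pictured above, famously Turing-complete - as a small TypeScript sketch (my own illustration, not code from any project mentioned here):

const RULE = 110 // the rule number's binary digits are the lookup table

// Each cell's next state depends only on the (left, self, right) triple.
function step(cells: number[]): number[] {
  const n = cells.length
  return cells.map((self, i) => {
    const left = cells[(i - 1 + n) % n]
    const right = cells[(i + 1) % n]
    const index = (left << 2) | (self << 1) | right // neighborhood as 0..7
    return (RULE >> index) & 1
  })
}

// Seed a single live cell and print a few generations.
let row: number[] = new Array(64).fill(0)
row[60] = 1
for (let gen = 0; gen < 24; gen++) {
  console.log(row.map(c => (c ? "█" : "·")).join(""))
  row = step(row)
}

The rules for all eight neighborhoods are compressed into one integer: the whole “program” is the number 110.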

It’s satisfying to get to the bottom of things, to break down a system into its smallest units and understand it thoroughly, to achieve total transparency and malleability. For me that’s the charm of Uxn or LÖVE, an endlessly expressive medium made of simple words. Also self-hosting languages like Guile, and bootstrappable-builds projects like GNU Mes - GNU+Linux from scratch with a mutually self-hosting Scheme interpreter (in C) and C compiler (in Scheme).

Inspired by autopoietic projects like these, I’ve continued exploring Lisp, WebAssembly, C.

Recent experiments include:

  • wasmos - a microkernel that runs Wasm binaries natively
    • Also trying a variety of smol Linuxes like Alpine, Tiny Core, ToyBox
  • uscheme - a tiny Scheme (5 keywords) to C compiler written in itself
  • cc-wasm - C99 to WebAssembly compiler that can compile itself
    • Runs anywhere Wasm can, including the browser.
  • ulisp-c - C99 port of uLisp (originally written for embedded systems)
    • Using Zig to cross-compile single-file executables
    • CPU architectures: x86_64, aarch64, riscv64, i386, wasm32
    • Operating systems: linux, macos, windows, freebsd, UEFI
  • CodeMirror editor with Lisp and C99 language features and structural editing
    • Parser, linter warnings and errors, formatter, inline help, autocomplete suggestions
    • Live editing of running programs

For now I’m trying to integrate a Lisp-to-C compiler written in Lisp, and a C-to-Wasm compiler written in C. The latter is small enough, I’m hoping, for the Lisp to gradually consume it like a slime mould and become a single self-hosting Lisp-to-Wasm compiler.

In parallel I’d like to direct the Lisp organism to digest HTML/CSS/SVG/MusicXML et al. into a new whole: a cross-platform DOM (document object model); a design system (styling primitives); and declarative behaviors with signals and/or state machines - all integrated as a single programmable tree structure. Unified user interface as code, computational medium as living book, written in the same universal language.

(Second song “Lisperanto”.)
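
To make the “signals” part of those declarative behaviors concrete: the primitive I have in mind is small - roughly this sketch (my own illustration of the idea, not any existing library’s API):

type Listener<T> = (value: T) => void

// A minimal push-based signal: a readable current value plus subscriptions.
function signal<T>(initial: T) {
  let value = initial
  const listeners = new Set<Listener<T>>()
  return {
    get: () => value,
    set: (next: T) => {
      if (next === value) return // skip redundant updates
      value = next
      for (const fn of listeners) fn(value) // push the change to dependents
    },
    subscribe: (fn: Listener<T>) => {
      listeners.add(fn)
      return () => { listeners.delete(fn) } // returns an unsubscriber
    },
  }
}

// An attribute of a node in the tree could then be bound declaratively:
const count = signal(0)
const unsubscribe = count.subscribe(n => console.log(`count is now ${n}`))
count.set(1) // -> "count is now 1"
unsubscribe()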


Aristid Lindenmayer, a Hungarian theoretical biologist and botanist, worked with yeast and filamentous fungi and studied the growth patterns of various types of bacteria.

A Lindenmayer system, or L-system, is a parallel rewriting system and a type of formal grammar. It consists of an alphabet of symbols that can be used to make strings, a collection of production rules that expand each symbol into some larger string of symbols, an initial “axiom” string from which to begin construction, and a mechanism for translating the generated strings into geometric structures.

Originally, L-systems were devised to provide a formal description of the development of simple multicellular organisms, and to illustrate the neighbourhood relationships between plant cells. Later on, the system was extended to describe higher plants and complex branching structures.
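
The rewriting core of an L-system is tiny. Here’s a deterministic, context-free variant sketched in TypeScript, using Lindenmayer’s original algae model (A → AB, B → A) - my own illustration, not from the quoted text:

// Rewrite every symbol of the current string in parallel, once per generation.
function generate(axiom: string, rules: Record<string, string>, generations: number): string[] {
  const history = [axiom]
  let current = axiom
  for (let i = 0; i < generations; i++) {
    current = [...current].map(c => rules[c] ?? c).join("")
    history.push(current)
  }
  return history
}

// Lindenmayer's algae system: A -> AB, B -> A, starting from the axiom "A".
console.log(generate("A", { A: "AB", B: "A" }, 5))
// [ "A", "AB", "ABA", "ABAAB", "ABAABABA", "ABAABABAABAAB" ]

(Note the string lengths: 1, 2, 3, 5, 8, 13 - the Fibonacci sequence.) The geometric half - what draws the stochastic tree above - comes from interpreting the generated symbols as turtle-graphics commands: draw forward, turn left or right, push and pop position for branching.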

How beautiful. I’m making the connection now that this is the same author of The Algorithmic Beauty of Plants (PDF), “the first comprehensive volume on the computer simulation of certain patterns in nature found in plant development.”

3 Likes

One fun thing I’ve been working on is to compile tinylisp with tinycc (which is now a standard package in many Linux distributions!) and then augment it with bindings to SDL3. Now I have a Lisp REPL attached to a graphical canvas. Very LÖVE-inspired, but it feels more parsimonious than say just adding a Fennel dependency atop LÖVE. On the other hand, I now have bugs I need to track down :sweat_smile:

3 Likes

“When User Hits Machine” is a short film produced at Xerox PARC in 1983 as part of research on human-computer interaction. They were trying to understand why the newest model of photocopier became notorious among its users for being “too complicated” and difficult to use.

In order to explore these questions in detail we got a machine ourselves and installed it in our workplace. I then invited others of my co-workers, including some extremely eminent computer scientists, to try using the machine to copy their own papers for colleagues, with the understanding that a video camera would be rolling while they did so.

A scene shows Allen Newell, one of the founding fathers of AI, and his PARC colleague Ron Kaplan, a computational linguist, struggle with the photocopier for an excruciatingly long time to make two-sided copies of a research paper.

Further studies showed that the “same” events measured by the company could be experienced in radically different ways by machine users, depending on just how those events were embedded in a specific worksite and course of activity.

For example, a paper jam encountered with a knowledgeable person nearby to consult on how to clear it might be experienced as minor and quickly forgotten, while the same paper jam encountered at 5:00 by a harried worker unfamiliar with the machine and with no one else in sight would be experienced as a major breakdown.

The boundaries of the machine were less defined by the box, more contingent on its relations with its environment.

(From introductory remarks at WPT Fest 1999 by Xerox PARC anthropologist Lucy Suchman.)

The film was shown to researchers and engineers at Xerox, and it led to significant changes in interface design, including the addition of the now ubiquitous large green button that allows users to quickly and easily make a copy.


These are the buttons on a new Xerox WorkCentre copier. The icons used to be words: Start, Stop, Clear. Some of the words are duplicated on the buttons, barely visible, and there’s a braille-like . (dit) and - (dah). At least they could have used audio-player icons with internationally recognized meanings.

[Image: buttons on a Xerox WorkCentre copier]


“When User Hits Machine - Xerox S436” can be watched on archive.org.

2 Likes

Your Lisp organism sounds quite similar to something I have been thinking about for a while. The main difference I see is that I’d use a full Common Lisp, for the conveniences it offers but more importantly for the huge amount of time-tested code there is out there. That makes bootstrapping more challenging, but fortunately bootstrapping chains for Common Lisp already exist.

This week I saw some discussion about Haskell and Ada: how they’re not fully bootstrappable because they rely on magic binary blobs to get started (I think an older version of their own compilers), and how this has implications for trust and security.

Unlike Nix and Guix, Stagex goes much further in that it has a 100% mandate on supply chain integrity. It trusts no single maintainer or computer and disallows any binary blobs. It is thus not possible to package any software that cannot be bootstrapped, reproduced, and signed by at least two maintainers.

Haskell and Ada are the only languages not possible for us to support, or any software built with them. They’re locked out of any high security applications until they are bootstrappable.

More thoughts on a bootstrappable GHC

Here’s the stagex project mentioned:

Minimalism and security first repository of reproducible and multi-signed OCI images of common open source software toolchains full-source bootstrapped from Stage 0 all the way up.

If you want to build or deploy software on a foundation of minimalism and determinism with reasonable security, stagex might be the solution you are looking for.

OCI (Open Container Initiative) - https://opencontainers.org/

This Containerfile shows how it bootstraps “everything” - the basis for building C, C++, Rust, Go, Python, Node, etc. - deterministically from “180 bytes of human auditable x86 machine code” (hex0).

https://codeberg.org/stagex/stagex/src/branch/main/packages/bootstrap/stage0/Containerfile

I see this is a continuation of the GNU Mes story. Stagex builds on their work to provide a convenient way to bootstrap toolchains for major programming languages. Common Lisp is not in the list but as you mentioned they’re already working on bootstrappability.

SBCL: a Sanely-Bootstrappable Common Lisp (PDF)

This paper describes the development of an implementation of Common Lisp with the peculiarity that it is bootstrappable neither solely from itself, nor from some other language, but rather from a variety of other Common Lisp implementations.

Bootstrapping Common Lisp using Common Lisp (PDF - hal.science is a domain I often see when searching for scientific articles; I think it’s a non-profit in France, so the site is reputable, but it has the unfortunate UX design of starting a PDF file download automatically when you visit a page. Every time I visit the link I get another copy of the article. It’s an example of unintentionally taking away control from the user.)

Some Common Lisp implementations evolve through careful modifications to an existing image. Most of the remaining implementations are bootstrapped using some lower-level language, typically C. As far as we know, only SBCL is bootstrapped from source code written mainly in Common Lisp.

We describe the bootstrapping technique used with SICL, a modern implementation of Common Lisp.

  • Source: https://github.com/robert-strandh/SICL - SICL is a new implementation of Common Lisp. It is intentionally divided into many implementation-independent modules that are written in a totally or near-totally portable way, so as to allow other implementations to incorporate these modules rather than having to maintain their own implementation-specific versions.

That Git repository is a rich resource, a curated collection of general-purpose Common Lisp code and tons of documentation, papers and notes as LaTeX documents. I’m impressed the majority of the project is .tex files discussing the author’s thoughts and related research on the language, its implementation and features.


I’ve always wanted to learn LaTeX but have been hesitant because of the perceived learning curve. Lately I’m reading many academic papers, some of which are beautifully formatted - I’m guessing they’re written in LaTeX. I’d love to write an article or story in a suitable format to produce a PDF/EPUB file with nice typography, a table of contents, images, illustrations.

I like writing Markdown though, so I’ll see if there’s a way to integrate/embed .md and .tex files.

Reminds me of pandoc, a document format converter written in Haskell.

Pandoc: a universal document converter - https://pandoc.org/

If you need to convert files from one markup format into another, pandoc is your swiss-army knife.

Supported document formats include: Markdown, CommonMark, GFM, HTML, LaTeX, ConTeXt, DOCX, ODT, EPUB, reStructuredText, Org, Textile, MediaWiki, DokuWiki, Djot, man, and pandoc’s own JSON AST, among many others.


A jungle of languages and dialects, a confusion of tongues with no Lisperanto or one ring to rule them all. I’m curious how Pandoc is designed to process all these formats - I imagine by parsing each document into an intermediary universal syntax tree? I’ll read the source and see.

Relatedly there’s a project called Unified, a document processing pipeline used by MDX, Gatsby, Docusaurus.

unified is a friendly interface backed by an ecosystem of plugins built for creating and manipulating content. It does this by taking markdown, HTML, plain text, or other content, then turning it into structured data, thus making it available to plugins. Plugins for example do tasks such as spellchecking, linting, or minifying.

…Instead of manually handling syntax or parsing, you typically write one line of code to chain a plugin into its process.

unified itself is a small module that acts as an interface to unify the handling of different content formats. Around a certain format there sits an ecosystem.
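
In practice that chaining looks like this - the standard markdown-to-HTML pipeline, using the remark-parse, remark-rehype and rehype-stringify plugin packages (sketched from memory of the unified docs, in an ES module):

import { unified } from "unified"
import remarkParse from "remark-parse" // markdown -> mdast
import remarkRehype from "remark-rehype" // mdast -> hast
import rehypeStringify from "rehype-stringify" // hast -> HTML

const file = await unified()
  .use(remarkParse)
  .use(remarkRehype)
  .use(rehypeStringify)
  .process("Some *emphasis* and **strength**.")

console.log(String(file)) // <p>Some <em>emphasis</em> and <strong>strength</strong>.</p>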

The ecosystems:

  • remark — markdown
  • rehype — HTML
  • retext — natural language

The specifications for syntax trees:

  • unist — the universal syntax tree itself
  • mdast — markdown
  • hast — HTML
  • nlcst — natural language
  • xast — XML
  • esast — ECMAScript

Other building blocks:

  • MDX — markdown and JSX
  • micromark — small, safe, and great CommonMark (and GFM) markdown parser
  • syntax-tree — low-level utilities for building plugins
  • vfile — virtual file format for text processing

syntax-tree is an organization that contains 100+ projects that deal with syntax trees based on unist. These trees typically deal with content: markdown (mdast), HTML (hast), natural language (nlcst), XML (xast), but also JavaScript (esast).

unist is the schema for a universal syntax tree, from which all document formats extend, with a node as the atomic unit.

interface Node {
  type: string
  data: Data?
  position: Position?
}

interface Parent <: Node {
  children: [Node]
}
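
So parsing the markdown input *hi*, for example, yields an mdast tree in which every node extends unist’s Node and every container is a Parent - which is what lets generic unist utilities walk trees from any of these formats. Hand-written here for illustration:

// mdast tree for the markdown input "*hi*":
const tree = {
  type: "root", // a Parent
  children: [
    {
      type: "paragraph", // a Parent
      children: [
        {
          type: "emphasis", // a Parent
          children: [
            { type: "text", value: "hi" }, // a leaf node carrying a value
          ],
        },
      ],
    },
  ],
}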

Recently I saw an announcement that GHC (Glasgow Haskell Compiler) successfully compiled itself to target WebAssembly.

The demo page is a bit hefty at 50 MB of Wasm, but for educational purposes it’s arguably a leap forward in making the language easily accessible in the browser. Now they can build a Haskell playground, tutorials with live code examples, etc.

The same project managed to compile Pandoc to Wasm using GHC.

Other recent efforts in this direction:

2 Likes

My resonant idea is to make Cardumem the slime mold of TiddlyWiki, first complementing it and later replacing it in most of our social practices, while still sharing with TiddlyWiki the exploration of a hypertextual algebra. This semester we already started by making Cardumem the bridge between TiddlyWiki and Fossil SCM for my students’ wiki portfolios (previously we did that with TiddlyWikiPharo, but Cardumem is more minimalist/focused).

Because of its wiki web nature, Cardumem will also digest/model HTML/CSS/SVG into an evergreen publication with a programmable-cards structure (this idea is inspired by TiddlyWiki, Zettelkasten, HyperCard and my own personal analog/digital card notes) and support for live coding. Languages and stacks are still a mixture (Lua, YueScript, HTMX, Datastar, Djot, Lunamark), with Lua as the bridge.

Because my focus is on the social part of the malleable systems sociotechnical bootstrapping, instead of the technical part, I start from how to insert a meaningful/useful (meta)tool in a small collective and continue from there. That’s why I’m (de)constructing the metatool, acknowledging social messiness and web practicalities, instead of bootstrapping from a more computationally minimalist experience, like @akkartik’s LÖVE or @eliot’s Lisp experiments/prototypes, which are pretty interesting in several ways, just too far apart from the usage experience I want to bridge with, while wikis and the web are closer. This doesn’t preclude that, at some point, live programmable wiki cards could be rendered from LÖVE or become a tree in a Lispy language like Fennel. So, hopefully, some bridges between our projects and approaches may be ahead. For the moment, this helps us understand the different ways we approach malleable systems and all the value in such diversity. So, thanks for such resonances.

4 Likes

Then you may find in Typst a good balance between a minimalist, extensible, easy-to-learn syntax and the production of beautiful PDFs.

2 Likes

I am caught in between these two approaches, being involved with communities with different needs and different technological backgrounds. So I am looking on both sides for new ideas.

One issue I frequently have with the social part is something one could call “toxic technology”: widely known and used technology that you should ideally build on for that very reason, but which is also an obstacle to the long-term goals of the community.

My primary example is Python. In my research communities, almost everyone has a basic knowledge of Python these days, and even those who don’t are fully convinced that Python is the language you should know in our field. But the Python ecosystem has adopted a rhythm of breaking changes that is not compatible with the much slower evolution of scientific knowledge in our field. This leads to loss of knowledge (e.g. interesting methods implemented in Python code that no longer works) and to much energy being lost in maintaining code in a fight against tech churn. Python is the fossil fuel of science: long-term, we can’t live with it, but short-term, we can’t live without it.

In many other contexts, Google Docs is toxic technology. Or communication tools such as Facebook or X.

If anyone has an idea for dealing with toxic technology, I am very interested.

3 Likes

I hadn’t seen StageX before. I will need to take a closer look at it, but the Containerfile you point to looks a bit suspicious. Reproducible builds require an isolated build engine, and I don’t see where it comes in. Maybe it’s invoked at a different level of the build process.

HAL is operated by French public research institutions. It has good screening for spam, plagiarism, off-topic content etc., but the team behind it doesn’t even try to validate the claims made in the uploaded articles. This means it is classified as a preprint server, in the spirit of arXiv (the very first one) and by now many others.

I don’t see the PDF issues you mention. Nothing is downloaded unless I explicitly click on “save” (the PDF).

SICL is a very useful collection if you want to work with Common Lisp code. I am a happy user of its reader (what other languages call a parser), for example. But I am not sure we will ever see a complete compiler coming out of it.

Many scientific authors write in Markdown and translate to PDF via LaTeX using pandoc. But if you want to tweak the looks of the final PDF, you need to do that at the level of pandoc’s LaTeX templates, which means knowing LaTeX beyond beginner level.

Yes. See this quote from the first page of the manual:

Because pandoc’s intermediate representation of a document is less expressive than many of the formats it converts between, one should not expect perfect conversions between every format and every other.

2 Likes

Hmm, as @khinsen mentions, I also don’t observe any saving of a downloaded file just from visiting HAL article pages… but the PDF is embedded into the page via an iframe, and perhaps some browser or non-default configuration might turn that into a file download? At least in my Firefox on Linux configuration, the PDF appears embedded in the page without storing a downloaded file. (Assuming I understand the issue you are mentioning correctly…)

1 Like

Aha, that was it, my browser setting for this file type was set to “Save File”.

When I changed it to “Open in browser”, the PDF is shown in the iframe and does not trigger a download. So the user (me) was blaming the wrong party, sorry to besmirch your good name hal.science!

(But it’s typical of users confused by a technical issue, like with the Xerox copier. “When user hits machine” must be a classic in the field of UX design, because often it’s the only degree of freedom left for the user. Remember when people used to hit their television at a certain angle to fix a glitch? Pepperidge Farm remembers.)


Typst does look good, both as a concept for multimedia document authoring and as an overall project, with polished design, business savvy (a path to funding), and quality code and documentation. Thanks for recommending it, I’ll learn more about it.

[Video: recording of Typst’s collaborative editor]


TiddlyWiki and Fossil SCM … Lua, YueScript, HTMX, Datastar, Djot, Lunamark

A stack of exotic spells - I’m not familiar with any except Lua, a cute language with great power for extensibility. I’ve written snippets of Lua for the Nginx server, the Ardour audio workstation, and the LÖVE game engine. And Djot I’d seen before, and noticed in the list of document formats supported by Pandoc.

The name Cardumem suits this richly flavored stack - oh actually I’m thinking of Cardamom.

The word of the day: Cardumen - For example, a cardumen of sardines. A school of fish that travel together in great numbers.

The Andalusian Jesuit chronicler Bernabé Cobo was already using this word in 1629, in his Historia del Nuevo Mundo: “These birds are white and brown, as big as a hen; they go in flocks after the cardumen of anchovetas and sardines.”

So it’s like a “school” of fish or a “flock” of birds. The different suffix -mem reminds me of memory and memex - that works for a wiki engine.

Been reading the Cardumem wiki and related articles on mutabit, good food for thought. I appreciate how you document the journey of development over the years - what the community’s needs are, how the tech stack changed, the motivation behind the project.

inspired by TiddlyWiki and backwards compatible with its data

Right, I see the transportability of data is the bridge, how users can move or widen their workflow over the boundary between applications. I’m working on a similar effort with WordPress, for clients and users, about a thousand sites - a gradual migration to newer stacks, better in many ways but also repeating the same mistakes; so I’m encouraging static sites where possible, back to the basics of the web; and a stack-agnostic approach to site contents and database, to not get stuck in yet another stack. There the key is a JSON API for data schema/exchange/export and remote programmatic control from any language.

I’ve wondered about this too - even just as a thought experiment, setting aside the practical purpose of doing so. What document, data or code can run in Uxn, uLisp, Cardumem, Glamorous Toolkit, Lua/LÖVE/Freewheeling app, and the web browser. Mm, not the right question, I mean: how can a piece of data or code travel across these environments?

Like with Pandoc and Unified, it’d be useful to have an intermediary universal syntax tree for converting documents/programs on these platforms.

Another thing that would help is, like with Stagex, a convenient way to install and run these runtimes in a container, ideally with a single command. That’s a friction point with Cardumem: one of the requirements is installing Fossil SCM on the user’s machine.

Or, if not a container, then a sandbox like the web browser - which often means compiling to WebAssembly. I’m not totally convinced of the direction Wasm is headed after version 1, but it’s becoming ubiquitous as a kind of universal cross-platform binary target. If a piece of code can be compiled to Wasm (or C first), it can run anywhere.


One of my motivations is that I want the ability to write documents with music scores, notation and interactive diagrams - and make it accessible for teachers and students to write and communicate on musical topics. That’s based on HTML, MusicXML, maybe web components.

There’s this great resource, an online book titled Music Theory for the 21st-Century Classroom. From the acknowledgements:

I owe a huge thanks to Robert Beezer for recommending PreTeXt (formerly “MathBook XML”) as a means to author Music Theory for the 21st–Century Classroom. His work creating the “world” of PreTeXt made it easier than I could have imagined to create this text in all its forms (online, PDF, and print).

Also of incredible value, and without whom this text would not exist, is Jahrme Risner, who helped me wrap my head around the nitty gritty of PreTeXt and patiently coached me through entering commands in the terminal.

What leaves much to be desired is that all the music examples in the book are YouTube embed links; and the music notation is in rendered PNG images, not SVG or any kind of structured data. There’s potential and opportunity for better user experience with explorable explanations as teaching tool. Here we have this expressive computational substrate called the web, and we cannot yet comfortably write multimedia interactive textbooks for the 21st-century classroom.

Hypermedia, an extension of hypertext, is a nonlinear medium of information that includes graphics, audio, video, plain text and hyperlinks.

Like computers and software, I think hypermedia is still a new kind of meta-medium that continues to evolve, mutate, adapt, fail, survive and flourish; and we’re learning to shape it (as it shapes us) and to express ourselves with it more fluently, as we acquire “hypermedia literacy” as individuals and society.


Here’s an example I saw of an indie hypermedia document made with Decker.

It demonstrates a visual language that speaks through text, colors, drawings - a personal way of sharing thoughts. I learned about it from here:

Drawing in WigglyPaint

3 Likes

This is the key question that I struggle with daily, just as a user. Where “environment” can be all of:

  • multiple websites (different chat forums running on different platforms)
  • multiple OSes (after five years of using Windows as my daily driver at home, triggered by Covid work-from-home, I’m in the process of migrating back to Linux, prompted by Microsoft’s end-of-lifing Windows 10 and my perfectly good hardware being considered not good enough to run Win 11. On the whole, the Linux experience is better, but it’s still rough shifting things across different OSes. It’s amazing, for example, how poor the 2025 user experience is of just copying a file between NTFS, ext3, and a USB hard drive formatted as ExFAT. Some files apparently lose metadata that I didn’t even realise they had! Why is an operation as fundamental to software quality as “file copy”, in 2025, still so unspecified that it’s even possible to lose data during a copy?)
  • multiple languages (human and programming; Unicode support on Windows still isn’t great, for example Lua on Windows just doesn’t do Unicode at all.)
  • desktop vs server (and what you have to go through to get an IP address or a domain name today)
  • desktop vs mobile (and especially now that Google has activated the nuclear option of destroying sideloading on Android, so F-Droid and Termux’s days are numbered. Termux + Node isn’t great as a cross-platform transportable pocket/desktop computing environment, but it’s all I’ve got right now.)

I want there to be some kind of “fabric” where I can just move (like copying a file - but preferably without random invisible data loss) chunks of “computational environment” between personal home server, personal pocket device, personal laptop-sized device, or a rented Someone Else’s Computer In Another Country (so I can communicate to the world, while keeping my core secrets and access keys off that computer). It needs to be super-easy to regularly migrate chunks of that environment (of any size) between any and all of those devices, for backup and disaster recovery purposes, so moving those chunks should not require any changes to the chunks themselves.

3 Likes

I guess that’s one of the Holy Grails of malleable systems! Unfortunately, it is not a technical but a social problem, so there is no hope for a simple solution. It’s mostly a problem of making lots of people agree on some compromise. Which is something that techies are particularly bad at, for different reasons.

1 Like

And not the smallest of those reasons being that many zealous open-source techies were bribed in the last two decades with elite salaryman jobs from Google, Amazon, Facebook, and the other American Zaibatsu, and in the process came to lose their natural cyberpunk fear of mega-corporations and talked themselves into the idea that square was the new hip and that the best compromise should be whatever the megacorps wanted.

But yeah, also programmers not being able to agree on anything didn’t help.

1 Like

As a computer-literate teenager in the early days of the web, I watched this happen as I grew up, a kind of disillusionment. As we know, the computer subculture (“hacker culture”) back then was influenced by, or rather grew out of, the historical context of the 1960s counterculture. Like the Homebrew Computer Club, Whole Earth Catalog and The WELL, research labs at MIT, Stanford, UC Berkeley, Carnegie Mellon. The GNU project and Linux too naturally evolved out of that rich soup, with a spirit of independence and cognitive freedom.

The highlights are chronicled and mythologized in books such as What the Dormouse Said, The Soul of a New Machine, From Counterculture to Cyberculture, and Tools for Thought: The History and Future of Mind-Expanding Technology. But none can quite capture the magic of what it felt like to live in it and be part of the community. It was enough of a minority group that it felt special to be part of it, where people met and recognized each other as being of a different breed. (Some of that lives on as snobby elitism among the technocrati.)

How that subculture had to grow up into adulthood, losing its innocence, playfulness, and healthy distrust of authority, is not as charming of a story. The wilderness was colonized, domesticated, and consumed by mainstream capitalist culture with its system of values, ethics and principles (or lack thereof).

Yet I remain an optimist, even as the Great Nothing swallows the world around us as we used to know it (as in Die unendliche Geschichte). Just as the World War never really ended the unspeakable evil - it fractured, went underground, and conspired over decades to rise again as a new decentralized and distributed monster, more powerful than ever, threatening to achieve global domination and total control - so did ancient forces of good, the pagans, witches and wizards, survive the persecution by planting seeds over generations. Families, tribes and nations were lost, tongues and stories forgotten, villages and libraries burned to the ground. We arrived at the cyberpunk dystopia of our collective nightmares, but the proof of the pudding is in the eating: everywhere there’s evidence of a lost civilization, the futures we could have had, its genetic material spread across the noösphere, some dormant and waiting for an artist to inspire, others expressing themselves like dandelion flowers breaking through the concrete.

There are historical forces moving through the centuries, of which hacker culture was a temporary manifestation, like the Renaissance, Art Nouveau, Dada, the Surrealists, the Beats. By their nature such movements are ephemeral, emerging like mushrooms overnight when the time and place is right, then just like that they’re gone, leaving people to reminisce in nostalgia, stories so fantastic no one would believe them. There are legends of magic faery cities that appear in the desert one day, a festival of otherworldly beauty and music - and poof, disappear into the air by morning, taking with them fair maidens and lads who ran away with the circus.

“The elves have left Middle Earth” is a phrase sometimes heard even among the furnaces of Mordor, I mean bigcorps of Silicon Valley like Apple Inc., to speak of wizards who built some inspired tech then left the company for greener pastures, hopefully leaving behind enough documentation. Xerox PARC was one of those miracles, a spacetime nexus of collective inspiration, here today and gone tomorrow. What’s great about such temporary autonomous zones and pirate utopias is that they can generate enough insight for following generations to study and implement, as a rich cultural heritage. The re-enchantment of the world is not inevitable but always possible, waiting behind the veil.

3 Likes

Yep. My introduction to personal computing was a decade earlier, in the 8-bit 1980s, with libraries still full of “Creative Computing” magazine, and even the mainstream computing books talking anxiously about the grave dangers of mainframes and centralised identity systems, and how the only guard against tyranny was an informed and computer-literate citizenry with our own personal machines that we could always reprogram.

And now we’re here.

Well, we might have lost the machines in our pockets, but we’ve at least got a whole bunch of second-hand Intel/AMD desktops and notebooks that have to forcibly exit the Windows ecosystem all at once to survive. (For as long as the batteries and other replacement parts last, at least). If there’s ever been or will be a “day of the Linux Desktop”, it’s Patch Tuesday of next week.

1 Like