Collapse Computing

In short, collapse computing is about taking advantage of today’s abundance of computing power to prepare for a future in which infrastructures have collapsed. A post-collapse society that has eventually lost all of its artificial computing capacity may still want to continue the practice of computer science for various reasons.

This is not so much about addressing the question of what computing is for; I thought it might be fun to have a dedicated thread on the topic. There seems to be overlap between permacomputing and malleable systems: the knowability of a project, necessary for it to be malleable, goes hand in hand with software preservation and software resilience.

Collapse computing prioritizes community needs and aims to contribute to a knowledge commons in order to sustain the practice of computation through infrastructure collapse. It is the practice of engaging with the discarded, with an eye to transforming what is exhausted and wasted into renewed resources.

CollapseOS’ goals

I think it covers some good points, so I’ll share them:

  • Run on minimal and improvised machines.
  • Interface through improvised means (serial, keyboard, display).
  • Edit text and binary contents.
  • Compile assembler source for a wide range of MCUs and CPUs.
  • Read and write from a wide range of storage devices.
  • Assemble itself and deploy to another machine.

Designing for Descent ensures that a system is resilient to intermittent energy supply and network connectivity. Nothing new needs producing and no e-waste needs processing. If your new software no longer runs on old hardware, it is worse than the old software. Software should function on existing hardware and rely on modularity in order to enable a diversity of combinations and implementations. It is about reinventing essential tools so that they are accessible, scalable, sturdy, modular, easy to repair and well documented.
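
To make the “intermittent energy supply” point concrete, here is a minimal sketch (plain Python, standard library only) of one common pattern: checkpoint progress to disk often enough that a sudden power cut costs at most a little rework. The file names and the placeholder `process()` are illustrative assumptions, not part of any existing project.

```python
# Sketch: surviving abrupt power loss by checkpointing progress to disk.
# The write-then-rename dance keeps the checkpoint file valid at all times:
# after a crash it holds either the old state or the new one, never a mix.
import json
import os
from pathlib import Path

CHECKPOINT = Path("state.json")   # hypothetical state file

def load_state() -> dict:
    if CHECKPOINT.exists():
        return json.loads(CHECKPOINT.read_text(encoding="utf-8"))
    return {"next_item": 0}

def save_state(state: dict) -> None:
    tmp = CHECKPOINT.with_suffix(".tmp")
    tmp.write_text(json.dumps(state), encoding="utf-8")
    os.replace(tmp, CHECKPOINT)   # atomic rename on the same filesystem

def process(item: int) -> None:
    pass                          # placeholder for the actual work

if __name__ == "__main__":
    state = load_state()
    for item in range(state["next_item"], 10_000):
        process(item)
        state["next_item"] = item + 1
        if item % 100 == 0:       # checkpoint often enough to lose little
            save_state(state)
    save_state(state)
```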

Collapsology

The following comes from this excellent series on Collapse Computing.

Collapsology studies the ways in which our current global industrial civilization is fragile and how it could collapse. It looks at these systemic risks in a transdisciplinary way, including ecology, economics, politics, sociology, etc., because all of these aspects of our society are interconnected in complex ways. I’m far from an expert on this topic, so I’m leaning heavily on the literature here, primarily Pablo Servigne and Raphaël Stevens’ book How Everything Can Collapse (translated from the French original).

So what does climate collapse actually look like? What resources, infrastructure, and organizations are most likely to become inaccessible, degrade, or collapse? In a nutshell: Complex, centralized, interdependent systems.

Failure Scenarios

So to summarize, this is a rough outline of a potential failure scenario, as applied to computing:

  • No new hardware: It’s difficult and expensive to get new devices because few or no new ones are being made, or they’re not being sold where you live.
  • Limited power: There’s power some of the time, but only a few hours a day or when there’s enough sun for your solar panels. It’s likely that you’ll want to use it for more important things than powering computers though…
  • Limited connectivity: There’s still a kind of global internet, but not all countries have access to it due to both degraded infrastructure and geopolitical reasons. You’re able to access a slow connection a few times a month, when you’re in another town nearby.
  • No cloud: Apple and Google still exist, but because you don’t have internet access often enough or at sufficient speeds, you can’t install new apps on your iOS/Android devices. The apps you do have on them are largely useless since they assume you always have internet.

Assumptions to Reconsider?

Core parts of the software development workflow are increasingly moving to web-based tools, especially around code forges like GitHub or GitLab. Issue management, merge requests, CI, releases, etc. all happen on these platforms, which are primarily or exclusively used via very, very slow websites. It’s hard to overstate this: code forges are among the slowest, shittiest websites out there, basically unusable unless you have a fast connection.

This is, of course, not resilient at all and a huge problem given that we rely on these tools for many of our key workflows.
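
It is worth noting, though, that git itself (as opposed to the forges layered on top of it) was built for disconnected operation. A hedged sketch of what that can look like in practice, using only standard git subcommands driven from Python; the paths (./project, /mnt/usb) and the branch range are assumptions for illustration:

```python
# Sketch: moving git history around without a forge or a network connection.
# `git bundle`, `git format-patch` and `git am` are standard git features;
# the local paths used here are made up for the example.
import subprocess

def git(*args: str, cwd: str = "./project") -> None:
    subprocess.run(["git", *args], cwd=cwd, check=True)

# Pack the whole repository into one file that can travel on removable media
# and be cloned or fetched from on another machine.
git("bundle", "create", "/mnt/usb/project.bundle", "--all")

# Export recent commits as plain-text patches that can be sent over any slow
# or intermittent channel and applied on the other side with `git am`.
git("format-patch", "-o", "/mnt/usb/patches", "origin/main..main")
```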

However, while free software has structural advantages that make it more resilient than proprietary software, there are problematic aspects of current mainstream technology culture that affect us, too. Examples of assumptions that are pretty deeply ingrained in how most modern software (including free software) is built include:

  • Fast internet is always available; offline/low-connectivity is a rare edge case, mostly relevant for travel (see the sketch after this list)
  • New, better hardware is always around the corner and will replace the current hardware within a few years
  • Using all the resources available (CPU, storage, power, bandwidth) is fine
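
As a sketch of the opposite, offline-first stance for the first assumption above: treat the local copy as the primary source and the network as an occasional, best-effort refresh. Python standard library only; the URL and cache path are placeholders.

```python
# Offline-first fetch: the cached copy is the default, the network is a bonus.
import urllib.request
from pathlib import Path

CACHE = Path("cache/feed.xml")            # local, plain-file cache
URL = "http://example.org/feed.xml"       # hypothetical resource

def fetch(url: str = URL, cache: Path = CACHE, timeout: int = 5) -> bytes:
    """Try the network briefly; fall back to (and refresh) the local cache."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            data = resp.read()
        cache.parent.mkdir(parents=True, exist_ok=True)
        cache.write_bytes(data)           # refresh the cache while we can
        return data
    except OSError:
        if cache.exists():
            return cache.read_bytes()     # stale data beats no data
        raise                             # nothing local either

if __name__ == "__main__":
    print(len(fetch()), "bytes (from network or cache)")
```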

Build for Repair

In a less connected future it’s possible that substantial development of complex systems software will stop being a thing, because the necessary expertise will not be available in any single place. In such a scenario being able to locally repair and repurpose hardware and software for new uses and local needs is likely to become important.

Repair is a relatively clearly defined problem space for hardware, but for software it’s kind of a foreign concept. The idea of a centralized development team “releasing” software out into the world at scale is built into our tools, technologies, and culture at every level. You generally don’t repair software, because in most cases you don’t even have the source code, and even if you do (and the software doesn’t depend on some server component) there’s always going to be a steep learning curve to being able to make meaningful changes to an unfamiliar code base, even for seasoned programmers.

In a connected world it will therefore always be most efficient to have a centralized development team that maintains a project and makes releases for the general public to use. But with that possibly no longer an option in the future, someone else will end up having to make sure things work as best they can at the local level. I don’t think this will mean most people will start making changes to their own software, but I could see software repair becoming a role for specialized technicians, similar to electricians or car mechanics.

Efficient vs. Repairable

The goals of wanting software to be frugal with resources but also easy to repair are often hard to square. Efficiency is generally achieved by using lower-level technology and having developers do more work to optimize resource use. However, for repairability you want something high-level with short feedback loops and introspection, i.e. the opposite.

An app written and distributed as a single Python file with no external dependencies is probably as good as it gets in terms of repairability, but there are serious limitations to what you can do with such an app and the stack is not known for being resource-efficient. The same applies to other types of accessible programming environments, such as scripts or spreadsheets. When it comes to data, plain text is very flexible and easy to work with (i.e. good for repairability), but it’s less efficient than binary data formats, can’t be queried as easily as a database, etc.
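
To make that concrete, here is roughly what such a single-file, dependency-free app looks like; everything about it (the file name, the commands) is made up for illustration. Its repairability comes from the obvious properties: one short file, one human-readable data format, nothing to install.

```python
#!/usr/bin/env python3
# notes.py -- a deliberately tiny, single-file, standard-library-only app.
# Usage:  python3 notes.py add "fix the pump"
#         python3 notes.py list
# All state lives in one plain-text file; "repair" means reading ~25 lines.
import sys
from pathlib import Path

NOTES = Path("notes.txt")   # one note per line, newest last

def add(text: str) -> None:
    with NOTES.open("a", encoding="utf-8") as f:
        f.write(text.replace("\n", " ") + "\n")

def list_notes() -> None:
    if NOTES.exists():
        for i, line in enumerate(NOTES.read_text(encoding="utf-8").splitlines(), 1):
            print(f"{i:3}  {line}")

if __name__ == "__main__":
    if len(sys.argv) >= 3 and sys.argv[1] == "add":
        add(" ".join(sys.argv[2:]))
    elif len(sys.argv) == 2 and sys.argv[1] == "list":
        list_notes()
    else:
        print("usage: notes.py add <text> | notes.py list")
```

The trade-offs mentioned above show up even at this scale: appending to and scanning a text file is simple and transparent, but nothing like as fast or as queryable as a real database.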

Related Efforts

Articles on preservation

Carrying Authentic, Understandable and Usable Digital Records Through Time
Rothenberg, J.; Bikson, T.; RAND-Europe (Date Created: 6 Aug 1999) (The Netherlands)
This report to the Dutch National Archives and Ministry of the Interior presents the results of a study to define a strategy and framework for the long-term management and preservation of digital governmental records. A “testbed” in which specific digital preservation techniques can be prototyped and evaluated is proposed.
http://www.archief.nl/digiduur/bibliotheek/final-report_4.pdf

Digital Preservation & Emulation: From Theory to Practice
Granger, Stewart (Date Created: 2001)
To be presented at the September 2001 ICHIM Conference, this paper examines emulation as a digital preservation strategy. As well as drawing on mathematical theory, the author analyses the practical role of emulation in the preservation of digital materials.
http://dspace.dial.pipex.com/stewartg/sgichim.htm

Digital Preservation: A Report From the Roundtable Held in Munich, 10-11 October 1999
Beagrie, Neil; Elkington, Nancy In: RLG DigiNews (Date Created: 15 Dec 1999) (Germany)
This report about the Roundtable meeting held in Munich in October 1999, hosted by the Bavarian State Library and the German National Library, provides a brief summary of current research and programmes for digital preservation and an overview of the situation within Germany itself.
http://www.rlg.org/preserv/diginews/diginews3-6.html#confrep
In RLG DigiNews Volume 3, Number 6

Digital Preservation: State of the Art November 1999 Update
Fresko, M. (Last Updated: 16 Dec 1999)
This update on previous research was prepared for the November 1999 concertation meeting, Consolidating the European Library Space.
http://www.cordis.lu/libraries/events/fp4ce/speech/hs~cox.html

Digital Rosetta Stone: A Conceptual Model for Maintaining Long-term Access to Digital Documents
Heminger, Alan R.; Robertson, Steven B. (Date Created: 17 Jun 1998)
This document describes a method for digital preservation relying on keeping a “meta-knowledge archive” of how to interpret media formats and file formats to support data recovery and document reconstruction processes.
http://www.ercim.org/publication/ws-proceedings/DELOS6/rosetta.pdf

Emulation : C-ing Ahead
CAMiLEON Project; Holdsworth, David (Date Created: Dec 2000)
A draft paper by David Holdsworth advocating the use of widely available programming languages, such as C, in the implementation of emulation as a digital preservation technique.
http://www.leeds.ac.uk/CAMiLEON/dh/cingahd.html

Emulation as a Digital Preservation Strategy
Granger, Stewart In: D-Lib magazine (Date Created: Oct 2000)
An article focusing on the logistics of using emulation as a digital preservation strategy when contrasted against other possible approaches and user requirements. Extensive analysis is undertaken of Jeff Rothenberg’s advocacy of emulation over other preservation strategies. It discusses the potential advantages of emulation, such as elegance and recreation of “look and feel”, and the possible disadvantages, including that emulation may be problematic due to the undefined nature of technological change and the complexity of creating emulator specifications. The conclusion is that emulation is not a complete digital preservation solution, rather a partial one.
http://www.dlib.org/dlib/october00/granger/10granger.html
D-Lib Magazine October 2000, Vol. 6, No. 10

Emulation, Preservation and Abstraction
Holdsworth, David; Wheatley, Paul; CAMiLEON Project (Date Created: 2000)
An article by David Holdsworth and Paul Wheatley outlining the advantages of emulation as a digital preservation strategy. The issues of longevity and affordability are stressed, and an example of emulation as applied to the George3 operating system is provided.
http://www.leeds.ac.uk/CAMiLEON/dh/ep5.html

Emulation: RISC’s Secret Weapon
Halfhill, Tom R.; BYTE.com (Date Created: 15 Apr 1994)
This paper is a discussion of emulation technology in a commercial rather than a preservation context. The first section contains a general technical background about how emulation works. The article continues with a detailed discussion of emulation products available at the time (1994), mainly for emulation of contemporary machines on opposing platforms. The opinion is expressed that there are problematic issues associated with emulation such as slow speed but that it also has potential for future use.
http://www.byte.com/art/9404/sec8/art3.htm
Byte Special Report, April 1994

An Experiment in Using Emulation to Preserve Digital Publications
Rothenberg, Jeff (Date Created: Apr 2000) (Netherlands)
This report presents the results of a preliminary investigation by the National Library of the Netherlands into the feasibility of using emulation as a means of preserving digital publications in accessible, authentic, and usable form within a deposit library.
http://www.kb.nl/nedlib/results/emulationpreservationreport.pdf

Long-term Preservation of Electronic Publications: the NEDLIB Project
Titia van der Werf-Davelaar, Koninklijke Bibliotheek, The National Library of the Netherlands In: D-Lib magazine (Date Created: Sep 1999)
This article describes the NEDLIB project and explains how it is developing a deposit system for electronic publications (DSEP) on the basis of the OAIS model. Metadata for preservation and emulation experiments are also discussed.
http://www.dlib.org/dlib/september99/vanderwerf/09vanderwerf.html

Metadata for digital preservation: an update
Day, Michael In: Ariadne (Date Created: Dec 1999)
In this article, Michael Day reviews recent activities related to the development of metadata schemes for digital preservation. He outlines the emulation and migration-based strategies for digital preservation and the role that preservation metadata is poised to play in both these approaches. He reviews both library and archives-based initiatives in this area. He describes the OAIS model and its implications for the development of preservation metadata.
http://www.ariadne.ac.uk/issue22/metadata/
In Ariadne, Issue 22

Metadata to Support Data Quality and Longevity
Rothenberg, Jeff (Last Updated: 3 Jun 1996)
This paper discusses two key needs for metadata: to support data quality and to ensure the longevity of data. In the latter category, the need for encapsulation with metadata which will enable future emulation is discussed.
http://www.computer.org/conferences/meta96/rothenberg_paper/ieee.data-quality.html

Migration : a CAMiLEON discussion paper
CAMiLEON Project; Wheatley, Paul (Date Created: 2000)
A paper by Paul Wheatley considering the different options for migration as a digital preservation technique.
http://www.personal.leeds.ac.uk/~issprw/camileon/migration.htm

Preservation 2000
Day, Michael In: Ariadne (Date Created: Dec 2000)
This paper is a review of the Preservation 2000 conference and the Information Infrastructures for Digital Information workshop held in York, UK in December 2000.
http://www.ariadne.ac.uk/issue26/metadata/

Preserving Information Forever and a Call for Emulators
Gilheany, Steve; Archive Builders (Date Created: 17 Mar 1998)
This paper reviews some of the issues associated with planning to store information indefinitely. Long-term preservation must include a plan for preserving metadata as well as data. This paper also discusses the need for emulators to permanently preserve the functionality of the computer.
http://www.archivebuilders.com/aba010.html
Presented at the Digital Libraries Conference and Exhibition The Digital Era: Implications, Challenges and Issues, 17-20 March 1998, Singapore

Reality and Chimeras in the Preservation of Electronic Records
Bearman, David In: D-Lib magazine (Date Created: 15 Apr 1999)
This magazine column is a response to the report to CLIR by Jeff Rothenberg “Avoiding Technological Quicksand”. Bearman gives a critique of the solution described by Rothenberg and suggests that migration is still the preferred solution for the preservation of electronic records.
http://www.dlib.org/dlib/april99/bearman/04bearman.html
D-Lib Magazine April 1999, Vol 5, No. 4

Using Emulation to Preserve Digital Documents
Rothenberg, Jeff (Date Created: Jul 2000) (Netherlands)
This book considers the problem of how to preserve digital documents from the perspective of the deposit library community. It discusses the theoretical and practical issues involved in using emulation as the preferred preservation strategy. There is also a theoretical cost comparison of emulation and migration.
ISBN: 906259145-0
Koninklijke Bibliotheek, PO Box 90407, 2509 LK The Hague, email: info@kb.nl

Organisations and Websites

The Pixel Rosetta Stone: Packings and Colorspaces
Pirazzi, Chris (Last Updated: 6 Aug 1998)
This site is an example of how bytes and pixels can be described to provide a method of interpreting a bit stream of video.
http://toolbox.sgi.com/TasteOfDT/documents/video/lurker/packings/
Also available at: http://reality.sgi.com/cpirazzi/lg/packings/

Policies, Strategies & Guidelines

A Strategic Policy Framework for Creating and Preserving Digital Collections
Beagrie, Neil; Greenstein, Daniel (Date Created: 14 Jul 1998) (United Kingdom)
This study presents thirteen recommendations in the areas of long-term digital preservation, standards, the policy framework, and future research. Six case studies highlight some of the real-life considerations concerning digital preservation.
http://ahds.ac.uk/manage/framework.htm

Projects and Case Studies

CAMiLEON: Creative Archiving at Michigan and Leeds: Emulating the Old on the New
University of Michigan and University of Leeds (Date Created: Oct 1999) (England/USA)
Project CAMiLEON (Creative Archiving at Michigan & Leeds: Emulating the Old on the New) is aiming to look at the issues surrounding the implementation of technology emulation as a digital preservation strategy. The project recognizes emulation’s potential to retain the functionality and “look and feel” of digital objects created on now obsolete systems. It hopes to develop tools, guidelines and costings for emulation as compared to other digital preservation options.
http://www.si.umich.edu/CAMILEON/index.htm

The Joint NSF/JISC International Digital Libraries Initiative
Wiseman, Norman; Chris Rusbridge; Griffin, Stephen In: D-Lib magazine (Date Created: Jun 1999)
“The overall goals of the JISC/NSF program are to foster common approaches to shared problems, promote common standards, share expertise and experience and build on complementary organisational strengths and approaches in digital library research.” Practical studies of the emulation strategy for digital preservation are planned as part of a collaborative effort involving the CEDARS project team and researchers at the University of Michigan with funding through the Joint NSF/JISC International Digital Libraries Initiative.
http://www.dlib.org/dlib/june99/06wiseman.html

UPF (Universal Preservation Format) Home Page
UPF, WGBH (United States of America)
Sponsored by the WGBH Educational Foundation and funded in part by a grant from the National Historical Publications and Records Commission of the National Archives, the Universal Preservation Format initiative advocates a platform-independent format that will help make accessible a wide range of data types. The UPF is characterized as “self-described” because it includes, within its metadata, all the technical specifications required to build and rebuild appropriate media browsers to access contained materials throughout time. The project has produced a Recommended Practices document.
http://info.wgbh.org/upf/

6 Likes

This is excellent and these concerns are very much what drives me.

So much this. Even here in New Zealand, not really even being “off the grid”, it is very easy to take a five-minute drive from a city, wander into a low-cellphone-coverage zone, and suddenly WHAM all your mobile apps turn into paperweights. And when mobile internet is available, it’s very expensive (much more so here than in the USA, I believe). On mobile, you live like a nomad, shuttling from one WiFi hotspot to another, and yet a mobile OS just isn’t built for this at all.

Everything in software being routed through San Francisco means that app developers massively overestimate how fast, cheap, available, and trustworthy Internet connectivity is for people in the rest of the planet.

“Repair is a relatively clearly defined problem space for hardware, but for software it’s kind of a foreign concept. […] You generally don’t repair software, because in most cases you don’t even have the source code.”

And this, yes. How do we create a culture of repairable software? Free Software / Open Source tried to do this, and modifying your software not being strictly illegal is a first step, but we still need languages and OSes designed for repair.

In my opinion, one force that’s worked strongly against repairable software is the “design” movement. As a movement, it seems to be based on the idea that “the user does not in fact know best, and needs to have choices taken away for their own good”. I don’t know why this doctrine became popular even in the Open Source world, but it did. I point to GNOME 2 in 2002 as when this idea started to take hold. Specifically, the years between GNOME 2 (2002) and Ubuntu 4.10 (2004) - does anyone else remember the argument over “spatial browsing”, and that the success of Canonical and Ubuntu was largely a rebellion against the GNOME 2 team forcing MacOS-like “spatial” semantics on a userbase which found it unusable? (And then Canonical went on to act like a mini-Apple and force other unwelcome changes on users.)

I really want to understand what happened with GNOME 2 and why such a huge disconnect arose so early between users and professional “user experience designers”. At one level, I understand that it happened because KDE had licensing issues, GNOME 1 was criticized for being too complex and user-hostile, there was a race to try to displace Windows from the desktop, and so everything that wasn’t “selling” Linux as a drop-in copy of Windows was jettisoned. At another level, there was a lot of starry-eyed admiration for Apple’s OS X in the early-2000s open source world. But ultimately this was still a failure of the open vision, and designers led the way (as they did at Apple) away from openness to locked-down, curated “user experiences” with no user-serviceable parts inside. To make software repairable again, we need to somehow unpick the parts of this shift that went wrong while leaving the parts that went right.

1 Like

Thanks a lot @neauoire for posting this. I guess I’ll need a few days to digest all this, not counting the following few weeks for exploring the links!

So far, I had followed collapse computing from a long distance, more like a curiosity. I hadn’t realized how much its more concrete subgoals overlap with what I would like to see happen for different reasons (mostly related to shifting agency from large institutions to small ones, including individuals as the extreme case).

3 Likes

I finally managed to have at least a quick look at everything referenced here. Lots of good stuff.

My personal view of collapse is that it is already underway, though progressing slowly for now. It started around the year 2000, the initial driver being the overexploitation of what modern management calls “human resources”. Contrary to popular discourses about “capitalism” being the villain, I see overexploitation as a much more wide-ranging phenomenon, touching not only workers and employees. As an example, here in France, members of parliament work until late at night reviewing and voting on law proposals, and yet complain that they often hardly have enough time to read the proposals from top to bottom. Then Covid came along and showed us how little resilience our current economic and administrative structures have. And the Ukraine war illustrates where fights over energy and material resources will lead in the coming years.

I had filed away “collapse computing” as people thinking about how to salvage old equipment in a scenario of rapid decline. I don’t find that particularly interesting to consider. If rapid decline ever happens, then I doubt that using the last working computers is of much interest to the population.

What I care about in the relation of computing to collapse is how computing and communication technology can be put in the service of emerging more resilient social structures, to enable them to take over when nation states and corporations crumble under their own weight. What I found in the various resources in this thread is a lot of overlap with ongoing collapse computing efforts. Even the most extreme ones (CollapseOS for example) also enable smaller structures to manage sovereign computing technology, except for building the hardware. In retrospect, this isn’t that surprising, the common aspect being the non-dependence on heavy infrastructure (cloud etc.).

3 Likes

My interest in this topic comes from living on the ocean, often away from things. It gives us a preview, in short bursts, of what salvage and collapse computing might look like. The lack of internet for weeks and months at a time, and the nearest computer store (we don’t have a mailing address) being miles and miles away, has us really consider the four concepts of resilience.

Ultimately, our take on this is that computers are not made to last, and repurposing them is near impossible without equally or more complicated tech, so we have paper copies (charts, books, navigation aids, almanacs, etc.) of everything that we need to get to where we want to go. I haven’t yet found truly resilient computers that might survive anything like a prolonged supply chain collapse.

The PADI resources I’ve shared above (which now only exist on the Wayback Machine) are quite similar to UNESCO’s digital preservation portal (which is now offline) in showing that even people with resources fail to create a truly resilient model of computing.

Collapse computing is the friends we made along the way.

2 Likes

NASA knew how to make computers that last, such as the one in Voyager. Mass-market electronics can’t be relied on for more than 20 years, and today’s computers probably even less. Paper is definitely more durable!

2 Likes

Me too. This is the goal.

On collapse computing, I have friends who are actually against technology altogether, and I am trying to convince them that technology under our control and direction is useful. I like the term convivial technology, but I also like the absence of technology in big parts of our time, to connect with nature and our physical bodies.

We need to discuss two things:
1) what convivial technology is;
2) how technology will allow us to create the social structures that will replace nations, enterprises, etc. I have heard that a social structure needs to be at least as complex as its environment. How can we enable that?

4 Likes

Point 1 deserves a distinct thread in this forum. But not today…

As for point 2: the main argument I am aware of for lower limits to complexity is Ashby’s law of requisite variety from cybernetics. It’s about full control over a system, in a technological setting, so it doesn’t apply directly to social structures, but the general idea still holds: control over an aspect of the world requires a good-enough model of that aspect of the world.

That’s in fact why I expect nation states to crumble under their own weight. The structure of today’s Western representative democracies goes back to the 18th century. Since then, nation states have seriously enlarged their scope of control. In the 18th and 19th centuries, states didn’t care about the economy, nor about the health of their citizens, to name but two aspects. The governance structures from back then don’t scale up to the complexity of today’s world.

1 Like

States can be extended with distributed sensors and limited distributed control, so the question that will arise is whether or not we want to replace the control of the state with new democratic structures.

For example, Dirk Helbing wants to augment the state.

I don’t know Helbing’s work specifically, but in general, yes, evolving the state is a possible way out. In a way, that has already happened in the past. Representative democracy was set up to deal with political issues, i.e. conflicts of interest, collective vs. individual interests, etc. It is not well adapted to questions requiring expertise. Therefore today’s nation states consult experts, but those experts are not under democratic supervision. We have seen in the Covid pandemic how this arrangement fails when there is insufficient consensus on some question.

Concerning the future, I doubt that states can be extended fast enough to deal with all of today’s issues, and if they are, it’s likely to happen undemocratically, as with experts. Unless broad consensus can be established (which, given social media, I doubt), this won’t work out.

That is what they want to do: have experts, or distributed systems from which the state can extract information.

FuturICT was the better project. There were others, which I do not remember now, that were purely extractive, with full control of the data.

I agree though that we do have an opportunity here, because of the inability of the current social structures to adapt to the environment. This is my main point when they ask me why I believe that what I do will have any effect.

Most of all, I don’t believe in the possibility of designing social systems top-down. An existing social structure can create a smaller annex by design, as in states naming expert committees. But important structural changes happen by bottom-up growth. Which is why I want small structures to have access to computing technology that they can make work as they see fit.

3 Likes

I agree in general, but I would also welcome designs that are done democratically, experts supporting the public.

For example, how do we design a public park or an airport or traffic lights? We should put people at the forefront of the process, but experts are needed.

My main concern with most complex-systems research is that it ignores people and democracy altogether.

2 Likes

It’s not an easy problem to solve. Ideally, you want experts who are trusted by the citizens but who also have verifiable expertise. I have seen various proposals, but the only way to move forward would be to test them in real life, in a small setting, and that is not happening. Doing experiments with governance structures is absolutely not part of our culture. And of course it is not in the interest of those who are in power.

2 Likes

As a follow-up to this, I found an interesting article about making the best of the internet as you get it in Antarctica, found on Hacker News here:

2 Likes