In short, collapse computing is about taking advantage of today’s abundance of computing power to prepare for a future in which infrastructures have collapsed. A post-collapse society that has eventually lost all of its artificial computing capacity may still want to continue the practice of computer science for various reasons.
This is not so much to address the question of what computing is for. I thought it might be fun to have a dedicated thread on the topic. There seem to be overlaps between permacomputing and malleable systems, where the knowability of a project, necessary for it to be malleable, goes hand in hand with software preservation and software resilience.
Collapse computing prioritizes community needs and aims to contribute to a knowledge commons in order to sustain the practice of computation through infrastructure collapse. It is the practice of engaging with the discarded, with an eye to transforming what is exhausted and wasted into renewed resources.
CollapseOS’ goals
I think it covers some good points, so I’ll share them:
- Run on minimal and improvised machines.
- Interface through improvised means (serial, keyboard, display).
- Edit text and binary contents.
- Compile assembler source for a wide range of MCUs and CPUs.
- Read and write from a wide range of storage devices.
- Assemble itself and deploy to another machine.
Designing for Descent ensures that a system is resilient to intermittent energy supply and network connectivity. Nothing new needs producing and no e-waste needs processing. If your new software no longer runs on old hardware, it is worse than the old software. Software should function on existing hardware and rely on modularity in order to enable a diversity of combinations and implementations. It is about reinventing essential tools so that they are accessible, scalable, sturdy, modular, easy to repair and well documented.
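To make this concrete, here is a minimal store-and-forward sketch in Python: outgoing messages are persisted to an on-disk queue and only sent when power and a working link happen to coincide. The peer address, port, and queue directory are assumptions invented for the example, not part of any existing system.

```python
import json
import socket
import time
from pathlib import Path

QUEUE_DIR = Path("outbox")  # hypothetical on-disk queue location
QUEUE_DIR.mkdir(exist_ok=True)

PEER = ("peer.local", 7777)  # hypothetical address of the next hop

def enqueue(message: dict) -> None:
    """Persist a message locally; it survives power loss between runs."""
    name = f"{time.time_ns()}.json"
    tmp = QUEUE_DIR / (name + ".tmp")
    tmp.write_text(json.dumps(message))
    tmp.rename(QUEUE_DIR / name)  # atomic rename: no half-written entries

def link_is_up() -> bool:
    """Cheap reachability probe; the peer may be absent most of the time."""
    try:
        with socket.create_connection(PEER, timeout=2):
            return True
    except OSError:
        return False

def flush() -> None:
    """Send queued messages oldest-first whenever the link is up."""
    for entry in sorted(QUEUE_DIR.glob("*.json")):
        if not link_is_up():
            return  # try again the next time we have power and a link
        with socket.create_connection(PEER, timeout=5) as conn:
            conn.sendall(entry.read_bytes())
        entry.unlink()  # delete only after a successful send
```

Nothing here depends on the link ever being fast or reliable; the queue simply grows while offline and drains when a connection appears.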
Collapsology
The following comes from this excellent series on Collapse Computing.
Collapsology studies the ways in which our current global industrial civilization is fragile and how it could collapse. It looks at these systemic risks in a transdisciplinary way, including ecology, economics, politics, sociology, etc., because all of these aspects of our society are interconnected in complex ways. I’m far from an expert on this topic, so I’m leaning heavily on the literature here, primarily Pablo Servigne and Raphaël Stevens’ book “How Everything Can Collapse” (translated from the French original).
So what does climate collapse actually look like? What resources, infrastructure, and organizations are most likely to become inaccessible, degrade, or collapse? In a nutshell: Complex, centralized, interdependent systems.
Failure Scenarios
So to summarize, this is a rough outline of a potential failure scenario, as applied to computing:
- No new hardware: It’s difficult and expensive to get new devices because few or no new ones are being made, or they’re not being sold where you live.
- Limited power: There’s power some of the time, but only a few hours a day or when there’s enough sun for your solar panels. It’s likely that you’ll want to use it for more important things than powering computers though…
- Limited connectivity: There’s still a kind of global internet, but not all countries have access to it due to both degraded infrastructure and geopolitical reasons. You’re able to access a slow connection a few times a month, when you’re in another town nearby.
- No cloud: Apple and Google still exist, but because you don’t have internet access often enough or at sufficient speeds, you can’t install new apps on your iOS/Android devices. The apps you do have on them are largely useless since they assume you always have internet.
Assumptions to Reconsider?
Core parts of the software development workflow are increasingly moving to web-based tools, especially around code forges like Github or Gitlab. Issue management, merge requests, CI, releases, etc. all happen on these platforms, which are primarily or exclusively used via very, very slow websites. It’s hard to overstate this: Code forges are among the slowest, shittiest websites out there, basically unusable unless you have a fast connection.
This is, of course, not resilient at all and a huge problem given that we rely on these tools for many of our key workflows.
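It’s worth remembering that git itself needs none of this web machinery: history, branches, and patch-based review all work locally, and changes can travel over any medium. As a hedged sketch (driven from Python only for consistency with the other examples here; the file and ref names are placeholders), git’s built-in bundle command packs a repository into a single file that can move by USB stick or any other channel, and a bundle can be cloned from like a read-only remote:

```python
import subprocess

def run(*args: str) -> None:
    """Thin wrapper so each git invocation is visible and checked."""
    subprocess.run(args, check=True)

# On the machine that has the repository: pack all branches and history
# into one file that can travel by USB stick, radio link, or sneakernet.
run("git", "bundle", "create", "project.bundle", "--all")

# On the receiving machine: a bundle behaves like a read-only remote.
run("git", "clone", "project.bundle", "project")

# Later, incremental updates: bundle only what the other side lacks.
# (v1.0 is a placeholder for whatever ref they are known to have.)
run("git", "bundle", "create", "update.bundle", "v1.0..main")
```

git format-patch and git send-email fill a similar role for individual changes, which is how kernel development worked long before forges existed.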
However, while free software has structural advantages that make it more resilient than proprietary software, there are problematic aspects of current mainstream technology culture that affect us, too. Examples of assumptions that are pretty deeply ingrained in how most modern software (including free software) is built include:
- Fast internet is always available; offline/low-connectivity is a rare edge case, mostly relevant for travel (the sketch after this list inverts this assumption)
- New, better hardware is always around the corner and will replace the current hardware within a few years
- Using all the resources available (CPU, storage, power, bandwidth) is fine
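As one small example of dropping the first assumption, the sketch below treats a local cache as the primary source of truth and the network as an opportunistic extra. The feed URL and cache path are made up for the illustration:

```python
import json
import urllib.request
from pathlib import Path

CACHE = Path("packages.json")  # the local copy is the source of truth
FEED_URL = "https://example.org/packages.json"  # hypothetical endpoint

def load_index() -> dict:
    """Always serve from the local cache; the network only refreshes it."""
    if CACHE.exists():
        return json.loads(CACHE.read_text())
    return {}

def try_refresh(timeout: float = 5.0) -> bool:
    """Opportunistic update: failure is the normal case, not an error."""
    try:
        with urllib.request.urlopen(FEED_URL, timeout=timeout) as resp:
            CACHE.write_text(resp.read().decode("utf-8"))
        return True
    except OSError:
        return False  # stale data beats no data

index = load_index()   # works with zero connectivity
if try_refresh():      # quietly upgrades when a link happens to be there
    index = load_index()
```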
Build for Repair
In a less connected future it’s possible that substantial development of complex systems software will stop being a thing, because the necessary expertise will not be available in any single place. In such a scenario being able to locally repair and repurpose hardware and software for new uses and local needs is likely to become important.
Repair is a relatively clearly defined problem space for hardware, but for software it’s kind of a foreign concept. The idea of a centralized development team “releasing” software out into the world at scale is built into our tools, technologies, and culture at every level. You generally don’t repair software, because in most cases you don’t even have the source code, and even if you do (and the software doesn’t depend on some server component) there’s always going to be a steep learning curve to being able to make meaningful changes to an unfamiliar code base, even for seasoned programmers.
In a connected world it will therefore always be most efficient to have a centralized development team that maintains a project and makes releases for the general public to use. But with that possibly no longer an option in the future, someone else will end up having to make sure things work as best they can at the local level. I don’t think this will mean most people will start making changes to their own software, but I could see software repair becoming a role for specialized technicians, similar to electricians or car mechanics.
Efficient vs. Repairable
The goals of wanting software to be frugal with resources but also easy to repair are often hard to square. Efficiency is generally achieved by using lower-level technology and having developers do more work to optimize resource use. However, for repairability you want something high-level with short feedback loops and introspection, i.e. the opposite.
An app written and distributed as a single Python file with no external dependencies is probably as good as it gets in terms of repairability, but there are serious limitations to what you can do with such an app and the stack is not known for being resource-efficient. The same applies to other types of accessible programming environments, such as scripts or spreadsheets. When it comes to data, plain text is very flexible and easy to work with (i.e. good for repairability), but it’s less efficient than binary data formats, can’t be queried as easily as a database, etc.
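To make the repairability end of the spectrum concrete, here is a complete (if tiny) app in that spirit: one file, standard library only, plain-text storage. The file path and tab-separated layout are arbitrary choices for the sketch.

```python
#!/usr/bin/env python3
"""notes.py - a single-file, dependency-free note keeper.

Everything a future repairer needs is in this one file, and the data
lives in plain text that any editor, grep, or diff can work with.
"""
import sys
from datetime import date
from pathlib import Path

NOTES = Path.home() / "notes.txt"  # plain text: readable without this program

def add(text: str) -> None:
    """Append one dated, tab-separated line per note."""
    with NOTES.open("a") as f:
        f.write(f"{date.today()}\t{text}\n")

def show() -> None:
    """Print the whole notebook; no query engine, just the file."""
    if NOTES.exists():
        sys.stdout.write(NOTES.read_text())

if __name__ == "__main__":
    if len(sys.argv) > 1:
        add(" ".join(sys.argv[1:]))
    else:
        show()
```

The trade-off is exactly as stated above: this is trivially inspectable and fixable, but it will never be fast, and it stops being simple the moment it needs indexing or concurrency.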
Related Efforts
- Dusk OS
- Tumble Forth
- Public Domain Books to Restart Computer Technology
- Civboot: a civilizational bootstrapper
- Simplifier
- Sci.Electronics.Repair FAQ
- The Vintage Technology Digital Archive
- A big collection of Apple-related documentation
- Daniel Marks’ electronic designs focusing on resilience
- Michael Schierl’s UXN port of Collapse OS
- Deadly Optimism, Useful Pessimism
- UVC Archiving
Articles on Preservation
Avoiding Technological Quicksand: Finding a Viable Technical Foundation for Digital Preservation
Rothenberg, Jeff In: CLIR Reports (Date Created: 15 Jan 1998). Reviewed in: Reality and Chimeras in the Preservation of Electronic Records (listed below)
This report to the Council on Library and Information Resources discusses digital preservation strategies and recommends emulation as the best possible solution to the digital preservation challenge. Rothenberg advocates the development of specifications for building an emulator for a digital document and its associated software, both of which are stored with the document. The work of building the emulator is deferred until the document needs to be accessed.
http://www.clir.org/pubs/reports/rothenberg/contents.html
A blueprint for Representation Information in the OAIS model
Holdsworth, David; Sergeant, Derek In: CURL Exemplars for Digital ARchiveS (CEDARS) Project
This paper describes the current scheme for representation information employed in the CEDARS project.
http://gps0.leeds.ac.uk/~ecldh/cedars/nasa2000/nasa2000.html
Carrying Authentic, Understandable and Usable Digital Records Through Time
Rothenberg, J.; Bikson, T.; RAND-Europe (Date Created: 6 Aug 1999) (The Netherlands)
This report to the Dutch National Archives and Ministry of the Interior presents the results of a study to define a strategy and framework for the long-term management and preservation of digital governmental records. A “testbed” in which specific digital preservation techniques can be prototyped and evaluated is proposed.
http://www.archief.nl/digiduur/bibliotheek/final-report_4.pdf
Digital Preservation & Emulation: From Theory to Practice
Granger, Stewart (Date Created: 2001)
To be presented at the September 2001 ICHIM Conference, this paper examines emulation as a digital preservation strategy. As well as drawing on mathematical theory, the author analyses the practical role of emulation in the preservation of digital materials.
http://dspace.dial.pipex.com/stewartg/sgichim.htm
Digital Preservation: A Report From the Roundtable Held in Munich, 10-11 October 1999
Beagrie, Neil; Elkington, Nancy In: RLG DigiNews (Date Created: 15 Dec 1999) (Germany)
This report about the Roundtable meeting held in Munich in October 1999, hosted by the Bavarian State Library and the German National Library, provides a brief summary of current research and programmes for digital preservation and an overview of the situation within Germany itself.
http://www.rlg.org/preserv/diginews/diginews3-6.html#confrep
In RLG DigiNews Volume 3, Number 6
Digital Preservation: State of the Art November 1999 Update
Fresko, M. (Last Updated: 16 Dec 1999)
This update on previous research was prepared for the November 1999 concertation meeting, Consolidating the European Library Space.
http://www.cordis.lu/libraries/events/fp4ce/speech/hs~cox.html
Digital Rosetta Stone: A Conceptual Model for Maintaining Long-term Access to Digital Documents
Heminger, Alan R.; Robertson, Steven B. (Date Created: 17 Jun 1998)
This document describes a method for digital preservation relying on keeping a “meta-knowledge archive” of how to interpret media formats and file formats to support data recovery and document reconstruction processes.
http://www.ercim.org/publication/ws-proceedings/DELOS6/rosetta.pdf
Emulation: C-ing Ahead
CAMiLEON Project; Holdsworth, David (Date Created: Dec 2000)
A draft paper by David Holdsworth advocating the use of widely available programming languages, such as C, in the implementation of emulation as a digital preservation technique.
http://www.leeds.ac.uk/CAMiLEON/dh/cingahd.html
Emulation as a Digital Preservation Strategy
Granger, Stewart In: D-Lib magazine (Date Created: Oct 2000)
An article focusing on the logistics of using emulation as a digital preservation strategy, contrasted against other possible approaches and user requirements. Extensive analysis is undertaken of Jeff Rothenberg’s advocacy of emulation over other preservation strategies. It discusses the potential advantages of emulation, such as elegance and recreation of “look and feel”, and the possible disadvantages, including that emulation may be problematic due to the undefined nature of technological change and the complexity of creating emulator specifications. The conclusion is that emulation is not a complete digital preservation solution but a partial one.
http://www.dlib.org/dlib/october00/granger/10granger.html
D-Lib Magazine October 2000, Vol. 6, No. 10
Emulation, Preservation and Abstraction
Holdsworth, David; Wheatley, Paul; CAMiLEON Project (Date Created: 2000)
An article by David Holdsworth and Paul Wheatley outlining the advantages of emulation as a digital preservation strategy. The issues of longevity and affordability are stressed, and an example of emulation as applied to the George3 operating system is provided.
http://www.leeds.ac.uk/CAMiLEON/dh/ep5.html
Emulation: RISC’s Secret Weapon
Halfhill, Tom R.; BYTE.com (Date Created: 15 Apr 1994)
This paper is a discussion of emulation technology in a commercial rather than a preservation context. The first section contains a general technical background on how emulation works. The article continues with a detailed discussion of emulation products available at the time (1994), mainly for emulating contemporary machines across competing platforms. The author notes problems with emulation, such as slow speed, but sees potential for its future use.
http://www.byte.com/art/9404/sec8/art3.htm
Byte Special Report, April 1994
An Experiment in Using Emulation to Preserve Digital Publications
Rothenberg, Jeff (Date Created: Apr 2000) (Netherlands)
This report presents the results of a preliminary investigation by the National Library of the Netherlands into the feasibility of using emulation as a means of preserving digital publications in accessible, authentic, and usable form within a deposit library.
http://www.kb.nl/nedlib/results/emulationpreservationreport.pdf
Long-term Preservation of Electronic Publications: the NEDLIB Project
van der Werf-Davelaar, Titia (Koninklijke Bibliotheek, the National Library of the Netherlands) In: D-Lib magazine (Date Created: Sep 1999)
This article describes the NEDLIB project and explains how it is developing a deposit system for electronic publications (DSEP) on the basis of the OAIS model. Metadata for preservation and emulation experiments are also discussed.
http://www.dlib.org/dlib/september99/vanderwerf/09vanderwerf.html
Metadata for digital preservation: an update
Day, Michael In: Ariadne (Date Created: Dec 1999)
In this article, Michael Day reviews recent activities related to the development of metadata schemes for digital preservation. He outlines the emulation and migration-based strategies for digital preservation and the role that preservation metadata is poised to play in both these approaches. He reviews both library and archives-based initiatives in this area. He describes the OAIS model and its implications for the development of preservation metadata.
http://www.ariadne.ac.uk/issue22/metadata/
In Ariadne, Issue 22
Metadata to Support Data Quality and Longevity
Rothenberg, Jeff (Last Updated: 3 Jun 1996)
This paper discusses two key needs for metadata: to support data quality and to ensure the longevity of data. In the latter category, the need for encapsulation with metadata which will enable future emulation is discussed.
http://www.computer.org/conferences/meta96/rothenberg_paper/ieee.data-quality.html
Migration: a CAMiLEON discussion paper
CAMiLEON Project; Wheatley, Paul (Date Created: 2000)
A paper by Paul Wheatley considering the different options for migration as a digital preservation technique.
http://www.personal.leeds.ac.uk/~issprw/camileon/migration.htm
Preservation 2000
Day, Michael In: Ariadne (Date Created: Dec 2000)
This paper is a review of the Preservation 2000 conference and the Information Infrastructures for Digital Information workshop held in York, UK in December 2000.
http://www.ariadne.ac.uk/issue26/metadata/
Preserving Information Forever and a Call for Emulators
Gilheany, Steve; Archive Builders (Date Created: 17 Mar 1998)
This paper reviews some of the issues associated with planning to store information indefinitely. Long-term preservation must include a plan for preserving metadata as well as data. This paper also discusses the need for emulators to permanently preserve the functionality of the computer.
http://www.archivebuilders.com/aba010.html
Presented at the Digital Libraries Conference and Exhibition “The Digital Era: Implications, Challenges and Issues”, 17-20 March 1998, Singapore
Reality and Chimeras in the Preservation of Electronic Records
Bearman, David In: D-Lib magazine (Date Created: 15 Apr 1999)
This magazine column is a response to the report to CLIR by Jeff Rothenberg “Avoiding Technological Quicksand”. Bearman gives a critique of the solution described by Rothenberg and suggests that migration is still the preferred solution for the preservation of electronic records.
http://www.dlib.org/dlib/april99/bearman/04bearman.html
D-Lib Magazine April 1999, Vol 5, No. 4
Using Emulation to Preserve Digital Documents
Rothenberg, Jeff (Date Created: Jul 2000) (Netherlands)
This book considers the problem of how to preserve digital documents from the perspective of the deposit library community. It discusses the theoretical and practical issues involved in using emulation as the preferred preservation strategy. There is also a theoretical cost comparison of emulation and migration.
ISBN: 906259145-0
Koninklijke Bibliotheek, PO Box 90407, 2509 LK The Hague, email: info@kb.nl
Organisations and Websites
The Pixel Rosetta Stone: Packings and Colorspaces
Pirazzi, Chris (Last Updated: 6 Aug 1998)
This site is an example of how bytes and pixels can be described to provide a method of interpreting a bit stream of video.
http://toolbox.sgi.com/TasteOfDT/documents/video/lurker/packings/
Also available at: http://reality.sgi.com/cpirazzi/lg/packings/
Policies, Strategies & Guidelines
A Strategic Policy Framework for Creating and Preserving Digital Collections
Beagrie, Neil; Greenstein, Daniel (Date Created: 14 Jul 1998) (United Kingdom)
This study presents thirteen recommendations in the areas of long-term digital preservation, standards, the policy framework, and future research. Six case studies highlight some of the real-life considerations concerning digital preservation.
http://ahds.ac.uk/manage/framework.htm
Projects and Case Studies
CAMiLEON: Creative Archiving at Michigan and Leeds: Emulating the Old on the New
University of Michigan and University of Leeds (Date Created: Oct 1999) (England/USA)
Project CAMiLEON aims to examine the issues surrounding the implementation of emulation as a digital preservation strategy. The project recognizes emulation’s potential to retain the functionality and “look and feel” of digital objects created on now-obsolete systems. It hopes to develop tools, guidelines and costings for emulation as compared to other digital preservation options.
http://www.si.umich.edu/CAMILEON/index.htm
The Joint NSF/JISC International Digital Libraries Initiative
Wiseman, Norman; Rusbridge, Chris; Griffin, Stephen In: D-Lib magazine (Date Created: Jun 1999)
“The overall goals of the JISC/NSF program are to foster common approaches to shared problems, promote common standards, share expertise and experience and build on complementary organisational strengths and approaches in digital library research.” Practical studies of the emulation strategy for digital preservation are planned as part of a collaborative effort involving the CEDARS project team and researchers at the University of Michigan with funding through the Joint NSF/JISC International Digital Libraries Initiative.
http://www.dlib.org/dlib/june99/06wiseman.html
UPF (Universal Preservation Format) Home Page
UPF, WGBH (United States of America)
Sponsored by the WGBH Educational Foundation and funded in part by a grant from the National Historical Publications and Records Commission of the National Archives, the Universal Preservation Format initiative advocates a platform-independent format that will help make accessible a wide range of data types. The UPF is characterized as “self-described” because it includes, within its metadata, all the technical specifications required to build and rebuild appropriate media browsers to access contained materials throughout time. The project has produced a Recommended Practices document.
http://info.wgbh.org/upf/