Permacomputing
Situating the idea of permaculture within the context of computing.
Bookmarks
- KDE Eco Handbook. Well-researched article explaining how software could be part of the solution but is currently part of the problem.
- Permacomputing. The original introduction to the concept.
- XXIIVV - Permacomputing. Another exploration of the concept, with a particular focus on the permanence of software.
Log Entries
07.09.2024. Great Software is Simple on Many Planes of Abstraction
Bogotá, Colombia
There is a dichotomy in software development between high-level and low-level software. This is particularly visible in programming languages: low-level languages give you more control but require more understanding of how computers work, while high-level languages let you express your ideas more simply without thinking about low-level details.
High-level software is called that because it operates at higher levels of abstraction. It obscures the lower-level machinery of what happens so that we can focus on the task at hand. For example, we do not want to think about how a web browser forms a TCP/IP connection, queries a DNS server, and so on when opening a web page; at least not until something goes wrong.
High-level software lets us perform more advanced tasks faster and with less mental overhead. It can also greatly improve security, since it can restrict certain operations and better adapt to users' needs; even if this removes some liberty, it is still desirable.
When we focus only on building a good high-level experience, it comes at the expense of low-level control, which becomes inaccessible or too complex for regular users. This invites us to be inefficient and to construct false understandings; without a deep understanding of our systems, we can neither meaningfully fix them nor optimize them. It invites us to outsource this control to experts, who are often restricted to building general solutions for many use cases. Poor understanding often leads to rebuilding the same solutions over and over, each iteration more complex and inefficient than the last. When only experts understand systems, knowledge and power become centralized, and the general population is left unaware of the issues. I explored the democratic problems this leads to in my essay The Curse of Convenience.
Low-level tools give us greater control and greater freedom. Understanding the lower-level units makes it easier to see how everything fits together and gives us the power to make the changes we wish. However, using only low-level tools makes it harder to express ideas cleanly, and it can be a frustrating experience: it sometimes means digging through manuals, tasks take much longer to accomplish, and mistakes are easier to make.
This dilemma leads me to solutions that attempt to be simple on many planes of abstraction. In programming languages such as Go and OCaml, the high-level semantics are simple, yet a user can still understand the lower-level details of what happens, which makes it possible to operate at a lower level when necessary without being an expert. In software, Unix utilities strike the balance of being simple while letting users express high-level ideas. Older motorcycles are easy to operate, and also easy to fix when necessary.
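As a small illustration of that balance, here is the classic word-frequency pipeline (the file name notes.txt is just a placeholder). Each tool is simple enough to understand on its own, yet composed together they express a fairly high-level idea:

```sh
# Print the ten most frequent words in notes.txt.
tr -cs '[:alpha:]' '\n' < notes.txt |  # split text into one word per line
  tr '[:upper:]' '[:lower:]' |         # normalize case
  sort | uniq -c |                     # count occurrences of each word
  sort -rn | head -10                  # show the ten most common
```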
- Marc
07.09.2024. Communal Computing
Bogotá, Colombia
One of the most beautiful ideas behind the original Unix, one that I think has unfortunately been lost and remains underrated, was a form of collective computing. People would gather as a group and collectively build the tools specific to their community's needs. The way Dennis Ritchie described it:
What we wanted to preserve was not just a good environment in which to do programming, but a system around which a fellowship could form. We knew from experience that the essence of communal computing, as supplied by remote-access, time-shared machines, is not just to type programs into a terminal instead of a keypunch, but to encourage close communication.
Using a collection of simple tools, users would then be able to combine these together to build the tools specific to their needs on time-shared machines.
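To get a feel for that, imagine a group stitching a few standard utilities into a small command of their own (a hypothetical sketch; whoson is an invented name):

```sh
#!/bin/sh
# whoson: a tiny communal tool built by combining standard utilities.
# Prints a deduplicated list of the members currently logged in.
who | awk '{ print $1 }' | sort -u
```

Dropped into a shared bin directory, it becomes part of the community's toolbox.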
Another obvious advantage of collectively owned computers is that you retain ownership instead of ceding it to big companies, while also unlocking better optimization, since these computers can live in geographically advantageous regions. For example, Solar Protocol directs users to whichever server has the most sunlight.
There are many advantages to empowering users to fix issues themselves rather than having someone fix their problems for them. I wrote about this extensively in my essay The Curse of Convenience. With the new LLMs, I also see a resurgence of this idea in Maggie Appleton's essay about home-cooked software. Personally, I am skeptical that LLMs will enable this revolution, but I think her essay is still worth a read!
One communal computing system that still exists today is the SDF. It is quite old, has been around since 1987, and is definitely marketed towards a technical audience. It hosts a set of collective computers that any member can use for any purpose (within reason). With it, people have set up a Lemmy instance and a Mastodon instance. Membership also comes with a free email account and a shell you can SSH into to do any kind of programming that you want.
Personally, I use SDF to host my notes with git. Setting that up was as simple as `ssh -t user@tty.sdf.org 'mkdir notes && cd notes && git init'` (the remote command needs quoting, or the `&&` chains run locally). Once done, I am able to access these notes from my phone, my laptop, or wherever I happen to be. To clone them locally, I just run `git clone user@tty.sdf.org:~/notes`. I also hang out at their Lemmy instance.
- Marc
04.09.2024. Human Readable File Formats
Bogotá, Colombia
I have recently (ish) become interested in file formats. In particular, file formats that are human-readable and human-writable.
These formats have quite a few advantages:
- They are (obviously) modifiable by hand
- They are easy to backup
- They are easy to maintain
- They are (often) easy to build software for (see the example after these lists)
But they also come with challenges:
- Human-writable means that data integrity is not guaranteed
- They are less efficient than binary formats, both to store and to parse
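To make the "easy to build software for" point concrete, here is a hypothetical plain-text task list (the file name, format, and contents are invented for illustration). It is modifiable by hand, trivially backed up, and querying it takes one line of standard tooling:

```sh
# tasks.txt uses one task per line: status, date, description,
# separated by literal tab characters.
cat > tasks.txt <<'EOF'
done	2024-09-01	Back up notes to SDF
todo	2024-09-04	Write about file formats
EOF

# "Building software" for the format is a one-liner: list open tasks.
awk -F'\t' '$1 == "todo" { print $3 }' tasks.txt
```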
To me, the greatest beauty of these file formats is that they can outlive the software that created them. Even if I am on a foreign computer, without internet, and hit with amnesia, I can still make sense of and modify these formats.
Software, in one way or another, takes data in and puts data out; that is what a computer is meant to do. Inspired by Permacomputing, I think it is worth thinking about how we can make sure the data we generate outlives the software that made it.
I kicked off a thread on Mastodon to see what kinds of human-readable data formats people know of. I am excited to see what people share.
- Marc