Portability

Giving us the power to choose where we run our software.

Log Entries

24.03.2025 // Portability and Resistance

Uffculme, England

Portability is a very underrated quality.

Typically, we rely on a handful of tools to get on with our lives: a GPS, a messaging platform, a design tool, and so on. When a tool is only available on one platform, we are coerced into using that platform and ceding our agency to it. When our software is portable, however, we are free to leave a platform and choose another. Portability therefore empowers the user, giving us a voice (and a choice).

Portability and compatibility tend to be the lowest of priorities for a tech company, unless it finds itself obliged to build software with these two qualities in mind.

One way to make modern toolmakers prioritise portability and compatibility is by setting regulation and standards. While some argue against this type of imposition on businesses, portability and compatibility standards would also benefit companies, as they too would be able to make use of the diverse tools built around those standards. However, companies would need to relinquish control, the very same control that has been taken away from users. This creates an interesting phenomenon: adding some restrictions to how we run businesses allows us more choice and freedom in other ways.

Another force that can provoke change is consumer demand and the market. The dominance of Internet Explorer once meant that developers built only for Internet Explorer's specifications and quirks, making access to the internet via other browsers more difficult, and at times impossible. Internet Explorer could also leverage its dominance to set standards that perpetuated its monopoly and, in a way, forced individuals to use it. However, the growth of alternative browsers eventually forced developers to account for more than one browser. It also forced Internet Explorer to start playing nice, as it could no longer single-handedly dictate the established standards.

It might be worth asking ourselves how portable a piece of software is when we choose to use it, and reflecting on the degree to which depending on it can lock us into one way of doing things or into a closed ecosystem controlled by a single vendor. And not just the software itself, but also the data it generates.

- Marc

07.09.2024 // Great Software is Simple on Many Planes of Abstraction

Bogotá, Colombia

There is a dichotomy in software development: high-level and low-level software. This is particularly true for programming languages, where you have low-level languages that give you more control but require more understanding of how computers work, and high-level languages that let you express your ideas more simply without thinking about low-level details.

High-level software is called that because it operates at higher levels of abstraction. This means it obscures the machinery of what happens within a computer (the "low-level"), so that we can focus on the task at hand. For example, we would not want to think about how a web browser forms a TCP/IP connection, communicates with a DNS server, etc., when opening a web page (at least, not until we encounter an error).

High-level software lets us perform more advanced tasks more quickly and with less mental overhead. It can also bring a great deal of improved security, as it can restrict certain operations and better adapt to users' needs, which, even if it deprives users of some liberty, can still be desirable.

However, when we focus only on building a good high-level experience, it comes at the expense of low-level control, which becomes inaccessible or too complex for regular users. This invites us to be inefficient and to construct false understandings. Without a deep understanding of our systems, we can neither meaningfully fix them nor optimize them. It invites us to outsource this control to experts, who are often restricted to building general solutions for many use cases. Poor understanding often leads to rebuilding the same solutions over and over again, with the solutions becoming more complex and inefficient each time. When only experts understand systems, knowledge and power are centralized away from the general population.

I explored the democratic problems this leads to in my essay The Curse of Convenience.

Low-level tools give us greater control and greater freedom. Understanding the lower-level units makes it easier to see how everything fits together, and gives us the power to make the changes we wish. However, using only low-level tools makes it more difficult to express ideas cleanly, and it can be a frustrating experience: it sometimes requires digging through manuals, tasks take much longer to accomplish, and it is easier to make mistakes.

This dilemma leads me to solutions that attempt to be simple on many planes of abstraction. In programming languages such as Go and OCaml, the high-level semantics are simple, but a user can still understand the lower-level details of what happens, which makes it possible to operate at a lower level when necessary without being an expert. In software, Unix utilities strike the balance of being simple while still allowing users to express high-level ideas, as the sketch below illustrates. Older motorcycles are easy to operate, and also easy to fix when necessary.
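
To make the Unix point concrete, here is the classic word-frequency pipeline (the file name is just an example, not from the original post): each tool in the chain is small and easy to understand on its own, yet their composition expresses a fairly high-level idea.

    # ten most frequent words in a text file, built from small single-purpose tools:
    # split into one word per line, lowercase, count occurrences, rank, take the top ten
    tr -cs '[:alpha:]' '\n' < essay.txt | tr '[:upper:]' '[:lower:]' | sort | uniq -c | sort -rn | head -10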

- Marc

07.09.2024 // Communal Computing

Bogotá, Colombia

One of the most beautiful ideas behind the original Unix, one that I think has unfortunately been lost and is now underrated, is a form of collective computing. People would gather as a group and collectively build tools. The way Dennis Ritchie described it:

"What we wanted to preserve was not just a good environment in which to do programming, but a system around which a fellowship could form. We knew from experience that the essence of communal computing, as supplied by remote-access, time-shared machines, is not just to type programs into a terminal instead of a keypunch, but to encourage close communication."

Using a collection of simple tools, users would then be able to bring these together on time-shared machines and build solutions to meet the needs of their communities.

[Image: people sitting in a circle typing on typewriters, connected to the same computer.]
Fig 1. Timesharing a Wang 3300 Basic.

Another obvious advantage of collectively owned computers is that you retain ownership, rather than ceding it to bigger companies, while still unlocking the better optimization that scale permits. For instance, these collective computers can live in geographically advantageous regions: Solar Protocol directs users to whichever of its servers has the most sunlight.

There are a lot of advantages to empowering users to fix issues themselves, rather than having someone fix their problems for them. I wrote about it extensively in my essay The Curse of Convenience. I also see a resurgence of this idea, now driven by LLMs, in Maggie Appleton's essay about home-cooked software. Personally, I am skeptical that LLMs will enable this revolution, but I think her essay is still worth a read!

There is a communal computing system that still exists today: the SDF. It has been around since 1987, and it is definitely marketed towards a technical audience. It hosts a set of collective computers that any member can use for any purpose (within reason). With it, people have set up a Lemmy instance and a Mastodon instance. It also comes with a free email account and a shell you can SSH into to do any kind of programming you want.

Personally, I use SDF to host my notes with git. Doing that was as simple as ssh -t user@tty.sdf.org 'mkdir notes && cd notes && git init' (the quotes matter, so that the whole chain runs on the remote machine rather than locally). Once done, I am able to access these notes from my phone or my laptop, wherever I happen to be. To clone the repository locally, I just run git clone user@tty.sdf.org:~/notes. I also hang out at their Lemmy instance.
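
If you want to replicate this and also push changes back from your clones without extra server-side configuration, a slight variant of the above is to make the repository on SDF a bare one. Roughly (the username and repository name are placeholders):

    # one-time setup on the server: a bare repository accepts pushes cleanly
    ssh user@tty.sdf.org 'git init --bare notes'

    # on any device with ssh and git: clone, edit, commit, sync
    git clone user@tty.sdf.org:~/notes
    cd notes
    # ...create or edit note files...
    git add -A && git commit -m "update notes"
    git push -u origin HEAD    # -u only matters for the very first push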

[Image: terminal screen showing "Welcome to SDF Public Access UNIX system".]
Fig 2. SDF Public Access System.

- Marc

28.08.2024 // Social Apps with Email

Bogotá, Colombia

Email combined with isync makes it possible to access email offline and have it synced at regular intervals.
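
As a rough sketch, once a channel is configured in ~/.mbsyncrc, keeping the mailbox fresh can be as crude as looping isync's mbsync command:

    # pull down all configured channels every 15 minutes
    while true; do mbsync -a; sleep 900; done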

I looked around for options to build a shared TODO list with Andrea, and sometimes the best solution is the one right in front of you. All the local solutions I have used in the past made sharing difficult, and neither of us wanted to sign up for a third-party service or download an app just for TODO items. Then, a few days ago, I started thinking about how I have always shared links with myself in the past: through email.

Well, thanks to Fastmail's web filters, I was able to set up a specific email address that Andrea and I can use to share TODO items with each other. All emails sent to that address end up in my TODO inbox. How do I share TODO items? Just add Andrea on CC and it's done. No sign-up to a new service needed. When an item is complete, I reply "done" and my email rule automatically moves it to my Done archive.

Similarly, I use this as my social bookmarking service. I have a special email address and an email rule so that when that address receives links from the right people, those links automatically end up in my links archive.

These two solutions also work across devices. Every device I own has email, so sharing across BSD, iOS, and Android is trivial.

This could be extended further with interfaces that operate on the isync directory. You could then have TODO apps that use email as a backend, and what is nice is that people would not need to download an app to use it, making it a form of "progressive enhancement".
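
To sketch what such an interface could look like, suppose mbsync drops the shared TODO folder into a Maildir at ~/Mail/TODO (a path I am making up for the example); a read-only "TODO app" is then little more than a couple of shell one-liners:

    # list open items: one Subject line per message in the synced Maildir
    grep -h '^Subject:' ~/Mail/TODO/cur/* ~/Mail/TODO/new/* 2>/dev/null | sed 's/^Subject: *//'

    # count how many items are still open
    find ~/Mail/TODO/cur ~/Mail/TODO/new -type f 2>/dev/null | wc -l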

- Marc

20.08.2024 // Local-first Software I

Bogotá, Colombia

I read an interesting article on local-first software. I think it perfectly summarizes the issue with cloud software and the need for more offline-friendly software.

I am happy the authors mentioned Git, as it is a prime example of successful offline-friendly collaboration. GitHub, of course, ruins it slightly by making pull requests an online-only affair.

I was a bit sad, though, that the article did not mention the DVCS Fossil, which has chat and forums built in that auto-sync when you go online. It is also extremely easy to self-host.

In conversations about local-first software, I often find email and ActivityPub to be underrated as well. Email is offline-friendly; I have mine synced offline. I also think you could extend it with apps built on top, like a TODO list app. ActivityPub could push the envelope even further, as it is a system for streaming activities: you could have those activities signed locally and sent once you go back online.

- Marc