The effective accelerationist (e/acc) crowd on 𝕏, which seems to be steadily growing, is focused on a single kind of singularity: a techno-capital singularity. This is slightly different from the concept mathematician John von Neumann proposed in the 1950s.

Before we proceed, let’s define what a singularity really is. Mathematically, it’s a point at which an equation becomes undefined or behaves erratically. In other words, it’s a point at which the equation breaks down and becomes unusable.

John von Neumann’s technological singularity is therefore a point at which the equation we use to predict the future of humanity breaks down. The cause is an explosion in technological progress that changes our civilization so fundamentally that we cannot see past it today.

A techno-capital singularity includes both technology and economics in the equation. As Nick Land suggests in his text Meltdown, this is a singularity in which techno-economic progress destroys the current social order and fundamentally changes politics. He goes on to suggest that the biosphere will be turned into a technosphere, but that, I assume, is mere conjecture because, by definition, one cannot see what happens after a singularity.

There are large groups of people arguing for and against accelerationism while ignoring several other potential singularities that lie ahead of us. In this article, I’d like to focus on those. Many of them don’t even have names yet, so you’ll have to make do with the makeshift names I give them.

Sociopolitical Singularity

This is a singularity that involves a fundamental shift in the cultural values, norms, beliefs, and worldviews of most individuals in our civilization. It could be driven by new developments in philosophy and education that lead to radically new and addictive collective experiences. Those experiences could precipitate historical events that impact entire nations.

This kind of singularity suggests that a philosophy department, a religious organization, or a random group of individuals on a social media platform comes up with an ideological contagion, a memetic virus, to which our civilization has no immunity.

These memetic viruses are far from rare. Religion, says Richard Dawkins, is an excellent example: it infected most of our ancestors tens of thousands of years ago. The “tide pod challenge” is a more modern, and far weaker, example.

If some individual comes up with a new memetic virus as strong as religion, there’s no way we can predict what happens to our civilization once it infects most individuals. The ones who refuse to accept it could be considered heretics and, in the worst case, burnt at the stake. In the best case, they might not find any partners to reproduce with.

Environmental Singularity

Our way of life has always been influenced by nature. And until we become a Kardashev Type I civilization, it will continue to be. We’ve already passed several grim milestones in the context of climate change.
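To put Type I in perspective, Carl Sagan proposed a logarithmic interpolation of the Kardashev scale, K = (log₁₀ P − 6) / 10 with P in watts, which places Type I at 10¹⁶ W. A minimal sketch (the ~20 TW figure for humanity’s current power consumption is a rough, order-of-magnitude assumption):

```python
from math import log10

def kardashev_rating(power_watts: float) -> float:
    """Carl Sagan's continuous interpolation of the Kardashev scale.

    K = (log10(P) - 6) / 10, with P in watts; Type I sits at 10^16 W.
    """
    return (log10(power_watts) - 6) / 10

# Humanity currently consumes on the order of 2e13 W (~20 TW) --
# an approximate figure, not an exact one.
print(f"Humanity today: K \u2248 {kardashev_rating(2e13):.2f}")  # roughly 0.7
print(f"Type I threshold: K = {kardashev_rating(1e16):.1f}")   # 1.0
```

By this measure we are still a few orders of magnitude of power output away from Type I, which is the gap the rest of this section worries about.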

We don’t seem to be slowing down. And I, as a proponent of e/acc myself, don’t think it’s a good idea to slow down. We need to reach Type I status as soon as possible. But until we do, we are in a liminal state, and the risks we face will continue to increase rapidly.

We could cross a dangerous tipping point and trigger runaway climate change through feedback loops. That could render huge swathes of the planet uninhabitable, turning them into hot deserts or putting them under water. Losing the ability to grow food for billions of humans would be a disaster on a similar scale. Resource scarcity is known to push societies into wars and other situations detrimental to technological progress. But in reality, there’s no telling how our civilization would cope. We cannot see beyond this singularity either.

I’m going to include cosmic events in this kind of singularity. If our planet were to be hit by a large-enough meteorite or comet, our civilization would crumble instantly.

Astrobiological Singularity

While we discuss artificial superintelligence (ASI) extensively, we ignore extraterrestrial superintelligence. For now, ET intelligence is just as elusive as ASI.

If extraterrestrials were to arrive on our planet, the equation we use to predict the future of our civilization would break down again. Our civilization would be shaped by an intelligence that is alien to us. Regardless of how helpful or hostile they turned out to be, our way of life would change rapidly.

Techno-physical Singularity

If advances in physics were to produce a propulsion system far superior to today’s, our civilization would change in ways we cannot imagine.

Being able to travel at a significant fraction of the speed of light at low cost would unlock the resources of the entire solar system, and possibly even our interstellar neighborhood.
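To make “a significant fraction of the speed of light” concrete, here is a minimal sketch of one-way travel times to Proxima Centauri (distance rounded to 4.24 light-years; the ship is assumed to cruise at constant speed, ignoring acceleration and deceleration). Special relativity says the crew experiences less time than Earth does, by the Lorentz factor:

```python
from math import sqrt

def one_way_times(distance_ly: float, speed_fraction_c: float) -> tuple[float, float]:
    """Return (Earth-frame years, shipboard years) for a constant-speed trip.

    Shipboard time is reduced by the Lorentz factor
    gamma = 1 / sqrt(1 - v^2/c^2).
    """
    earth_years = distance_ly / speed_fraction_c
    gamma = 1 / sqrt(1 - speed_fraction_c ** 2)
    return earth_years, earth_years / gamma

# Proxima Centauri lies roughly 4.24 light-years away.
for v in (0.1, 0.5, 0.9):
    earth, ship = one_way_times(4.24, v)
    print(f"at {v:.1f}c: {earth:.1f} y (Earth frame), {ship:.1f} y (on board)")
```

Even at 0.1c the trip takes about four decades of Earth time, which hints at why a cheap high-fraction-of-c drive would be so transformative.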

Rapid travel to distant celestial bodies would fundamentally transform our concepts of economy and resource management. How is this going to change human relationships and social hierarchies? Hard to guess.

Of course, any advance in physics is likely to trigger a cascade of further advances that we are completely unaware of today. How will those change our society? Those are unknown unknowns.

I could go on. Exploring the realm of hypothetical singularities is my favorite pastime. Note that every singularity we know of, even the techno-capital one, is still only hypothetical. As a programmer, though, the techno-capital singularity is the only one I keep reading about. But our physicists are definitely not sitting idle. Nor have our factories and farms slowed their influence on the climate. And if grabby aliens exist, surely they’re not slowing down either. We have a fun future ahead of us, and all I can say is “accelerate!”
