What Is the Singularity?

By Ted Chu

Posted on: March 17, 2014 / Last Modified: March 17, 2014

I have always had mixed feelings about the Singularity becoming the buzzword for the transhumanist movement. I am glad that this catchy word is spreading the idea of the post-human future, opening the eyes of people who cannot imagine a time when human beings are no longer the most intelligent and most powerful beings in the world. On the other hand, the Singularity seems to represent a future in which technologies completely overwhelm humans and bring unprecedented changes and risks. For most people this is a scary future, one that we cannot understand at all, let alone be a part of. Although technological enthusiasts have tried to convince people that we could become immortal and enjoy material abundance after the Singularity, nobody can deny the existential risks and the dangers of unintended consequences.

That is why in my book, Human Purpose and Transhuman Potential, the concept of Singularity is only mentioned in passing. I would like to thank Nikola Danaylov for giving me this opportunity to briefly discuss a couple of issues related to Singularity.

First, I am not sure the Singularity, as popularly defined by mathematician and science fiction writer Vernor Vinge, is even remotely close. During his Singularity 1 on 1 interview, Vinge states again that the Singularity is a time when our understanding of the new reality will be like a goldfish’s understanding of modern civilization. If that is the case, I cannot believe the Singularity is near. For reference, Ray Kurzweil predicts the Singularity will occur around 2045, whereas Vinge predicts some time before 2030.

It is true that with accelerating technological change and a possible intelligence “explosion,” human beings will understand less and less about technology. But this is nothing radically new. Who among us knows the technical details of the smartphone or the automobile? At the conceptual level, relativity, quantum mechanics, and the number 10 to the power of 100 are so counter-intuitive that few can truly grasp them. But there is something powerful called metaphor, and as long as we can map a complex reality onto something we are familiar with, we manage to make sense of it.

I tend to agree with Michio Kaku that, with luck and conscious effort, we can attain Kardashev Type I civilization status within 100–200 years, which means, among other things, complete mastery of the energy resources on Earth. If we can describe nuclear power and supercomputers to hunter-gatherers in remote corners of the Earth, why wouldn’t future advanced transhumans be able to describe a Type I civilization to us?
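For readers who want a rough sense of what “Type I” means quantitatively, here is a minimal sketch using Carl Sagan’s interpolation of the Kardashev scale, K = (log10(P) - 6) / 10, with P measured in watts. The article itself does not give these numbers; the power figures below (about 2 × 10^13 W for humanity today, 10^16 W for Type I, 10^36 W for Type III) are commonly cited approximations, not figures taken from the book.

```python
import math

def kardashev_level(power_watts: float) -> float:
    """Carl Sagan's interpolation of the Kardashev scale:
    K = (log10(P) - 6) / 10, where P is power harnessed in watts.
    Roughly: Type I ~ 1e16 W (a planet), Type II ~ 1e26 W (a star),
    Type III ~ 1e36 W (a galaxy)."""
    return (math.log10(power_watts) - 6) / 10

# Illustrative figures (approximations, not from the article):
print(f"Humanity today (~2e13 W): K ~ {kardashev_level(2e13):.2f}")   # ~0.73
print(f"Type I threshold (1e16 W): K = {kardashev_level(1e16):.2f}")  # 1.00
print(f"Type III threshold (1e36 W): K = {kardashev_level(1e36):.2f}")  # 3.00
```

On this sketch, humanity currently sits around 0.7 on Sagan’s version of the scale, which is why closing the remaining gap is plausibly a matter of one or two centuries rather than millennia.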

In Chapter 12 of my book, there is a short description of a possible Kardashev Type III scenario, in which future intelligent beings obtain the ability to harness the energy of an entire galaxy. I wrote about A Second Axial Age: “With new knowledge and technologies, CoBe (Cosmic Being) may manage to travel at an ever faster speed, maybe even faster than the speed of light. But unless CoBe can discover or invent instantaneous communication and information processing techniques such as so-called wormholes, space will be an absolute barrier to communication and interaction among distant stars and galaxies. In other words, we only know the history—not the current reality—of others living at great distances in space. The future might then be a return to a ‘tribal environment’ in the sense that the instant communications we have established on Earth might no longer be possible. CoBe would be isolated by space, each ‘tribe’ taking time to learn and to evolve.”

If the human species were kept alive then (at a cost infinitely smaller than what it takes us to keep a goldfish alive today), I am sure that super-smart CoBe would be able to use some kind of metaphor (much better than my Axial Age metaphor) to communicate with humans about the new reality. Since we have already glimpsed the entire universe and its history (what I call the Cosmic View), anything truly incomprehensible would have to be something beyond this universe.

If we define the Singularity simply as accelerating technological change and an intelligence explosion, then it is nothing new. The pace of evolution has been accelerating from the beginning, in both natural and cultural evolution, and this trend should continue with what I call “conscious evolution.” We have been living through an information explosion, and that has been just fine: each of us takes the slice of information we like and blissfully ignores the rest. Much more intelligence will develop, and we will deal with it in more or less the same way. Emphasizing that this is nothing new could go a long way toward lessening the fear of the post-human future.

We should not only point out that this unprecedented future has a rich history, but also make it clear that this future is highly desirable and that making it happen should be our mission. This is the second point I would like to address: the concept of the Singularity fails to provide a positive and transcendental value for us. Modern science has demonstrated that a literal reading of religious scriptures is no longer tenable. However, the secular world has also thrown out the ancient wisdom of transcendental faith and narrowed our goal to maximizing the well-being of humanity.

As I discussed in Chapter 8, this goal of maximizing human happiness is not only unattainable but also runs against the universe’s “will” that we complete the transitory role of our species. As I argued in detail in Chapter 4 of my book, science is not value-free, and neither is (nor should be) the concept of the Singularity.

I have not seen a good argument that the Singularity should be something we focus our efforts on, and I find it very difficult to inject value into the concept as it is popularly defined now. I am glad that the Singularity has created a high level of awareness of the post-human future, but we must move beyond its “neutral,” purely technological character. We know technology is a double-edged sword. We need a weapon, but a flag is more important.


About the Author:

Formerly the chief economist at General Motors, Ted Chu was also chief economist for the Abu Dhabi Investment Authority. For the last 15 years, his second career has been spent conducting research on the philosophical question of humanity’s place in the universe, with special reference to our “posthuman” future. Born and raised in China, Chu earned his Ph.D. in economics at Georgetown University. He is currently a clinical professor of economics at New York University Abu Dhabi.
