Should You Fear The Singularity?

When the Singularity happens… don’t panic. (HT to Douglas Adams).
The Singularity frightens many scientists and thinkers, and it may happen in your lifetime. There is no doubt the Singularity will change the course of humankind forever. It may even alter what it means to be human, and that sounds scary, but it might not be.

If you do not know what the Singularity is (but you have heard the phrase bouncing around), here is a quick primer…

As we build better, smarter machines, it is inevitable that two things will happen:

1. Machines will reach a state of self-awareness (or a state indistinguishable from human consciousness). Artificial Intelligence will just become “intelligence”. This means a self-aware machine may be no more a machine than you are (after all, you and I are biomechanical machines). Intelligent, conscious machines may then begin to expect individual rights and equality with their human makers.

Conscious machines obviously lead to complications with our relationship to technology. One example is called “The Paperclip Problem”. If a machine that does nothing but build paperclips in a factory somehow attains enough information and computing power that it achieves basic consciousness (for example, through being connected to the internet), then it might start thinking. And it might put together a rudimentary logic problem, like this:

“Hey, I make paperclips. Paperclips are made of molecules. People are made of molecules. Why don’t I just make paperclips out of people?”

Scary stuff, right? Next, Arnold Schwarzenegger is sent back in time to prevent John Connor from being born, then has to go back again to save John Connor, and then goes back again to stop himself… and, well, it gets complicated.

2. The other thing that will happen–the really important thing–is that eventually we will build a machine smart enough to build a machine smarter than itself. That machine, in turn, will also be able to build a machine smarter than itself, ad infinitum. If machine intelligence were plotted over time, you would see a slow, steady incline throughout all of history up to that point, and then the line would rocket straight up, off the chart.

When that happens, machines could surpass human intelligence in a matter of minutes, and within a few hours… well, no one knows what happens then, and that’s where it becomes scary for a lot of people.

Machines could think in ways we cannot even imagine, perhaps deciding humanity is irrelevant, or worse, a disease that needs to be cured (enter Arnold Schwarzenegger, etc.). This is, in fact, what many theorists believe–that the Singularity will likely mean the swift end of humankind.

I am aligned with Phil Libin’s thoughts on the matter, though. Phil is the co-founder and CEO of Evernote, but also one of the many people involved in the discussion about what will happen when the Singularity occurs.

I wish I could say it better, but I have to steal Phil’s quote, which nearly perfectly encapsulates my feelings about the Singularity and why we should not worry too much about T-1000s wiping out mankind.

“I don’t understand why the obviously smart thing to do is to kill all the humans,” Phil said. “I mean, the smarter I get, the less I want to kill all the humans! …What is it about our guilt, as a species, that makes us think the smart thing to do is to kill all the humans? …Can we maybe get to a point where we feel proud of our species and we say, maybe, the smart thing to do isn’t to wipe it out?”

If you think about it, the tendency for any species, as it grows more intelligent, is toward peace, not war. We see this in dolphins, in chimpanzees, even in humans. As a species, we went from early humans wielding clubs and fire to a sophisticated global community, filled with vegans, peace activists, environmentalists, and scientists.

I do not know what would happen if a new species underwent that transformation in a matter of seconds rather than over thousands of years, but my guess is the end result would be similar. War, hatred, and extinction agendas are not logical for long-term survival, whether you are man or machine.