This essay is a response to “The End Is A.I.: The Singularity Is Sci-Fi’s Faith-Based Initiative.”
In futurist circles there’s the idea of the Singularity: a point in time when artificial intelligence becomes capable of improving itself, causing it to rapidly surpass human intelligence.
The term “singularity” is inherently misleading.
In physics, a singularity is a point where predictive models break down, as happens inside a black hole. But everything about technology is already unpredictable, and only becoming more so; a Singularity would not mark some new frontier of unpredictability.
Biology has a related term, “exaptation”: the process by which a feature acquires a function for which it was not originally adapted or selected.
Bird feathers are a classic example: first used for insulation, then for gliding, then for flight. You can’t predict that trajectory up front.
You couldn’t have predicted that inventing Facebook in 2004 would lead to the societal uprisings of the Arab Spring in 2011. There are too many factors.
Another serious issue is that humans always merge with their technology. To claim that in 2029 “computers will match and exceed human intelligence” is to ignore that humans are technology.
From arrows to cars to Google to smartphones, we seamlessly fold technology into our lives. “Artificial General Intelligence” is a silly term: just as there is nothing truly artificial about a bird building a nest, or humans building a house, there is nothing artificial about building intelligent machines.
We are intelligent machines, and we will not be wiped out by AGI because we already are AGI.