I think it, or something like it, will happen within 30 years or so.
The Singularity is the idea that technological progress is accelerating and that soon, progress will be so fast and its effects so profound that it will result in an irreversible change in what it means to be human; moreover, things will become unpredictable and incomprehensible to ordinary humans.
That's the broad definition, since different people think the Singularity will result in different things. For example, Ray Kurzweil makes specific predictions: humans will merge with machines, and over time, more of our brains will be made of nonbiological components. Ultimately, this intelligence will spread through the galaxy until all of the matter in the universe becomes intelligent.
Maybe he's right, but it seems too much like romantic speculation. Rather, I think the better way is to see the events leading to the Singularity as being caused by an intelligence explosion, not by the simple advent of AI that is as intelligent as humans. I think within five years there's going to be a cognitive revolution akin to the revolution that Crick and Watson sparked with DNA. We're going to finally get a real grasp on how memory and intelligence work. When that happens, we'll find ways to greatly augment our own intelligence and memory with drugs, gene therapy, and brain implants. We'll see an intelligence explosion in ourselves before it happens with machines.
Then, spurred by our augmented intelligence, progress will accelerate even further. As the impact of each successive technological paradigm becomes more disruptive, there will be a greater potential for disaster. So we might accidentally destroy ourselves before the Singularity even gets a chance to happen.
- I'd say there's a 20% chance that we'll be an extinct species in 30 years.
- Within 30 years, there's a 10% chance that something calamitous will happen that knocks us into a fallen state, forcing us to start civilization over again, like in a post-apocalyptic movie.
- There's a 70% chance the Singularity, or something like it, will happen.
b.) If not, what do you make of the fact that so many well-known and successful technology businesspeople and organizations take the Singularity seriously? Larry Page and Sergey Brin of Google, NASA, Bill Gates, Ray Kurzweil and the executives of Intel are all people and organizations who are sober, intelligent, and have privileged insight into technology. They all agree that this is where we are going.