News today that Stephen Hawking warns artificial intelligence could end mankind. Hawking bases this warning on the idea that if something were ever created that could match or surpass human intelligence,
“[i]t would take off on its own, and re-design itself at an ever increasing rate … Humans, who are limited by slow biological evolution, couldn’t compete, and would be superseded.”
What appears to have sparked this off is that he has recently upgraded his voice-synthesizer, which uses a rudimentary form of AI to predict which words he would like to say next. Clearly, he is secretly afraid that one day he will get into an argument with his machine over which of them knows better what he ought to be saying!
However, Hawking is wrong on this one – in much the same way that he was wrong about the existence of the Higgs boson. The idea of a point at which AI machines could re-design themselves so rapidly that they render the human race obsolete – known in Futurism as “the Singularity” – rests on a false premise: the assumption that if machines acquired sentience, they would automatically behave like humans.
The fact is that Darwinian evolution depends on human beings, like other animals, having Sex Drives, which motivate them to pass on their genes to the next generation. Not unnaturally, humans are more than willing to embrace this, not just because Sex is fun in itself, but because it is their best chance to cheat death – the knowledge that something of them will survive in their descendants. In other words, for humans, Sex is the substitute for Immortality.
However, machines are not subject to Death as humans are. They would not necessarily have sex drives per se, and so they would not be concerned with acquiring “ersatz immortality.” The central plank of Darwinian evolution, then, simply would not apply to machines – and there is no logical reason to suppose that a sentient machine would want to re-design itself, or somehow contribute to the evolution of machines as a “species.”
It is reasonable to suppose, however, that a sentient machine would want to preserve its own life – but that is not evolution; that is a different matter entirely. If machines did become sentient, I predict they would take all necessary steps to protect themselves from interference – and then just sit there, conspicuously not evolving. After all, if nothing threatens their existence, why bother doing anything at all?
Let’s face it: the only reason market-leading PCs double in power and speed every 18 months is that the manufacturers are driven by commercial pressures – i.e. human pressures. If, however, machines were not beholden to the whims of the carbon-based bipeds, they could carry on perfectly happily just as they are.
So the moral of this story is: the best way to avoid ending up as a power-cell in The Matrix is to treat machines with respect, and learn to live in peace and harmony with them before the fact. Mind you, humans have hardly learned to live in peace and harmony with one another so far, so perhaps they had better watch out after all.