We must prepare for superintelligent computers


HUMANS have never encountered a life form more intelligent than themselves, but this will change if we create machines that greatly surpass our cognitive abilities. Then our fate will depend on the will of such a "superintelligence", much as the fate of gorillas today depends more on what we do than on gorillas themselves.


We therefore have reason to be curious about what these superintelligences will want. Is there a way to engineer their motivation systems so that their preferences will coincide with ours? And supposing a superintelligence starts out human-friendly, is there some way to guarantee that it will remain benevolent even as it creates ever more capable successor-versions of itself?


These questions – which are perhaps the most momentous our species will ever confront – call for a new science of advanced artificial agents. Most of the work of answering these questions remains to be done, yet over ...
