Stephen Hawking, the celebrity physicist, has said that humanity doesn’t have long left on Earth, citing a number of factors including climate change, nuclear terrorism and the rise of artificial intelligence.
According to Hawking, humans need to start searching for another planet that we will be able to colonize, as our days on planet Earth are numbered. If we remain on Earth beyond the 1,000 years Hawking has predicted, humanity’s extinction is imminent.
The prediction was part of a speech Hawking delivered at the Oxford University Union. He said that humans should continue going into space – much deeper than before – as he believes we will not be able to survive another 1,000 years on Earth without escaping beyond our fragile planet. Hawking blames human activity, which consumes natural resources at unsustainable rates, for pushing our planet towards a cataclysmic end.
Answering a question during the public Q&A session ahead of the annual BBC Reith Lectures, Hawking suggested that humans should leave the planet behind if we want to survive. The key, he noted, was surviving the precarious century ahead.
“Although the chance of a disaster to planet Earth in a given year may be quite low, it adds up over time, and becomes a near certainty in the next thousand or ten thousand years. By that time we should have spread out into space, and to other stars, so a disaster on Earth would not mean the end of the human race.”
Before we have a chance to relocate, Hawking says, we’ll first need to solve the potential threat created by technology. While Hawking thinks technology has the capacity to ensure mankind’s survival, previous statements suggest the cosmologist is simultaneously grappling with the potential threat it poses. When it comes to discussing that threat, Hawking is unmistakably blunt.
“I think the development of full artificial intelligence could spell the end of the human race,” Hawking told the BBC in a 2014 interview that touched upon everything from online privacy to his affinity for his robotic-sounding voice.
Despite its current usefulness, he cautioned, further developing A.I. could prove a fatal mistake.
“Once humans develop artificial intelligence, it will take off on its own and redesign itself at an ever-increasing rate,” Hawking warned in recent months. “Humans, who are limited by slow biological evolution, couldn’t compete and would be superseded.”