One of the world's most renowned physicists is concerned about the safety of living on Earth, and he is urging humanity to take the necessary precautions.
Next 100 Years Could Be Most Dangerous For Inhabitants Of Planet Earth
Stephen Hawking, who has previously said that the next 100 years could be the most dangerous for our planet, is now urging everyone to start preparing, as he fears that nuclear war, global warming, and genetically engineered viruses, among other threats, pose serious risks to humanity.
The scientist now appears firmer in his call for humanity to leave Earth before the year 2117. Hawking is, in fact, making a TV program that examines how humans could leave Earth in order to survive.
Hawking To Appear In BBC Documentary 'Expedition New Earth'
The BBC documentary that Hawking is making is Expedition New Earth, part of the returning TV show Tomorrow's World, which enjoyed considerable popularity in the last century.
It has been 14 years since the future-gazing series was canceled after 38 years on air. The BBC and the scientists involved in the program promise that the new season will be better.
For the documentary, the scientist, along with his former student Christophe Galfard, is set to travel the world to find ways humans can begin preparing to live in outer space.
Colonies On Other Worlds
Hawking will reportedly claim that humans need to colonize another planet within the next 100 years. The scientist has previously said that humanity may be able to survive if it can establish colonies on other worlds, although he has also said he does not think humans will be able to create self-sustaining colonies in space within the next 100 years.
"Although the chance of a disaster to planet Earth in a given year may be quite low, it adds up over time, and becomes a near certainty in the next thousand or ten thousand years," he said. "By that time we should have spread out into space, and to other stars, so a disaster on Earth would not mean the end of the human race."
Concerns Over Artificial Intelligence
Hawking has also spoken about the dangers he perceives in artificial intelligence, warning that AI could become powerful enough to cause the extinction of the human race.
He said that while artificial intelligence may help eradicate disease and poverty and help address climate change, it may also bring a range of unwanted and potentially dangerous developments, such as autonomous weapons, machines with a will of their own, and economic disruption, all of which would be detrimental to the human race.
"I believe there is no deep difference between what can be achieved by a biological brain and what can be achieved by a computer. It therefore follows that computers can, in theory, emulate human intelligence - and exceed it," Hawking said. "In short, the rise of powerful AI will be either the best, or the worst thing, ever to happen to humanity. We do not yet know which."