Humans must leave Earth in the next 200 years if we want to survive.
That was the stark warning issued by Professor Stephen Hawking in the months before his death today at the age of 76.
The legendary physicist believed that life on Earth could be wiped out by a disaster such as an asteroid strike, AI or an alien invasion.
He also warned that over-population, human aggression and climate change could cause humanity to self-destruct.
He believed that, if our species had any hope of survival, future generations would need to forge a new life in space.
One of Hawking’s main fears for the planet was global warming.
‘Our physical resources are being drained, at an alarming rate. We have given our planet the disastrous gift of climate change,’ Hawking warned in July.
‘Rising temperatures, reduction of the polar ice caps, deforestation, and decimation of animal species. We can be an ignorant, unthinking lot.’
Hawking said that Earth will one day look like the 460°C (860°F) planet Venus if we don’t cut greenhouse gas emissions.
‘Next time you meet a climate change denier, tell them to take a trip to Venus. I will pay the fare,’ Hawking quipped.
The physicist also believed President Donald Trump’s decision to withdraw from the Paris Climate Agreement had doomed our planet.
He warned Trump’s decision would cause avoidable damage to our ‘beautiful planet’ for generations to come.
‘We are close to the tipping point where global warming becomes irreversible,’ the celebrated scientist told the BBC last year.
If global warming doesn’t wipe us out, Hawking believed Earth would be destroyed by an asteroid strike.
‘This is not science fiction. It is guaranteed by the laws of physics and probability,’ he said.
‘To stay risks being annihilated.
‘Spreading out into space will completely change the future of humanity. It may also determine whether we have any future at all.’
Hawking was working with Russian billionaire Yuri Milner’s Breakthrough Starshot project to send a fleet of tiny ‘nanocraft’ carrying light sails on a four light-year journey to Alpha Centauri, the nearest star system to Earth.
‘If we succeed we will send a probe to Alpha Centauri within the lifetime of some of you alive today,’ he said.
Astronomers estimate that there is a reasonable chance of an Earth-like planet existing in the ‘habitable zones’ of Alpha Centauri’s three-star system.
‘It is clear we are entering a new space age. We are standing at the threshold of a new era’, said Hawking.
‘Human colonisation of other planets is no longer science fiction, it can be science fact.’
Hawking believed that, in the long run, the human race should not have all its eggs in one basket, or on one planet.
‘I just hope we can avoid dropping the basket until then’, he said.
AI could replace humans
Hawking claimed that AI will soon reach a level where it will be a ‘new form of life that will outperform humans.’
He even went so far as to say that AI may replace humans altogether, although he didn’t specify a timeline for his predictions.
He made the chilling comments during an interview with Wired.
He said: ‘The genie is out of the bottle. I fear that AI may replace humans altogether.
‘If people design computer viruses, someone will design AI that improves and replicates itself.
‘This will be a new form of life that outperforms humans.’
He also said the AI apocalypse was impending and ‘some form of government’ would be needed to control the technology.
During the interview, Hawking also urged more people to take an interest in science, claiming that there would be ‘serious consequences’ if this didn’t happen.