Stephen Hawking may be a theoretical physicist by profession, but he has gained such a significant following over the years that he can comment on just about anything and people will listen.
Hawking's popularity was on full display during his recent interview with Larry King on the talk show Larry King Now on Saturday, June 25.
The director of research at Cambridge University's Centre for Theoretical Cosmology expressed his concern about the greed and stupidity of humankind when it comes to its treatment of the environment. He pointed out that human stupidity and pollution continue to be two of the biggest threats to humanity.
Hawking told King that he was already thinking about overcrowding and pollution six years ago, and that those two concerns of his have only become worse over the years.
He said that the world's population has grown by as much as half a billion since his last interview with King, with no indication that the growth is going to stop anytime soon. At the current rate, Hawking said, there would be 11 billion people living on the planet by 2100.
The renowned scientist also drew attention to the current level of air pollution on Earth, which has increased by as much as 8 percent in the last five years. As a result, more than 80 percent of people living in urban areas are breathing air with dangerous levels of pollution.
During the interview, Hawking was also asked what he thinks is the biggest problem humanity has to face. He said that climate change remains a particular concern that people have to resolve.
He told King that he was worried about the possibility that humanity has already reached the point of no return.
"Will we be too late to avoid dangerous levels of global warming?" the famed theoretical physicist asked.
Hawking and King also talked about the state of artificial intelligence development and how governments of the world appear to be engaged in a so-called "AI arms race," where beneficial technologies, such as better medical screening, seem to take a back seat to the creation of intelligent weapons.
The theoretical physicist warned that AI, once created, could be difficult to stop, especially if it went rogue on its creators. He explained that developers should make sure they follow ethical standards when creating AI programs and have safeguards in place in case anything goes wrong.