AI Predicts the Last Selfies Taken on Earth Before the Planet’s Collapse

The singular relationship between humanity and artificial intelligence

Eugenio De Lucchi
4 min read · Sep 13, 2022


Photo by Natalya Letunova on Unsplash

Humanity’s relationship with artificial intelligence is singular. Our deepest fear is that a higher form of intelligence could end our species. But at the same time, it is our deepest hope for accelerating progress and solving science’s biggest challenges.

Decades ago, we asked artificial intelligence to predict the end of our civilization. Today, we ask it to imagine and depict the end of our species.

Back in 1972, for the first time in history, a team of MIT researchers built a computer simulation (the World3 system dynamics model) to anticipate the collapse of modern civilization.

The program predicted the collapse of modern society by 2040, charting a trajectory based on population, pollution, and natural resource use.
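The feedback loop behind that kind of projection can be sketched with a toy model. This is an illustration only, not the actual World3 model: every parameter and equation below is invented for the sake of the example. Population grows, consumes a finite resource stock, and emits pollution; as resources thin out and pollution accumulates, growth turns negative and the trajectory peaks and declines.

```python
# Toy system-dynamics sketch (NOT the World3 model; all
# parameters here are invented for illustration).
CONSUMPTION_RATE = 1.0   # resource units consumed per capita per year
POLLUTION_RATE = 0.02    # pollution emitted per unit of resource consumed
GROWTH_RATE = 0.03       # max population growth rate at full resources
POLLUTION_DRAG = 0.01    # growth penalty per unit of accumulated pollution

def simulate(years=120, population=1.0, resources=100.0, pollution=0.0):
    """Step the toy model one year at a time; all units are arbitrary."""
    history = []
    for year in range(years):
        # Consumption scales with population but cannot exceed what is left.
        consumption = min(CONSUMPTION_RATE * population, resources)
        resources -= consumption
        pollution += POLLUTION_RATE * consumption
        # Growth shrinks as resources deplete and pollution accumulates.
        growth = GROWTH_RATE * (resources / 100.0) - POLLUTION_DRAG * pollution
        population = max(population * (1.0 + growth), 0.0)
        history.append((year, population, resources, pollution))
    return history

trajectory = simulate()
peak_year = max(trajectory, key=lambda row: row[1])[0]
```

Run long enough, the curve rises, peaks somewhere mid-simulation, and then declines, which is the qualitative shape the MIT study charted, not a quantitative forecast.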

From another perspective, artificial intelligence has itself long been feared as a threat to our species. Stephen Hawking first, then Elon Musk, along with numerous experts, have warned us of the risks that an intelligence superior to our own could pose.

All of these voices align with the perspective of Nick Bostrom, author of the bestseller Superintelligence, which has been endorsed by Elon Musk and other prominent figures in tech.

Bostrom has repeatedly argued that the risks of extinction are little known and grossly underestimated: artificial intelligence poses a threat worse than any preceding technology if its development is not handled with proper precautions.

The existential risk posed by a general form of artificial intelligence is an irreparable catastrophe: at worst, extinction.

Bostrom’s fear, shared by those who hold his views, is evolutionary: the most intelligent species rules the world. With an explosion of intelligence, AI would gain the ability to improve itself in a short time, surpassing human intellectual potential by many orders of magnitude.

And should AI achieve “superintelligence,” it could become impossible to control.


