This week, thousands of TikTok users were whipped into an apocalyptic frenzy as viral predictions of the ‘Rapture’ spread online.
However, rather embarrassingly for the preachers who predicted it, the supposed End of Days has now come and gone without incident.
Now, experts have revealed what the apocalypse will really look like.
And the bleak reality of human extinction is far more depressing than any story of Biblical annihilation.
From rogue AI and nuclear war to engineered bioweapons, humans themselves are creating the biggest threats to our own survival.
Dr Thomas Moynihan, a researcher at Cambridge University’s Centre for the Study of Existential Risk, told the Daily Mail: ‘Apocalypse is an old idea, which can be traced to religion, but extinction is a surprisingly modern one, resting on scientific knowledge about nature.
‘When we talk about extinction, we are imagining the human species disappearing and the rest of the universe indefinitely persisting, in its vastness, without us.
‘This is very different from what Christians imagine when they talk about Rapture or Judgement Day.’

While TikTok evangelists predicted the rapture would come this week, apocalypse experts say that human life is much more likely to be destroyed by our own actions than any outside force – such as nuclear war (AI-generated impression)
Nuclear war
Scientists who study the destruction of humanity talk about what they call ‘existential risks’ – threats that could wipe out the human species.
Ever since humans learned to split the atom, one of the most pressing existential risks has been nuclear war.
During the Cold War, fears of nuclear war were so high that governments around the world were seriously planning for life after the total annihilation of society.
The risk posed by nuclear war dropped after the fall of the Soviet Union, but experts now believe the threat is rising once again.
Earlier this year, the Bulletin of the Atomic Scientists moved the Doomsday Clock one second closer to midnight, citing an increased risk of a nuclear exchange.
The nine countries which possess nuclear arms hold a total of 12,331 warheads, with Russia alone holding enough bombs to destroy seven per cent of urban land worldwide.
However, the worrying prospect is that humanity could actually be wiped out by only a tiny fraction of these weapons.

The nine nations with nuclear weapons currently hold 12,331 nuclear warheads, which could lead to millions of deaths (AI-generated impression)
Dr Moynihan says: ‘Newer research shows that even a relatively regional nuclear exchange could lead to worldwide climate fallout.
‘Debris from fires in city centres would loft into the stratosphere, where it would dim sunlight, causing crop failures.
‘Something similar led to the demise of the dinosaurs, though that was caused by an asteroid strike.’
Studies have shown that a so-called ‘nuclear winter’ would actually be far worse than Cold War predictions suggested.
Using modern climate models, researchers have shown that a nuclear exchange would plunge the planet into a ‘nuclear little ice age’ lasting thousands of years.
Reduced sunlight would drop global temperatures by up to 10°C (18°F) for nearly a decade, devastating the world’s agricultural production.
Modelling suggests that a small nuclear exchange between India and Pakistan would deprive 2.5 billion people of food for at least two years.
Meanwhile, a global nuclear war would kill 360 million civilians immediately and lead to the starvation of 5.3 billion people in just two years following the first explosion.

Even a limited nuclear exchange could plunge the world into a ‘nuclear little ice age’, dropping global temperatures by up to 10°C (18°F) (AI-generated impression)
Dr Moynihan says: ‘Some argue it’s hard to draw a clear line from this to the eradication of all humans, everywhere, but we don’t want to find out.’
Engineered bioweapons
Alongside the threat of nuclear arms, another way that humanity could come to an end is through the release of an engineered bioweapon.
Since 1973, when scientists created the first genetically modified bacterium, humanity has been steadily increasing its capacity to make deadly diseases.
These man-made diseases pose a significantly greater threat to our existence than anything found in nature.
Otto Barten, founder of the Existential Risk Observatory, told the Daily Mail: ‘We have a lot of experience with natural pandemics, and these have not led to human extinction in the last 300,000 years.
‘Therefore, although natural pandemics remain a very serious risk, this is very likely not going to cause our complete demise.
‘However, man-made pandemics might be engineered specifically to maximise effectiveness, in a way that doesn’t occur in nature.’

Natural pandemics are unlikely to lead to human extinction, but genetically engineered variants could be much more deadly (AI-generated impression)

Experts are concerned that the tools needed to engineer deadly pathogens are becoming more accessible and could fall into the wrong hands (AI-generated impression)
Currently, the means to create such deadly diseases are limited to a handful of states that wouldn’t benefit from unleashing a deadly plague.
However, scientists have warned that improving technologies like AI mean that this ability is likely to fall into the hands of more and more people.
If terrorists gain the ability to create deadly bioweapons, they could release a pathogen that would spread wildly out of control and eventually lead to humanity’s extinction.
What would be left behind would be a world that looks like it does now, but with all traces of living humans wiped away.
Dr Moynihan adds: ‘Extinction is, in this way, the total frustration of any kind of moral order; again, within a universe that persists, silently, without us.’
Rogue artificial intelligence
Experts currently believe that the biggest danger humanity is creating for itself is artificial intelligence.
Scientists who study existential risk think there is anywhere between a 10 and 90 per cent chance that humanity will not survive the advent of superintelligent AI.

One of the biggest risks to humanity is the creation of a rogue AI which becomes ‘unaligned’ with humanity’s interests (AI-generated impression)
The big concern is that a sufficiently intelligent AI will become ‘unaligned’, meaning its goals and ambitions will cease to line up with the interests of humanity.
Dr Moynihan says the danger comes ‘if an AI becomes smarter than us and also becomes agential – that is, capable of conjuring its own goals and acting on them’.
If an AI becomes agentic, it doesn’t even need to be openly hostile to humans for it to wipe us out.
When an agentic AI has a goal that differs from what humans want, the AI would naturally see humans turning it off as a hindrance to that goal and do everything it can to prevent that.
The AI might be totally indifferent to humans, simply deciding that the resources and systems that keep humanity alive would be better used pursuing its own ambitions.
Experts don’t know exactly what those goals might be or how the AI might try to pursue them, which is exactly what makes an unaligned AI so dangerous.
‘The problem is that it’s impossible to predict the actions of something immeasurably smarter than you,’ says Dr Moynihan.
‘It’s hard to imagine how we could anticipate, intercept, or prevent the AI’s plans before it implements them.’

Experts aren’t sure how an AI would choose to wipe out humanity, which is part of what makes it so dangerous – but it could involve seizing control of our own computerised weapons or nuclear launch systems (AI-generated impression)
Another big issue is that experts don’t know exactly how an AI might go about wiping out humanity.
Some experts have suggested that an AI might take control of existing weapon systems or nuclear missiles, manipulate humans into carrying out its orders, or design its own bioweapons.
However, the scarier prospect is that AI might destroy us in a way we literally cannot conceive of.
Dr Moynihan says: ‘The general fear is that a smarter-than-human AI would be able to manipulate matter and energy with far more finesse than we can muster.
‘Drone strikes would have been incomprehensible to the earliest human farmers: the laws of physics haven’t changed in the meantime, just our comprehension of them.
‘Regardless, if something like this is possible, and ever does come to pass, it would probably unfold in ways far stranger than anyone currently imagines. It won’t involve metallic, humanoid robots with guns and glowing scarlet eyes.’
Climate change
Mr Barten says: ‘Climate change is also an existential risk, meaning it could lead to the complete annihilation of humanity, but experts believe this has less than a one in a thousand chance of happening.’

In an unlikely but terrifying scenario, a runaway greenhouse effect could cause all water on Earth to evaporate and escape into space, leaving the planet dry and barren (AI-generated impression)
However, there are a few unlikely scenarios in which climate change could lead to human extinction.
For example, if the world becomes hot enough, large amounts of water vapour could escape into the upper atmosphere in a phenomenon known as the moist greenhouse effect.
There, intense solar radiation would break the water down into oxygen and hydrogen, the latter light enough to easily escape into space.
At the same time, water vapour in the atmosphere would weaken the mechanisms which usually prevent gases from escaping.
This would lead to a runaway cycle in which all water on Earth escapes into space, leaving the planet dry and totally uninhabitable.
The good news is that, although climate change is making our climate hotter, the moist greenhouse effect won’t kick in unless the climate gets much hotter than scientists currently predict.
The bad news is that the moist greenhouse effect will almost certainly occur in about 1.5 billion years, as the ageing sun grows steadily brighter.
This article was originally published by www.dailymail.co.uk.