AI safety expert predicts a p(doom) of 99.999999%. What does that mean? Well, it isn't good

Roman Yampolskiy, an expert in AI safety, disagrees with Elon Musk's recent assessment that there's only a 10-20% chance of AI killing off humankind.


Could AI spell doom for humankind eventually? Depending on which expert you talk to, the chances vary considerably, but one researcher definitely has a gloomy (and doomy) opinion - one that Elon Musk doesn't share.

Take Kid Bookie's advice and 'save yourself from AI' (Image Credit: Pixabay)


Business Insider reported on revelations made at the recent Abundance Summit (held last month), which included a 'great AI debate' where Musk estimated the risk of AI ending humanity was "about 10% or 20% or something like that."

Obviously that's something akin to wild guesswork, but the general gist of the billionaire's philosophy is that we should push ahead with AI development as the probable positive outcomes outweigh any negative scenario.

With Musk's assessment, what isn't mentioned is that the negative scenario we're running the risk of is the annihilation of all humanity. Which does rather tip the scales heavily against better chatbots, perhaps.

At any rate, Musk's probability theorizing is definitely not shared by Roman Yampolskiy, director of the Cyber Security Laboratory at the University of Louisville, an expert in AI safety and author of books on the subject.

Yampolskiy expressed the opinion that Musk is being conservative, and that p(doom) - the 'probability of doom', in which an AI brings an end to humanity, enslaves us, or some similarly unthinkable scenario - is much higher than 20% or so.

So, what does the Russian-born expert believe the p(doom) is? Well, you probably won't be comforted to learn that Yampolskiy pins a figure of 99.999999% on that probability - which is, of course, to say it's pretty much a certainty.
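To put that figure in perspective, here's a quick back-of-the-envelope check (our sketch, not anything from Yampolskiy himself) of what a 99.999999% p(doom) leaves as the chance of survival:

```python
from fractions import Fraction

# Yampolskiy's figure, kept exact to avoid floating-point fuzz
p_doom = Fraction("99.999999") / 100   # 99999999/100000000
p_survival = 1 - p_doom                # 1/100000000

print(f"Chance of survival: 1 in {p_survival.denominator:,}")
```

In other words, by that estimate humanity gets a one-in-a-hundred-million shot - roughly the odds of a single specific ticket winning a very large lottery.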

We're all p(doomed)

We're guessing here ourselves - like everyone when it comes to p(doom) - that Yampolskiy is more seeking to raise very serious concerns about AI development, rather than actually predicting the certain doom of humanity. But those concerns must run pretty deep to air such an estimation.

Yampolskiy's underlying philosophy is that because it will be impossible to control a sufficiently advanced AI once it exists, the best bet is to take action now and ensure we don't make that kind of AI in the first place. We can certainly see where that view is coming from.

Predictions of AI spelling doom for us all range from a 5% chance to 50% chance, generally speaking, across tech bigwigs in Silicon Valley, the report tells us.

Even a 50-50 chance of an AI apocalypse isn't great, let's face it. If you were presented with a pill and told you have a 50% chance of becoming superhuman, or a 50% chance of instantly dying - would you take it? We think we'd pass, but there are others out there who probably wouldn't.


Darren has written for numerous magazines and websites in the technology world for almost 30 years, including TechRadar, PC Gamer, Eurogamer, Computeractive, and many more. He worked on his first magazine (PC Home) long before Google and most of the rest of the web existed. In his spare time, he can be found gaming, going to the gym, and writing books (his debut novel – ‘I Know What You Did Last Supper’ – was published by Hachette UK in 2013).
