This AI Safety Summit Is a Doomer’s Paradise


Leaders and policymakers from around the globe will gather in London next week for the world’s first artificial intelligence safety summit. Anyone hoping for a practical discussion of near-term AI harms and risks will likely be disappointed. A new discussion paper released this week ahead of the summit gives a little taste of what to expect, and it’s filled with bangers. We’re talking about AI-made bioweapons, cyberattacks, and even a manipulative evil AI love interest.

The 45-page paper, titled “Capabilities and risks from frontier AI,” gives a relatively straightforward summary of what current generative AI models can and can’t do. Where the report starts to go off the deep end, however, is when it begins speculating about future, more powerful systems, which it dubs “frontier AI.” The paper warns of some of the most dystopian AI disasters, including the possibility humanity could lose control of “misaligned” AI systems.

Some AI risk experts entertain this possibility, but others have pushed back against glamorizing more speculative doomer scenarios, arguing that doing so could detract from more pressing near-term harms. Critics have similarly argued the summit seems too focused on existential problems and not enough on more realistic threats.

Britain’s Prime Minister Rishi Sunak echoed those concerns about potentially dangerous, misaligned AI in a speech on Thursday.

“In the most unlikely but extreme cases, there is even the risk that humanity could lose control of AI completely through the kind of AI sometimes referred to as super intelligence,” Sunak said, according to CNBC. Looking ahead, Sunak said he wants to establish a “truly global expert panel,” nominated by the countries attending the summit, to publish a major report on AI.


But don’t take our word for it. Continue reading to see some of the disaster-laden AI predictions mentioned in the report.
