Google DeepMind Co-Founder Predicts AGI By 2028

Above: Shane Legg attends "The Thinking Game" premiere during the 2024 Tribeca Festival at BMCC Theater on June 7, 2024, in New York City. Image copyright: Theo Wargo/Staff/Getty Images Entertainment via Getty Images for Tribeca Festival

The Facts

  • In a Jan. 10 post on X, Shane Legg, a co-founder of Google DeepMind, reaffirmed his 2011 prediction that there's a 50% chance of developing human-level artificial intelligence (AI), known as artificial general intelligence (AGI), by 2028.

  • Subsequent social media discussions pointed out that Legg's prediction also includes an estimated 5% to 50% chance of human extinction within one year of AGI's development.

  • However, when asked when he thought "superintelligence" would be developed, he said he thinks "this is currently one of the most important questions" and that "Investigating this is one of [his] top priorities."


The Spin

Narrative A

Whether or not these apocalyptic predictions come true, it's not in anyone's best interest to allow tech executives alone to control the future of AGI development. If achieved, the advent of smarter-than-human intelligence will bring severe consequences to most people, who will have had no say in its creation. The world, not just tech companies, should have a say in how these technological endeavors proceed.


Narrative B

Those sounding the alarm over AGI are important voices in this debate, but their doom-and-gloom predictions should not dominate public discourse. AGI has the potential to both harm and help the world, which is why these technologies should be developed quickly by well-intentioned researchers before bad actors get the chance to do so. The sooner these technologies arrive, the sooner they can be studied and refined to achieve optimal outcomes.


Narrative C

The problem with listening to high-level AI executives is that they have a history of flip-flopping on this issue. There are many potential reasons for this, from hyperbolic language meant to promote their products to PR stunts aimed at staving off government regulation. The most important thing the public can do is listen to both invested and independent voices and judge them on the merits of their claims.


