AI could pose "risk of extinction" akin to nuclear war and pandemics, experts say
CBSN
Artificial intelligence could pose a "risk of extinction" to humanity on the scale of nuclear war or pandemics, and mitigating that risk should be a "global priority," according to an open letter signed by AI leaders such as Sam Altman of OpenAI as well as Geoffrey Hinton, known as the "godfather" of AI.
The one-sentence open letter, issued by the nonprofit Center for AI Safety, is both brief and ominous, without elaborating on how the more than 300 signatories foresee AI developing into an existential threat to humanity.
In an email to CBS MoneyWatch, Dan Hendrycks, the director of the Center for AI Safety, wrote that there are "numerous pathways to societal-scale risks from AI."