Next-generation arms race could cause 'extinction' event akin to nuclear war, pandemic: tech chief
Artificial intelligence should be treated like nuclear weapons, executive director of an AI safety nonprofit says
An artificial intelligence arms race between countries and corporations to see who can develop the most powerful AI machines could create an existential threat to humanity, the co-founder of an AI safety nonprofit told Fox News.
"AI could pose the risk of extinction, and part of the reason for this is because we're currently locked in an AI arms race," Center for AI Safety Executive Director Dan Hendrycks said. "We're building increasingly powerful technologies, and we don't know how to completely control them or understand them."
"We did the same with nuclear weapons," he continued. "We're all in the same boat with respect to existential risk and the risk of extinction."
Hendrycks' organization released a statement Tuesday warning that "[m]itigating the risk of extinction from AI should be a global priority alongside other societal-scale risks such as pandemics and nuclear war." Many top AI researchers, developers and executives, including OpenAI CEO Sam Altman and "the Godfather of AI," Geoffrey Hinton, signed the statement.
In recent testimony before Congress, Altman advocated for the government to regulate AI "to mitigate" the risks the technology poses.
"I'm concerned about AI development being a relatively uncontrolled process, and the AIs end up getting more influence in society because they're so good at automating things," Hendrycks, who also signed his organization's statement, told Fox News. "They are competing with each other and there's this ecosystem of agents that are running a lot of the operations, and we might lose control of that process."
"That could make us like a second-class species or go the way of the Neanderthals," he continued.
Tesla CEO Elon Musk has been outspoken about potential AI threats, saying the technology could lead to "civilizational destruction" or election interference. Musk also signed a letter in March calling for a pause on large AI experiments.
However, the letter failed to prompt large AI developers such as OpenAI, Microsoft and Google to suspend their experiments.
"We're having an AI arms race that could potentially bring us to the brink of catastrophe as the nuclear arms race did," Hendrycks said. "So that means we need a global prioritization of this issue."
But the organizations building the world's most powerful AI systems have little incentive to slow or pause development, Hendrycks warned. The Center for AI Safety hopes its statement will inform people that AI poses a credible and serious risk.
"Now hopefully we can get the conversation started so that it can be addressed like those other global priorities, like international agreements or regulation," Hendrycks told Fox News. "We need to treat this as a larger priority, a social priority and a technical priority, to reduce these risks."