
Artificial Intelligence Expert Warns Letter Calling for 6-Month Halt on Future AI Systems Doesn’t Go Far Enough


VIDEO: Leaders call for a temporary halt to artificial intelligence development

(FOX NEWS) – An artificial intelligence expert with more than two decades of experience studying AI safety said an open letter calling for a six-month moratorium on developing powerful AI systems does not go far enough.

Eliezer Yudkowsky, a decision theorist at the Machine Intelligence Research Institute, wrote in a recent op-ed that the six-month “pause” on developing “AI systems more powerful than GPT-4” called for by Tesla CEO Elon Musk and hundreds of other innovators and experts understates the “seriousness of the situation.” He would go further and impose a moratorium on new large AI learning models that is “indefinite and worldwide.”

The letter, issued by the Future of Life Institute and signed by more than 1,000 people, including Musk and Apple co-founder Steve Wozniak, argued that safety protocols need to be developed by independent overseers to guide the future of AI systems.

“Powerful AI systems should be developed only once we are confident that their effects will be positive and their risks will be manageable,” the letter said. Yudkowsky believes this is insufficient.

“The key issue is not ‘human-competitive’ intelligence (as the open letter puts it); it’s what happens after AI gets to smarter-than-human intelligence,” Yudkowsky wrote in Time.

“Many researchers steeped in these issues, including myself, expect that the most likely result of building a superhumanly smart AI, under anything remotely like the current circumstances, is that literally everyone on Earth will die,” he asserted. “Not as in ‘maybe possibly some remote chance,’ but as in ‘that is the obvious thing that would happen.’”

For Yudkowsky, the problem is that an AI more intelligent than human beings might disobey its creators and would not care for human life. Do not think “Terminator” — “Visualize an entire alien civilization, thinking at millions of times human speeds, initially confined to computers — in a world of creatures that are, from its perspective, very stupid and very slow,” he writes.

