An open letter signed by Elon Musk and AI experts warned of an 'out-of-control' AI race with potential risks to humanity. Here are the 4 key points.

Mar 30, 2023, 04:34 IST
Business Insider
Elon Musk, Steve Wozniak, Pinterest cofounder Evan Sharp, and Stability AI CEO Emad Mostaque have all added their signatures to the letter. Susan Walsh/AP
  • AI experts and company leaders have signed an open letter calling for a pause on AI development.
  • The Future of Life Institute's letter warned of an 'out-of-control' race to deploy the new tech.

Artificial intelligence heavyweights are calling for a pause on advanced AI development.

Elon Musk, Steve Wozniak, Pinterest cofounder Evan Sharp, and Stability AI CEO Emad Mostaque have all added their signatures to an open letter issued by the Future of Life Institute, a non-profit that works to reduce existential risk from powerful technologies.

The letter warns that AI systems such as OpenAI's GPT-4 are becoming "human-competitive at general tasks" and pose a potential risk to humanity and society. It calls on AI labs to pause training any tech more powerful than GPT-4 for six months while the dangers of the new technology are properly assessed.

Industry experts Yoshua Bengio, sometimes referred to as one of the "godfathers of AI," and influential computer scientist Stuart Russell also threw their weight behind the letter. At the time of publication, no representatives from OpenAI appeared to have signed it.

The letter cites concerns about the spread of misinformation, the risk of automation in the labor market, and the possibility of a loss of control of civilization. Here are the key points:

Out-of-control AI

The non-profit floats the possibility of developers losing control of powerful new AI systems and their effect on civilization. It suggests companies are racing to develop AI technology so advanced that not even its creators can "understand, predict, or reliably control" it.

The letter stated: "Should we risk loss of control of our civilization? Such decisions must not be delegated to unelected tech leaders."

A "dangerous race"

The letter warned that AI companies are locked in an "out-of-control race to develop and deploy" new advanced systems. In recent months, the viral popularity of OpenAI's ChatGPT has appeared to push other companies to release their own AI products.

The letter urged companies to reap the rewards of an "AI summer" while society has a chance to adapt to the new technology instead of rushing into an "unprepared fall."

AI automation and misinformation

The letter highlighted several risks of the new tech, including the possibility that nonhuman minds will eventually "outnumber, outsmart, obsolete and replace us."

It said that AI systems are becoming "human-competitive" at some tasks and cited concerns around misinformation and labor automation, stating: "Should we let machines flood our information channels with propaganda and untruth? Should we automate away all the jobs, including the fulfilling ones?"

Six-month pause

The open letter asks for a six-month break from developing any AI systems more powerful than those already on the market.

It asks developers to work with policymakers to create AI governance systems, highlighting the need for regulatory authorities as well as AI "watermarking systems" to help people differentiate between human and AI-made content. The letter also suggests the need for "well-resourced institutions" to cope with economic and political disruptions caused by AI.

The open letter stated the pause should be a step back from a "dangerous race" around advanced technology rather than a complete stop on general AI development.
