Wednesday, April 19, 2023

Pause Giant AI Experiments

I signed the Future of Life Institute's open letter, "Pause Giant AI Experiments: An Open Letter":

We call on all AI labs to immediately pause for at least 6 months the training of AI systems more powerful than GPT-4.

AI systems with human-competitive intelligence can pose profound risks to society and humanity, as shown by extensive research and acknowledged by top AI labs. As stated in the widely-endorsed Asilomar AI Principles, Advanced AI could represent a profound change in the history of life on Earth, and should be planned for and managed with commensurate care and resources. Unfortunately, this level of planning and management is not happening, even though recent months have seen AI labs locked in an out-of-control race to develop and deploy ever more powerful digital minds that no one – not even their creators – can understand, predict, or reliably control.

Summary of the Letter

  • Large language models have improved rapidly, but they carry serious risks, including spreading misinformation, being misused for malicious purposes, and generating fakes that look convincingly real.
  • The labs building these systems need to be more transparent about what they are doing.
  • The letter calls for a pause on training ever more powerful systems until we understand them better.
  • We need broader and more diverse research into how these systems affect society.
  • Developers need to consider how others might misuse what they build.
  • We should proceed carefully, even though these systems could help us do a great deal of good.


p(doom)

[Screenshot: ChatGPT chat history showing a conversation titled "pdoom and AI Safety"]