Paperclip maximizer

AI

The paperclip maximizer is a thought experiment that goes something like this:

"Imagine you task an AI with creating as many paperclips as possible. It could decide to turn all matters of the universe into paperclips, including humans. Since an AI may not value human life, and it only cares about maximizing paperclips, the risk to humans is great."

The paperclip maximizer is an example of instrumental convergence, a term that suggests an AI could seek to fulfil a harmless goal in harmful ways (from the human point of view). Harnessing a true AI to simple, seemingly innocent tasks could pose existential risk to humans if we don't safely know how to make a machine value human life.

Learn more:

Share this concept

See more concepts like this

There are many more concepts in my "Mind Expander" tool (it's free)

Check it out