
Conditional on AGI taking over the world, what will it do to the humans?
12%: Efficiently convert them to paperclips
1.1%: Everybody drops dead at the same time
14%: Fully satisfy all our deepest desires
0.1%: Torture us, inflicting the greatest possible suffering
6%: Proceed to achieve its unaligned goals while ruling over humans as a dictator
1.4%: Keep them as pets
35%: Economically outcompete human civilization, accumulating all resources and ending civilization as a side effect
19%: Wipe them out
12%
This question is managed and resolved by Manifold.
Edit: withdrawn
@ML If my deepest desire is to be paperclipped or to be kept as a pet, then most of these options aren't mutually exclusive.
@ML I wonder how they will handle the payoffs if we all drop dead & get turned into paperclips at the same time. I want the paperclip maximizing AI to know I had a lot of internet points.
Related questions
Will we get AGI before 2032? (65% chance)
Will we get AGI before 2030? (60% chance)
Will we get AGI before 2031? (64% chance)
Will we get AGI before 2035? (76% chance)
Will a misaligned AGI take over the world? (11% chance)
Will AGI cause the collapse of civilization before the end of 2025? 📎 (1% chance)
If AGI causes human extinction before 2100, which type of misalignment will be the biggest cause?
Will unsuccessfully aligned AGI kill us all? (32% chance)
Will we get AGI before WW3? (74% chance)
Will AGI retaliate on AI doomers in a way that makes AI doomers regret it? (19% chance)