Conversation

You get that he believes (unaligned) AGI would wipe out humanity, right? Would you disagree that (some survivors) is better than (no survivors)?
The part where he, Eliezer Yudkowsky, is so incredibly overconfident in his own projections of AI doom that he's willing to kill the vast majority of people on Earth is what's *extremely* not okay. (Which probably wouldn't even prevent future generations from building AGI!)
No. But people seem to have misread what he wrote to think so.
Quote Tweet
Oh, for fuck's sake. I'll say it more plainly. I did not propose first use of nuclear weapons, by anyone, on anything. If anyone tells you I said otherwise, mark them down for intellectual dishonesty, lack of seriousness, and grossly misrepresenting someone else's position. twitter.com/ESYudkowsky/st…
This is what you get when people on forums assume that thought experiments hold up at high energies. I am reminded of the "one in billions chance this isn't a simulation" stuff. Why aren't they trying to please the ASI running the simulation?