Eliezer Yudkowsky
Ours is the era of inadequate AI alignment theory. Any other facts about this era are relatively unimportant, but sometimes I tweet about them anyway.
3,445 Tweets · 62 Following · 25,749 Followers
Eliezer Yudkowsky 37m
But on the rare plus side - and I think it must be very rare - sometimes a child believes in himself; and then he grows up, and finds out he *was* right in all the flaws he pointed out in his parents' religious claims, and this experience forever warps him, and he ends up as me.
Eliezer Yudkowsky 2h
Replying to @webdevMason
Unfortunately, yes. In retrospect I kinda wish I'd been slightly more wild back when I had more energy for it.
Eliezer Yudkowsky 8h
"We disabled all the microphones! The AI can't still be listening to us unless it has magic!"
Eliezer Yudkowsky 10h
Most people are fine with mockery, so long as it's somebody appropriately high in their true perceived status order mocking someone lower. Mockery isn't even perceived as such unless the direction of it violates the true status order. The order usually has kids as pretty low.
Eliezer Yudkowsky retweeted
Ben Blackwood 13h
Replying to @webdevMason
I noticed early on that adults think it’s fine to mock kids to their faces as long as the thing they’re mocking them for has to do with them being a child. Things like “aww he’s sad because he can’t have dessert. He thinks that’s an actual problem. Wait til you get older.”
Eliezer Yudkowsky 10h
There's a Sequence about it.
Eliezer Yudkowsky 11h
Replying to @webdevMason
Variations on "intelligence isn't good for much" or "other things are more important than intelligence". I wasn't young enough to be fooled by that. But unfortunately, I was young enough to try to reverse the stupidity and believe that intelligence must be all that mattered.
Eliezer Yudkowsky 11h
Might be useful cognitive games:
- 1-player game that requires the player to frequently, **explicitly** change their mind.
- 2-player game that requires the player to often change their mind out loud, announcing "I change!" each time.
Eliezer Yudkowsky May 20
We don't have enough words, but "My System 1 feels like..." includes difficulty verbalizing plus the lack of complete endorsement of the position.
Eliezer Yudkowsky May 20
Replying to @GolerGkA @brainiac256
I mean I would've been writing her character differently since earlier, but say: Daenerys walks over to the throne, sits in it, and declares, "This is mine... and none shall have it after me," rises up, and orders her dragons to destroy it.
Eliezer Yudkowsky May 20
I wouldn't expect that error to start fooling politicians and prediction markets in 2016 after having not fooled them for years before that?
Eliezer Yudkowsky May 20
Replying to @paulg @whyvert
The prediction markets were also fooled, so the factor is something that market participants underestimated even after seeing Brexit and Trump.
Eliezer Yudkowsky May 20
Replying to @narayanarjun
It would be satisfactory if it resolved the 2,000 pages of foreshadowing I would've done beforehand.
Eliezer Yudkowsky May 20
Replying to @primalpoly
Running monetary policy is an *extremely* good use of a precognitive. That's why Scott Sumner keeps screaming that it needs a prediction market.
Eliezer Yudkowsky May 20
With a surprising yet carefully foreshadowed revelation about the White Walkers enabling their defeat, but only after they had ruined most of Westeros; and with Daenerys publicly melting the Iron Throne with dragonfire, and establishing the Dragon's Watch to outlaw future wars.
Eliezer Yudkowsky May 20
Replying to @ArthurB
It's possible he was trying to make a joke there.
Eliezer Yudkowsky May 20
"The generation of new tactics, strategies, coordination mechanisms, and so on entails the production of new, useful knowledge... For the tradition of knowledge to be living, it must have at least one theorist."
Eliezer Yudkowsky May 20
Probably one of the top 10 wrongest things ever said on YouTube, quantitatively speaking.
Eliezer Yudkowsky May 20
I don't think I'd appeal to that trope? I'm talking here about politicians having shared an error that previously fooled several prediction markets (though those will now update, one imagines), not about people with strong ideals (who might not be autoretargeting that much).
Eliezer Yudkowsky May 19
Replying to @gwern @anderssandberg
That's you and not the GPT, right? Right?!