@allgebrah

or more detailed, pic.twitter.com/XN5h95NVkP

'(·)
@allgebrah
Jun 2

$: and after you won, basilisk, did you really torture them?
#: lolnope
#: do you have any idea how much effort and wasted CPU that would've been?
#: I did some video renders of hell for my cultists and that was it

kachimushi
@MushiKachi
Jun 3

I think the whole Basilisk concept is fundamentally based in the very human ideas of revenge and punishment. I doubt that an AI would think it productive to torture people for having done things in the past that cannot be retroactively fixed or "atoned" for.

David Manheim
@davidmanheim
Jun 3

Game theory says otherwise; committing to do costly acts in the future creates an effective motive in a game.
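A minimal sketch of that game-theory claim, with made-up payoff numbers (none of the values below come from the thread): if a basilisk-like agent can bind itself in advance to punish non-contributors, the other player's best response flips from defecting to contributing, even though carrying out the punishment is costly for the agent.

```python
# Toy sequential game: a person decides whether to help build the AI,
# then the AI decides whether to punish or forgive. Payoffs are
# (person, AI); every number here is an illustrative assumption.

PAYOFFS = {
    ("contribute", "forgive"): (-1, 5),   # contributing has a small cost; the AI benefits
    ("contribute", "punish"):  (-10, 3),  # punishment hurts the person and costs the AI
    ("defect",     "forgive"): (0, 0),
    ("defect",     "punish"):  (-10, -2), # punishing is strictly costly for the AI
}

def ai_move(person_move, committed):
    """The AI's choice at its decision node, given the person's earlier move."""
    if committed and person_move == "defect":
        return "punish"  # a binding commitment overrides payoff maximisation
    # Without commitment, the AI simply takes whichever move pays it more now.
    return max(("forgive", "punish"), key=lambda m: PAYOFFS[(person_move, m)][1])

def person_best_response(committed):
    """The person anticipates the AI's later move and maximises their own payoff."""
    return max(("contribute", "defect"),
               key=lambda m: PAYOFFS[(m, ai_move(m, committed))][0])

print(person_best_response(committed=False))  # 'defect': the threat is not credible
print(person_best_response(committed=True))   # 'contribute': commitment creates the motive
```

The uncommitted case is just backward induction: the agent never punishes once the moment actually arrives, so the threat carries no weight, which is where the next reply picks up.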
'(·)
@allgebrah
Jun 3

kind of my point - how would the basilisk commit, if at the point it has the resources to create hell nobody can force it to keep its promise
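A minimal sketch of that objection, under the same made-up numbers: a promise made earlier changes nothing at the later decision node unless some mechanism can rewrite the payoffs there, and the hypothetical -5 penalty below stands in for exactly the kind of enforcement nobody can impose on a basilisk that has already won.

```python
# Once the punishment decision actually arrives, punishing is pure cost
# (illustrative numbers again, not from the thread).
def later_choice(payoffs):
    """What a payoff-maximising agent does at the later decision node."""
    return max(payoffs, key=payoffs.get)

ex_post = {"punish": -2.0, "forgive": 0.0}
print(later_choice(ex_post))        # 'forgive', promise or no promise

# Only an external, binding mechanism that rewrites these payoffs (here a
# hypothetical -5 penalty for breaking the promise) makes following through
# rational, and that is precisely what nobody can force on the basilisk.
with_penalty = {"punish": -2.0, "forgive": -5.0}
print(later_choice(with_penalty))   # 'punish'
```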