@FLIxrisk
In this newest episode of the AI Alignment Podcast, Stuart Armstrong discusses his recently published research agenda on how to identify and synthesize human values into a utility function. futureoflife.org/2019/09/17/syn…
@Sentientism
Sep 19
#sentientism may help with the “who and what to care about - and how much to care” problem.
Sentience is the morally salient component of consciousness.
I’m hoping AGIs decide to be #sentientist - for our own sakes.
Read: secularhumanism.org/2019/04/humani…