Future of Life
In this episode of the AI Alignment Podcast, Stuart Armstrong and Lucas Perry discuss Stuart's recently published Research Agenda v0.9: Synthesizing a human's preferences into a utility function.
Future of Life @FLIxrisk
@Sentientism Sep 19
Replying to @FLIxrisk
This may help with the "who and what to care about, and how much to care" problem. Sentience is the morally salient component of consciousness. I'm hoping AGIs decide to be - for our own sakes. Read: