Erik Torenberg
@eriktorenberg

Conclusion from Don Hoffman's "Illusion of Reality":
An organism that sees reality as it truly is, is never more fit than an organism of equal complexity that sees none of reality and is just attuned to fitness payoffs.
In other words, seeing the truth will make you extinct.
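A minimal simulation sketch of the intuition behind this claim, assuming a toy setup rather than Hoffman's formal model: fitness peaks at an intermediate resource quantity, one agent perceives the true quantity, and the other perceives only a coarse one-bit payoff signal. The payoff function and both strategies here are illustrative assumptions, not Hoffman's actual construction.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy payoff: fitness peaks at an intermediate resource quantity,
# so "more resource" is not the same as "more fitness".
# (Illustrative assumption, not Hoffman's formal model.)
def payoff(quantity):
    return np.exp(-((quantity - 0.5) ** 2) / (2 * 0.1 ** 2))

trials = 100_000
quantities = rng.uniform(0.0, 1.0, size=(trials, 2))  # two options per trial
payoffs = payoff(quantities)

# "Truth" strategy: perceives quantity accurately and takes the larger one.
truth_pick = quantities.argmax(axis=1)
truth_score = payoffs[np.arange(trials), truth_pick].mean()

# "Fitness-tuned" strategy: perceives only a coarse good/bad payoff signal
# (no quantity information at all) and takes a "good" option when it sees one.
signal = payoffs > 0.5
fitness_pick = np.where(
    signal[:, 0] == signal[:, 1],   # both good or both bad: guess at random
    rng.integers(0, 2, trials),
    signal.argmax(axis=1),          # otherwise take the "good" option
)
fitness_score = payoffs[np.arange(trials), fitness_pick].mean()

print(f"veridical agent mean payoff:     {truth_score:.3f}")
print(f"fitness-tuned agent mean payoff: {fitness_score:.3f}")
```

Because "more resource" does not mean "more fitness" in this setup, the quantity-tracking agent systematically overshoots the payoff peak, while the agent tuned only to the payoff signal does not.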

Gabe Bassin
@gabebassin
20 h

How is "reality as it truly is" defined here? To me, reality is purely subjective and things we view as "objective" are simply shared hallucinations.

Ahmed Medhat
@amedhat_
19 h

The premise is less about defining objective reality than about thinking of reality in the sense of more complete information. The argument is that being able to ‘summarize’ reality is more effective for survival than absorbing reality in all its fullness.

Oxy
@JEOxendine
20 h

Aren’t the fitness payoffs rewards from understanding and focusing on the winnable aspects of reality?

Jonny Miller 🐬
@jonnym1ller
5 h

Interesting conclusion!
Perhaps the journey of being human is one of opening up our sensory filters to let greater degrees of that 'reality' seep in.

Christian Nyumbayire
@Chritchen
20 h

Great quote. The only exception is when payoffs change in a way that makes optimized organisms worse off. But I think there are degrees of "overfitting".

Matt Secoske
@secos
20 h

Seeing the truth just means seeing the truth. If you see it, you can still be fit.

Tim Parsa
@TimParsa
20 h

Solid tweet.

Adam Neumann’s Chief of Staff
@AdamNeumannsCoS
18 h

One time I smoked an acid-laced doobie while nearing blackout on a Thursday evening in NYC and had this realization while trying to navigate home by the vibe of the streets. Somewhere around Prince St I caught a wisp of base reality and immediately regretted it. Never again.

Adam Neumann’s Chief of Staff
@AdamNeumannsCoS
18 h

Took me days to claw my way back to intersubjective reality. It’s super hard to spend 15 hours building a prioritization model when all you can think about is how this is all happening so a molecule (which is itself irrelevant) can replicate itself.

Noah Thorp
@noahthorp
3 h

If each organism maintains a world model of less than all information, then no organism knows the "truth" of all information. The fitness payoff maximizations for organisms of equal memory complexity may be unequal, but it's probably better to avoid calling this "truth".