Kai Ueltzhöffer
@KaiUeltzhoeffer
Heidelberg, Germany

Physicist and physician doing theoretical and cognitive neuroscience. Tweets on neuroscience, maths, physics, biology, evolution, and occasionally random stuff.

132 Tweets
1,060 Following
274 Followers

Tweets
Kai Ueltzhöffer
@KaiUeltzhoeffer
Feb 2
Here's a very nice explanation of the Ames illusion by some of my childhood heroes from the Curiosity Show (from which the original clip was also taken): youtu.be/DkVOIJAaWO0
Kai Ueltzhöffer Retweeted
Rick Adams
@dr_rick_adams
Jan 27
Some interesting papers around at the moment suggesting that classic perception and cognition biases/effects are attributable to noisy inference. This one tackles prospect theory! twitter.com/KJuechems/stat…
Kai Ueltzhöffer Retweeted
summerfieldlab
@summerfieldlab
Jan 29
Today I will be teaching my undergrad course "How to build a brain from scratch" for the 2nd year running. I've put the materials online - including a document with all lecture slides and notes, which is about as long as a decent novel. Enjoy! humaninformationprocessing.com/teaching/
Kai Ueltzhöffer
@KaiUeltzhoeffer
Jan 27
Very interesting, indeed. Thank you for the pointer.
Kai Ueltzhöffer
@KaiUeltzhoeffer
Jan 25
Thank you. I'm glad you liked it. I actually just added a small section on the connection between information-theoretic and thermodynamic entropy, following a recent discussion with @NoahGuzman14 and @neuropoetic twitter.com/KaiUeltzhoeffe…
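The connection mentioned here can be sketched numerically: Gibbs' thermodynamic entropy S = -k_B Σ p ln p is, up to the constant factor k_B ln 2, just the Shannon entropy of the same distribution measured in bits. A minimal illustration (the distribution over microstates is made up):

```python
import math

# Hypothetical probability distribution over four microstates.
p = [0.5, 0.25, 0.125, 0.125]

K_B = 1.380649e-23  # Boltzmann constant, J/K

# Shannon entropy in bits: H = -sum p log2 p
H_bits = -sum(pi * math.log2(pi) for pi in p)

# Gibbs entropy in J/K: S = -k_B sum p ln p
S = -K_B * sum(pi * math.log(pi) for pi in p)

# The two differ only by the unit conversion k_B * ln 2:
assert abs(S - K_B * math.log(2) * H_bits) < 1e-37

print(H_bits)  # → 1.75
```

So "entropy in bits" and "entropy in joules per kelvin" are the same functional of the distribution; only the logarithm base and the physical constant differ.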
Kai Ueltzhöffer Retweeted
Quentin Huys
@docqhuys
Jan 23
Thank you @payampiray for an excellent #TCPW talk on Hierarchical Bayesian Inference for concurrent model fitting and comparison. The talk is now available online: quentinhuys.com/tcpw/news along with many others.
Kai Ueltzhöffer Retweeted
Quantitative Biology
@BioPapers
Jan 22
Active inference on discrete state-spaces: a synthesis. arxiv.org/abs/2001.07203
Kai Ueltzhöffer Retweeted
Grigori Guitchounts
@guitchounts
Jan 18
My paper is out on @biorxivpreprint! We explored movement signals in visual cortex and found a lot of surprising things. twitter.com/biorxivpreprin…
Kai Ueltzhöffer Retweeted
Quentin Huys
@docqhuys
Jan 20
Hierarchical Bayesian inference for concurrent model fitting and comparison for group studies - join us for #TCPW online talk by @payampiray this Thursday 23rd Jan at 4pm UTC cmod4mh.com
Kai Ueltzhöffer Retweeted
John Carlos Baez
@johncarlosbaez
Jan 9
Over 100 people showed up to the first MIT class on Programming with Categories - taught by @BartoszMilewski, David Spivak, and Brendan Fong! You can watch this and all future classes on YouTube.
Later they will write a book.
(1/n)
youtube.com/watch?time_con…
Kai Ueltzhöffer Retweeted
CLaE
@leafs_s
Jan 4
Neuron
November 18, 2019
Excitatory and Inhibitory Subnetworks Are Equally Selective during Decision-Making and Emerge Simultaneously during Learning
cell.com/neuron/fulltex…
Kai Ueltzhöffer
@KaiUeltzhoeffer
Jan 5
You're welcome. :)
Kai Ueltzhöffer
@KaiUeltzhoeffer
Jan 5
P.P.S.: These slides are free and go in the right direction, but I don't know how helpful they are without any context: faculty.poly.edu/~jbain/physinf… But if you can dig up something nicer/more comprehensive, I'd be really interested, @NoahGuzman14. :)
Kai Ueltzhöffer
@KaiUeltzhoeffer
Jan 5
P.S.: If you want to dive deeper into the thermodynamic waters, as a student I really liked Wachter & Hoeber's Compendium of Theoretical Physics, as it gives both the statistical and the classical derivations of the thermodynamic quantities and shows how they are related.
Kai Ueltzhöffer
@KaiUeltzhoeffer
Jan 5
perspective. On the other hand, some good initial reading on information theory and variational inference might be the introductory chapters in Chris Bishop's book on "Pattern Recognition and Machine Learning." 2/2 (n=2)
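The variational-inference material recommended here revolves around one identity: for any approximate posterior q, log p(x) = ELBO(q) + KL(q ‖ p(z|x)), so the ELBO lower-bounds the log evidence and is tight exactly when q is the true posterior. A minimal sketch with a binary latent, where everything is tractable by hand (the model and its numbers are made up for illustration):

```python
import math

# Toy model: binary latent z, one observation x (hypothetical numbers).
p_z = [0.7, 0.3]            # prior p(z)
p_x_given_z = [0.2, 0.9]    # likelihood p(x | z)

# Exact evidence and posterior (tractable here because z is binary).
p_x = sum(p_z[z] * p_x_given_z[z] for z in range(2))
post = [p_z[z] * p_x_given_z[z] / p_x for z in range(2)]

def elbo(q):
    # ELBO = E_q[log p(x, z)] - E_q[log q(z)]
    return sum(q[z] * (math.log(p_z[z] * p_x_given_z[z]) - math.log(q[z]))
               for z in range(2))

# For any q, the ELBO lower-bounds the log evidence ...
assert elbo([0.5, 0.5]) <= math.log(p_x)

# ... and the bound is tight when q equals the true posterior.
assert abs(elbo(post) - math.log(p_x)) < 1e-12
```

Variational methods exploit this by maximizing the ELBO over a tractable family of q, which implicitly minimizes the KL divergence to the intractable posterior.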
Kai Ueltzhöffer
@KaiUeltzhoeffer
Jan 5
I'm sorry, right now I can't think of a single resource connecting all the dots. I just searched around quite a bit, but I'd still suggest the paper by Jeffery, Pollack & Rovelli and the first chapter of arxiv.org/abs/1311.0813 for the thermodynamic 1/n
Kai Ueltzhöffer
@KaiUeltzhoeffer
Jan 4
Thank you for the same. I totally forgot to elaborate on the entrop*ies part when I wrote the blog post. I’ll try to add it soon(ish). Otherwise I agree that there is still much to think about, and much to spell out more clearly (mathematically).
Kai Ueltzhöffer
@KaiUeltzhoeffer
Jan 4
entropy of the distribution of its states on state space. This is usually where the "classical" argument by Karl Friston et al. (e.g. fil.ion.ucl.ac.uk/~karl/Action%2…) starts. So please keep in mind, not only free energy, but also entropy has multiple meanings. 5/5 (n=5)
Kai Ueltzhöffer
@KaiUeltzhoeffer
Jan 4
for many chemical cycles to work, this requires a Markov blanket, separating changing external from stable internal states. To be able to maintain this non-equilibrium steady state, however, the system implementing the cycle/engine has to minimize the information-theoretic 4/n
Kai Ueltzhöffer
@KaiUeltzhoeffer
Jan 4
manifest in the form of chemical cycles or cyclic engines. Thus, to dissipate large amounts of free energy, they have to persist over extended periods of time, which requires a driven, non-equilibrium steady state of the system. As a stable internal milieu is very important 3/n