Jaan Aru
@jaaanaru
Brains are amazing. Our lab demonstrates that single human layer 2/3 neurons can compute the XOR operation, something never before observed in neurons of any other species. Out now in @sciencemagazine. Congrats Albert, Tim @mattlark @YiotaPoirazi & co. science.sciencemag.org/content/367/64…
Subutai Ahmad
@SubutaiAhmad
Jan 2
Great paper. I'm a bit confused about the novelty of the XOR result, though. As I understand it, Yiota's model from 2003 (where a pyramidal cell was shown to be equivalent to a 2-layer network) in cell.com/neuron/fulltex… would also be able to solve the XOR problem.
Jaan Aru
@jaaanaru
Jan 2
:) But this is *real* stuff: real neurons, real currents and real spikes, so nobody can complain that Albert "simply made up" something in a model by tuning parameters, etc. :)
Gary Lupyan
@glupyan
Jan 3
Super interesting! I haven't had a chance to read this yet, but what's the evidence that this is about human layer 2/3 neurons in particular?
Jaan Aru
@jaaanaru
Jan 3
Over the last two decades there has been a fair amount of work on rodent layer 2/3 neurons using similar techniques. But in principle it is of course possible that this phenomenon will one day be observed in some other species (e.g. primates). So far it hasn't been.
Christian Varela
@vgman94
Jan 2
I have an idea of what this means, but could someone please explain it in more layman's terms?
Your Caucus Reporting App, powered by Blockchain
@kylewadegrove
Jan 2
1/ The XOR function (given two predicates A and B, XOR(A, B) = T if A = T and B = F, or vice versa) has been a focus of computational neuroscience since Minsky and Papert showed that a single linear perceptron is incapable of learning it.
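To make the point concrete: a single linear threshold unit cannot separate XOR's truth table, but adding one hidden layer (the kind of two-layer structure the thread compares a pyramidal cell to) makes it trivial. A minimal sketch with hand-chosen weights, purely for illustration and not taken from the paper:

```python
def step(x):
    """Linear threshold unit: fire (1) iff the weighted input exceeds 0."""
    return 1 if x > 0 else 0

def xor_net(a, b):
    """Two-layer threshold network computing XOR of binary inputs a, b."""
    # Hidden units: h1 fires for "a AND NOT b", h2 for "b AND NOT a".
    h1 = step(a - b - 0.5)
    h2 = step(b - a - 0.5)
    # Output unit: OR of the two hidden units.
    return step(h1 + h2 - 0.5)

for a in (0, 1):
    for b in (0, 1):
        print(f"XOR({a}, {b}) = {xor_net(a, b)}")
```

No single `step` unit over a and b alone can reproduce this table, which is exactly Minsky and Papert's point; the hidden layer buys the needed nonlinearity.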
Anders Sandberg
@anderssandberg
Jan 2
Maybe there is hope for the perceptrons yet.
Ron🔹
@TechRonic9876
Jan 2
Single neurons already have multiple inputs and outputs… they’re probably more like a multilayer perceptron on their own.
And I bet there are other neuron types that specialize in convolutions, permutations, and other “primitive” operations that I’m uneducated about.
Thomas Hannagan
@ThomasNeuro
Jan 2
This looks terrific, except for the paywall…
𝕋𝕙𝕖 𝕆𝕟𝕖 𝕚𝕞𝕡𝕣𝕠𝕧𝕚𝕟𝕘 θ
@The_One_WR
Jan 2
Makes me miss Aaron Swartz