Andrew Saxe
@SaxeLab
Oxford, UK
Sir Henry Dale Fellow at University of Oxford, trying to figure out how we learn.

33 Tweets | 161 Following | 729 Followers

Tweets

Andrew Saxe Retweeted
bioRxiv Neuroscience
@biorxiv_neursci
Jan 20
Integration of New Information in Memory: New Insights from a Complementary Learning Systems Perspective biorxiv.org/cgi/content/sh… #biorxiv_neursci

Andrew Saxe Retweeted
Michael C. Frank
@mcxfrank
Jan 27
Exciting example of the emergentist perspective in the domain of number.
Most interesting part: an account of developmental refinement of number sense acuity from experience - this is empirically true but very undertheorized in nativist accounts. twitter.com/MateuszHohol/s…

Andrew Saxe Retweeted
Michael Eisen 虫めづるマイケル
@mbeisen
Jan 22
I have a favor to ask. For 20+ years I've been working on a dream: to make all science funded by US taxpayers freely available to all. We are on the verge of achieving this. But we need to show that people care. So please, if you can, sign this letter:
oaintheusa.com

Andrew Saxe Retweeted
Grigori Guitchounts
@guitchounts
Jan 18
My paper is out on @biorxivpreprint! We explored movement signals in visual cortex and found a lot of surprising things. twitter.com/biorxivpreprin…

Andrew Saxe Retweeted
summerfieldlab
@summerfieldlab
Jan 17
let's encourage @SfNtweets to give this petition the consideration it deserves. please retweet. twitter.com/SfN_climate/st…

Andrew Saxe Retweeted
Surya Ganguli
@SuryaGanguli
Jan 10
A great set of videos on the mathematics of #deeplearning twitter.com/deepmath1/stat…

Andrew Saxe Retweeted
Tim Vogels
@TPVogels
Jan 10
We have a new paper on @biorxiv, up now. I wanted to call it “Crushing the Hopfield limit”, but, sadly, I was overruled. Not much time today (flying to the @isiCNI, yay!), but in brief, WE CRUSH IT (the hopfield limit) (and in a cool cool way) (i’m excited) Ready? pic.twitter.com/qjB8bEQIll

Andrew Saxe Retweeted
Steph Nelli
@steph_nelli
Jan 8
Don’t forget to come to these science talks!
Additionally, on Friday Jan 17 at 10 am @weijima01 will give his great career development talk in NRH - relevant for researchers at all stages!
@OUPPsychology @ResStaffOxford @OxResearchSkill @OxfordCareers @OxExpPsy twitter.com/weijima01/stat…

Andrew Saxe
@SaxeLab
Jan 7
There's also a great related article in the same issue by @marylougab @zdeborova @KrzakalaF and co: iopscience.iop.org/article/10.108…

Andrew Saxe
@SaxeLab
Jan 7
Our paper "On the information bottleneck theory of deep learning" has been republished (with small edits) in J Stat Mech ML special issue: iopscience.iop.org/article/10.108… A wonderful collaboration with @whybansal @laika117 @advani_madhu Artemy Kolchinsky @brendantracey @neurobongo pic.twitter.com/kJOljN7U85

Andrew Saxe
@SaxeLab
Dec 26
As far as genealogy goes, you may want to have a look at the affiliations on the backprop Nature paper. Or the wonderful PDP books, which still read remarkably modern today. I think the genealogy is shared, and that should be celebrated. Interdisciplinary science is great!

Andrew Saxe Retweeted
Timothy O'Leary
@Timothy0Leary
Dec 20
We're hiring: focused 2-year position in collaboration with Chris Harvey (Harvard Med School) and Yaniv Ziv (Weizmann) on reconfiguring neural codes. Please RT/apply here: jobs.cam.ac.uk/job/24566/

Andrew Saxe
@SaxeLab
Dec 19
When do networks abstract away from different modalities and when don't they? Great insights... twitter.com/beckyJ_1989/st…

Andrew Saxe
@SaxeLab
Dec 16
But it is the case empirically that large vanilla DNNs trained without regularization from standard initializations typically wind up at a global minimum (when trained with a bounded loss function).

Andrew Saxe Retweeted
Joshua Calder-Travis
@JCalderTravis
Dec 12
This is such a cool paper. So much explained with so little.
pnas.org/content/116/23… @SaxeLab

Andrew Saxe Retweeted
Kyle Cranmer
@KyleCranmer
Dec 12
Excellent talk by @sdgoldt on using statistical physics to understand dynamics of learning. Impressive results! @zdeborova @KrzakalaF
Poster:
drive.google.com/file/d/1e9m905… pic.twitter.com/vdIT8z9ipZ

Andrew Saxe Retweeted
Helen C Barron
@HelenCBarron
Dec 11
Vote tactically to stop Boris: check what that means for your postcode here tactical-vote.uk

Andrew Saxe
@SaxeLab
Dec 11
You bet! Check out saxelab.org/join.html

Andrew Saxe Retweeted
celestekidd
@celestekidd
Dec 10
I'm happy to share my comments on the climate for men from my #NeurIPS2019 talk: docdroid.net/u5r0j3S/kidd-n…

Andrew Saxe
@SaxeLab
Dec 10
Beautiful! twitter.com/KrzakalaF/stat…