Adam Marblestone
@AdamMarblestone

This looks pretty good though, no? pic.twitter.com/v92RFxrgRz
|
Gary Marcus
@GaryMarcus

Nov 30

thread on universally-quantified one-to-one mappings, which I still see as a key bottleneck to future progress. twitter.com/GaryMarcus/sta…
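A minimal sketch of the distinction at issue here (illustrative only; the names `copy_op` and `learned` are my own, not from the thread). An operation defined over a variable is universally quantified: it holds for any binding of that variable, including values never seen in training. A purely extensional mapping memorized from examples has no such guarantee.

```python
# An "operation over a variable": the rule copy(x) = x quantifies over
# any value x, so it generalizes to inputs never encountered before.
def copy_op(x):
    return x

# A mapping learned only as a set of input-output pairs: correct on the
# training set, but with no basis for responding to a novel input.
train = {1: 1, 2: 2, 3: 3}
learned = lambda x: train[x]   # raises KeyError off the training set
```

`copy_op(9999)` returns 9999 even though 9999 was never "trained on", while `learned(4)` fails, which is one way to cash out why universally-quantified one-to-one mappings matter for generalization.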
Gary Marcus
@GaryMarcus

Nov 30

I am not sure it is a general solution, but building in operations over variables, as NTMs do, is the crux of what I have been arguing for.
IIRC I mentioned that specific work approvingly in my Deep Learning: A Critical Appraisal and the follow-up on Medium (In Defense of Skepticism).
|
Adam Marblestone
@AdamMarblestone

Dec 10

Yes, and FYI I think transformers with “relative position encoding” also arguably build in a syntactic, content-independent variable-binding mechanism
arxiv.org/abs/1803.02155
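A minimal single-head NumPy sketch of the mechanism in that paper (Shaw et al. 2018), as I read it; the function and parameter names here are my own. Each attention logit gets an extra term q_i · a_{j−i}, where a is a learned embedding of the clipped relative offset j − i. Because that term depends only on the offset, not on token content or absolute index, the positional part of the score acts as a content-independent binding signal.

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def relative_attention(X, Wq, Wk, Wv, rel_k, max_dist=4):
    """Self-attention with relative position encodings, in the style of
    Shaw et al. 2018 (arxiv.org/abs/1803.02155).

    rel_k: (2*max_dist + 1, d) table of relative-key embeddings,
    indexed by the clipped signed distance j - i."""
    n, d = X.shape
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    # A[i, j] = embedding of clip(j - i, -max_dist, max_dist)
    idx = np.clip(np.arange(n)[None, :] - np.arange(n)[:, None],
                  -max_dist, max_dist) + max_dist
    A = rel_k[idx]                                    # (n, n, d)
    # e[i, j] = q_i . (k_j + a_{j-i}) / sqrt(d); the second term depends
    # only on the relative offset, never on what token sits at j.
    logits = (Q @ K.T + np.einsum('id,ijd->ij', Q, A)) / np.sqrt(d)
    return softmax(logits, axis=-1) @ V
```

The clipping means offsets beyond `max_dist` share one embedding, so the same positional "slot" is reused regardless of sequence length or content.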
|
Gary Marcus
@GaryMarcus

Nov 30

To my mind the question has never been whether the brain is a neural network, but rather what kind, and I have always argued that it must be a neural network that has operations over variables.
Even when I co-authored with you :)