nostalgebraist
@nostalgebraist
Here's my extended critique of @GaryMarcus and (indirectly) @sapinker on GPT-2, as someone who grew up reading them. tl;dr: deep learning *has* followed the programme of The Algebraic Mind, yet no one's circled back to collect the psycholinguistic insights. nostalgebraist.tumblr.com/post/189965935…
Gary Marcus
@GaryMarcus
Dec 31
honored you read my most cryptic work as a teen.
have you seen my December NeurIPS talk? a pre-answer to your essay. starts at 9:55 in slideslive.com/38922192/conte…
can it write by doing a sophisticated extension to cut & paste: yes; can it build a model of what it is talking about: no.
nostalgebraist
@nostalgebraist
Dec 31
i agree, it doesn't do world-modeling. it's a model of fluency in isolation -- which is interesting, and always has been! after all, there's no such thing as a wug, ideas can't be colorless OR green, and the rule-and-memory model doesn't (need to) understand the verbs it inflects
Alec Resnick
@aresnick
Dec 31
I loved reading this post and am very much enjoying reading _The Algebraic Mind_ now. Thank you!
Gary Marcus
@GaryMarcus
Dec 31
ps i have a half-written essay very much on this topic that got delayed by the debate and the above-mentioned talk. stay tuned.
Brad Wyble
@bradpwyble
Dec 31
Point of correction on "[Gary] has become a polemicist against 'deep learning'": this is not at all true. Gary's been very clear that he values DL.
KS
@ks445599
Dec 31
DL? It's inductive. 👎
Jeroen Vlek
@jeroenvlek
Jan 14
Even if GPT-2 achieves all that via a "blind alley", that can still mean that we have language generation mostly covered and we need to start experimenting with connecting it to other architectures to add the missing competence. PS: YOU can fucking write. Bravo.