nostalgebraist
Here's my extended critique of @GaryMarcus and (indirectly) @sapinker on GPT-2, as someone who grew up reading them. tl;dr: deep learning *has* followed the programme of The Algebraic Mind, yet no one's circled back to collect the psycholinguistic insights
(The title of this post is a joking homage to one of Gary Marcus’ papers.) I’ve discussed GPT-2 and BERT and other instances of the Transformer architecture a lot on this blog.  As you can probably...
Gary Marcus Dec 31
Replying to @nostalgebraist @sapinker
honored you read my most cryptic work as a teen. have you seen my December NeurIPS talk? a preanswer to your essay. starts at 9:55 in
* write by doing a sophisticated extension to cut & paste: yes
* build a model of what it is talking about: no
nostalgebraist Dec 31
Replying to @GaryMarcus @sapinker
i agree, it doesn't do world-modeling. it's a model of fluency in isolation -- which is interesting, and always has been! after all, there's no such thing as a wug, ideas can't be colorless OR green, and the rule-and-memory model doesn't (need to) understand the verbs it inflects
Alec Resnick Dec 31
Replying to @nostalgebraist
I loved reading this post and am very much enjoying reading _The Algebraic Mind_ now. Thank you!
Gary Marcus Dec 31
Replying to @nostalgebraist @sapinker
ps i have a half-written essay very much on this topic that got delayed by the debate and the above-mentioned talk. stay tuned.
Brad Wyble Dec 31
Replying to @nostalgebraist @GaryMarcus @sapinker
Point of correction: "[Gary] has become a polemicist against 'deep learning'" is not at all true. Gary's been very clear that he values DL.
KS Dec 31
Replying to @nostalgebraist @EliSennesh and 2 others
DL? It's inductive. 👎
Jeroen Vlek Jan 14
Replying to @nostalgebraist @GaryMarcus @sapinker
Even if GPT-2 achieves all that via a "blind alley", that can still mean that we have language generation mostly covered and we need to start experimenting with connecting it to other architectures to add the missing competence. PS: YOU can fucking write. Bravo.