Ian Tenney
NLP/NLU at Google Research
Tweets
Ian Tenney retweeted
Sam Bowman May 14
Replying to @W4ngatang
"How to Get Past Sesame Street: Sentence-Level Pretraining Beyond Language Modeling" w/ and our whole JSALT 2018 team () No preprint yet, but it's loosely based on our earlier 'ELMo's Friends' manuscript:
Ian Tenney May 15
Replying to @iftenney
3/3 And you can assemble the layer-wise probes to see how hypotheses develop on individual sentences:
Ian Tenney May 15
Replying to @iftenney
2/3 Learned mixing weights & cumulative scores from probing models show which layers best represent common NLP tasks. Left-to-right is layer 0 (embeddings) through layer 24 of BERT-large:
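The "learned mixing weights" described in this tweet are presumably ELMo-style scalar mixes: a softmax-normalized weighted sum over the per-layer activations, so the learned weights indicate which of BERT's layers a probing task draws on. A minimal sketch, assuming 25 layer outputs (embeddings plus 24 transformer layers of BERT-large) stacked into one array; `scalar_mix` and its arguments are illustrative names, not the paper's actual code:

```python
import numpy as np

def scalar_mix(layer_acts, weights, gamma=1.0):
    """ELMo-style scalar mix of per-layer activations.

    layer_acts: array of shape (num_layers, seq_len, hidden)
    weights:    unnormalized per-layer scalars, shape (num_layers,)
    gamma:      learned global scale
    """
    # Softmax-normalize the per-layer weights (stable via max-shift).
    w = np.exp(weights - np.max(weights))
    w = w / w.sum()
    # Weighted sum over the layer axis -> (seq_len, hidden).
    return gamma * np.tensordot(w, layer_acts, axes=1)

# Illustrative usage: 25 "layers" of random activations for a 7-token sentence.
acts = np.random.randn(25, 7, 1024)
mixed = scalar_mix(acts, np.zeros(25))  # uniform weights -> mean over layers
```

With all weights equal (as at initialization here), the mix reduces to a simple average over layers; training the weights jointly with a probing classifier is what lets the normalized weights be read off as per-layer importance.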
Ian Tenney May 15
1/3 Check out our draft on arXiv! (with and Ellie Pavlick)
Ian Tenney Feb 24
Replying to @iftenney
Now updated with BERT results!
Ian Tenney Dec 20
I suppose I should tweet now :) Check out our probing work at ICLR! Joint work with , Ellie Pavlick, , and the JSALT Sentence Representations team.