Quoc Le · Jan 28
New paper: Towards a Human-like Open-Domain Chatbot. Key takeaways:
1. "Perplexity is all a chatbot needs" ;)
2. We're getting closer to a high-quality chatbot that can chat about anything
Paper: Blog:
lerner zhang · Jan 29
Replying to @quocleix @xpearhead @lmthang
By perplexity do you mean average perplexity? I wonder if a weighted average perplexity would be better?
Thang Luong
Perplexity for a language model, by definition, is computed by first averaging the negative log-probabilities of all predictions and then exponentiating. Does that help explain?
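Luong's definition can be sketched in a few lines. This is a toy illustration, not the paper's evaluation code; the probability values below are made up, and in the paper the computation runs over subword units rather than words:

```python
import math

def perplexity(token_probs):
    """Perplexity = exp(mean of the negative log-probabilities
    the model assigned to the observed tokens)."""
    neg_logs = [-math.log(p) for p in token_probs]
    return math.exp(sum(neg_logs) / len(neg_logs))

# A model that assigns probability 0.25 to every token has
# perplexity ~4: on average it is as uncertain as a uniform
# choice among 4 options.
print(perplexity([0.25, 0.25, 0.25, 0.25]))
```

Note that averaging happens in log space before exponentiating, so this is a geometric (not arithmetic) mean of the inverse probabilities, which is one way to read zhang's "weighted average" question.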
Stéphane Guillitte · Jan 29
Replying to @lmthang @lerner_adams and 2 others
Is perplexity calculated on words or BPE subwords?
Thang Luong · Jan 29
Replying to @guillittes @lerner_adams and 2 others
It's on subwords (with 8K units).
lerner zhang · Jan 29
Replying to @lmthang @quocleix @xpearhead
Thanks
Danny Iskandar · Jan 29
Replying to @lmthang @lerner_adams and 2 others
Is there a simple explanation in terms normal people could understand?