Hugging Face
Solving NLP one commit at a time!
1,050 Tweets
20 Following
13,881 Followers
Tweets
Hugging Face Retweeted
Julien Chaumond 8 h
... which is why we should mutualize trainings and SHARE OUR WEIGHTS!
Hugging Face Retweeted
Gabriel Altay 7 h
Taking the new Rust tokenizers for a spin on the Kensho Derived Wikimedia Dataset (KDWD). Get a 14x speedup using BERT word piece tokenizer
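The Rust-backed `tokenizers` library mentioned above exposes the BERT WordPiece model directly from Python. A minimal sketch, assuming the `tokenizers` package is installed; the toy vocabulary here is invented for illustration, not taken from any real BERT checkpoint:

```python
from tokenizers import Tokenizer
from tokenizers.models import WordPiece
from tokenizers.pre_tokenizers import Whitespace

# Toy WordPiece vocabulary: "##" marks word-internal subword pieces.
vocab = {"[UNK]": 0, "hello": 1, "world": 2, "hug": 3, "##ging": 4}

tokenizer = Tokenizer(WordPiece(vocab, unk_token="[UNK]"))
tokenizer.pre_tokenizer = Whitespace()  # split on whitespace/punctuation first

encoding = tokenizer.encode("hello hugging world")
print(encoding.tokens)  # → ['hello', 'hug', '##ging', 'world']
```

WordPiece greedily matches the longest vocabulary prefix, so "hugging" splits into "hug" + "##ging"; the speedup claimed in the tweet comes from this matching loop running in Rust rather than Python.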
Hugging Face Retweeted
Teven Le Scao 6 h
Hi world! I'm excited to start working in Paris as a research engineer this week :)
Hugging Face Retweeted
Facebook AI 7 h
Facebook AI's latest research on multimodal bitransformers is the first multimodal model to be part of the library! Released here:
Hugging Face Retweeted
clara ma 8 h
Hello Twitterverse 👋🏻 Started my second day as Chief of Staff at 🤗 & I'm so excited to delve into the world of NLP. My favorite use case of Transformers 🤖:
Hugging Face 9 h
Replying to @huggingface
➡️ Your model has a page on and everyone can load it using AutoModel.from_pretrained("username/model_name")
Hugging Face 9 h
Replying to @huggingface
1/ Call `.save_pretrained("model_name")` 2/ Create an account on and use the CLI to upload 3/ Add a model card to the repo with: model description, training details (dataset/preprocessing/hyper-params/...), eval results, intended use/limitations...
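The sharing steps in the tweet above can be sketched as a local round trip, assuming `transformers` with PyTorch is installed. The tiny randomly initialized BERT is a stand-in for a real fine-tuned model, and the upload and model-card steps are only noted in comments since they require an account:

```python
import tempfile
from transformers import BertConfig, BertModel

# Stand-in for a fine-tuned model: a tiny randomly initialized BERT.
config = BertConfig(vocab_size=100, hidden_size=32, num_hidden_layers=2,
                    num_attention_heads=2, intermediate_size=64)
model = BertModel(config)

with tempfile.TemporaryDirectory() as save_dir:
    # 1/ Save weights and config to a local directory.
    model.save_pretrained(save_dir)
    # 2/ Upload the directory via the CLI (needs an account), then
    # 3/ add a model card (README) with training details and eval results.
    # Loading back works the same way as loading a shared repo id:
    reloaded = BertModel.from_pretrained(save_dir)

print(reloaded.config.hidden_size)  # → 32
```

Once uploaded, the same `from_pretrained` call accepts the `"username/model_name"` repo id from the reply above instead of a local path.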
Hugging Face 9 h
A few people asked us to talk more about our model sharing feature, so here goes:
Hugging Face Retweeted
Thomas Wolf Feb 4
François () is both an independent ML researcher investigating sparse efficient models and... an early angel investor in 🤗. Happy that he agreed to share some of his knowledge and experience on sparse models in a series of posts! First one is here
Hugging Face Retweeted
François Lagunas Feb 4
I just published my first post on Medium, "Is the future of Neural Networks Sparse?". Enjoy!
Hugging Face Retweeted
Julien Chaumond Feb 4
Guest blog post from angel investor on sparse tensors in neural nets. I am VERY excited by what's going to happen in sparsity-land this year 🔥
Hugging Face Retweeted
Martin Malmsten Feb 4
We just released a Swedish BERT model, an alpha ALBERT model and a BERT fine-tuned for NER based on texts from a wide range of sources (news, books, Wikipedia, legal e-deposits, government publications, etc.).
Hugging Face Retweeted
Anthony Goldbloom Feb 4
Kaggle forum mentions tend to be a leading indicator of what takes off. Our forums are closely followed: each topic receives an average of 400 views. 1/3
Hugging Face Retweeted
Anthony Goldbloom Feb 4
Replying to @huggingface
Looking at some of the things that are generating excitement at the moment: BERT, TPUs and are showing the most momentum. Anything else I should be querying? (Note: unlike previous Tweets in this series, these are quarterly mentions.) 4/3
Hugging Face Retweeted
Victor Sanh Feb 3
Thanks for having us and the interesting discussion! We were honored to be guests on arguably the best ML/NLP podcast out there!
Hugging Face Retweeted
Matt Gardner Feb 3
104: and talk to us about model distillation, in which you approximate a large model's decision boundary with a smaller model. After talking about the general area, we dive into DistilBERT.
Hugging Face Retweeted
Julien Chaumond Feb 3
Replying to @Wietsedv @douwekiela and 5 others
New models from: - (Dutch BERT), - at Facebook AI (MMBT, multi-modal model) - , et al. (FlauBERT, French-trained XLM-like) - , et al. at (UmBERTo, Italian CamemBERT-like)
Hugging Face Retweeted
Julien Chaumond Feb 3
The 2.4.0 release of transformers is **𝐌𝐀𝐒𝐒𝐈𝐕𝐄** thanks to our amazing community of contributors. 🔥
Hugging Face Retweeted
Martin Krallinger Feb 3
Thomas Wolf (HuggingFace Inc.) talking on Transfer Learning in NLP at NLPL. Tutorials at NAACL and EMNLP, code and model sharing, transformer libraries
Hugging Face Retweeted
jeremy barnes Feb 3
Replying to @huggingface
Now, Thomas Wolf is going to tell us a bit about transfer learning research from