deepset
Bringing cutting-edge NLP to the industry via open-source. Check out: and
113 Tweets
98 Following
236 Followers
Tweets
deepset Feb 3
Replying to @huggingface @spacy_io @fastdotai
The new release also includes many other exciting features: checkpointing & caching, AMP, SageMaker integration, flexible LR schedules, early stopping, cross-validation, Windows support, and many more! Thanks to all contributors! Details: (4/N)
deepset Feb 3
Replying to @huggingface @spacy_io @fastdotai
is built on top of the great by . With today's release of v0.4.1, we take a huge step towards framework compatibility by allowing users to convert models seamlessly between FARM <-> transformers and to load models from 's model hub. (3/N)
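For readers who want to try the round trip described above, here is a minimal sketch. The `AdaptiveModel.convert_from_transformers` / `convert_to_transformers` method names and the `task_type` argument are assumptions based on FARM's AdaptiveModel API; check the v0.4.1 release notes for the exact signatures.

```python
# Sketch of the FARM <-> transformers round trip described in the tweet above.
# Method names and arguments are assumptions based on FARM's AdaptiveModel API;
# consult the FARM v0.4.1 docs for the exact signatures.
from farm.modeling.adaptive_model import AdaptiveModel

# Load any model from the Hugging Face model hub and convert it into a FARM AdaptiveModel
farm_model = AdaptiveModel.convert_from_transformers(
    "bert-base-cased",
    device="cpu",
    task_type="question_answering",
)

# ... fine-tune with FARM ...

# Convert back into a transformers model to use it in the Hugging Face ecosystem
transformers_model = farm_model.convert_to_transformers()
```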
deepset Feb 3
Replying to @huggingface @spacy_io @fastdotai
That's why frameworks should be compatible with each other instead of building borders. While there are good reasons to have different tooling for different user groups and use cases, we should build an ecosystem rather than silos. (2/N)
deepset Feb 3
is more than just public code. It's a mindset of sharing, being transparent, and collaborating across organizations. It's about building on the shoulders of other projects and advancing the state of technology together. (1/N)
deepset Jan 29
Today's is heavily fueled by the power of . Glad to announce that we are now a member of 's Inception program! Looking forward to even more GPU power and acceleration of our models via & co
deepset Jan 28
Replying to @deepset_ai
It's based on the nice work by Zhengyan Zhang & Xiaozhi Wang 🧡
deepset Jan 28
Replying to @deepset_ai
As we believe in , you can find the public Google Slides here: Feel free to use them in your own slides & comment on missing LMs!
deepset Jan 28
It's challenging to keep track of all the latest out there. What was the difference again between and ? What's the core idea behind ? Here's a little (not comprehensive) that we use for workshops.
deepset Jan 27
Replying to @burkov @seb_ruder
We can also recommend the multilingual XLM-R. It gives impressive results on German.
deepset Jan 27
Replying to @howtodowtle @seb_ruder
Maybe the next version :). Our main goal was to make sure people find it. We actually considered ALBERT back then. Good that we didn't pick that one. Would have been a mess :D
deepset Retweeted
Sebastian Ruder Jan 27
Transfer learning is increasingly going multilingual with language-specific BERT models: - 🇩🇪 German BERT - 🇫🇷 CamemBERT, FlauBERT - 🇮🇹 AlBERTo - 🇳🇱 RobBERT
deepset Retweeted
Sebastian Ruder Jan 27
New NLP News: NLP Progress, Retrospectives and look ahead, New NLP courses, Independent research initiatives, Interviews, Lots of resources (via )
deepset Retweeted
Ivan Bilan Jan 20
GitHub Repo Spotlight №3: Transfer Learning library for NLP called FARM: With FARM you can easily use BERT, XLNet, and others for any downstream NLP task. FARM is great for fast prototyping too.
deepset Jan 17
Are you doing in a non-English language? Try the multilingual XLM-R model! It gave us amazing results in German (for the SOTA chasers: yes, it also outperforms previous results with BERT & co). Blog:
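A minimal sketch of trying XLM-R via the Hugging Face transformers library, for readers who want to follow along; the model id and the sequence-classification head below are illustrative assumptions, not the exact setup from the linked blog post.

```python
# Illustrative sketch: load the multilingual XLM-R model with Hugging Face transformers.
# The model id and the classification head are assumptions, not the setup from the blog post.
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("xlm-roberta-base")
model = AutoModelForSequenceClassification.from_pretrained("xlm-roberta-base", num_labels=2)

# XLM-R is multilingual, so the same checkpoint handles German (and ~100 other languages)
inputs = tokenizer("Das Modell funktioniert erstaunlich gut.", return_tensors="pt")
logits = model(**inputs)[0]  # shape: (1, num_labels)
```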
deepset Retweeted
PyTorch Jan 15
v1.4: customizable mobile builds, distributed model parallelism via experimental RPC API, Java bindings, chaining LR schedulers. Summary: Release Notes: Last release for Python 2 (bye bye!)
deepset Retweeted
Google AI Jan 16
Introducing Reformer, an efficiency-optimized architecture based on the Transformer model for language understanding that can handle context windows of up to 1 million words, all on a single accelerator with only 16 GB of memory. Read all about it ↓
deepset Retweeted
Alon Talmor Jan 1
We present our new year special: "oLMpics - On what Language Model pre-training captures", exploring what symbolic reasoning skills are learned from an LM objective. We introduce 8 oLMpic games and controls for disentangling pre-training from fine-tuning.
deepset Retweeted
Sebastian Ruder Dec 18
Great to see VCs being excited about NLP. Recent examples: - 's investment in : - 's investment in : - "Entering the Golden Age of NLP" by :
deepset Retweeted
DeepMind Dec 16
What does it mean to understand language? We argue that human-like understanding requires complementary memory systems and rich representations of situations. A roadmap for extending ML models toward human-level language understanding:
deepset Dec 15
As promised, here are the slides from Malte's talks in Warsaw! - Keynote at : - Talk at HumanTech: Reach out to us if you have a large Polish text dataset (> 10 GB) and want to train a Polish BERT or ALBERT.