Daniël de Kok 🦀 ❄️
Researcher in computational linguistics, c(rust)acean 🦀, Nix/NixOS ❄️, occasional tinkerer with electronics. Dad of a Lego queen 👸. Opinions are my own.
4,403
Tweets
129
Following
269
Followers
Tweets
Daniël de Kok 🦀 ❄️ Jan 31
Just pushed the version 0.1.0 of the `sentencepiece` crate, which provides a rustic interface to the sentencepiece unsupervised text tokenizer.
Daniël de Kok 🦀 ❄️ Jan 30
Back home from a great , many thanks to the organizers!
Daniël de Kok 🦀 ❄️ Jan 26
Nix is so extremely nice. The PyTorch 1.4.0 headers are not compatible with gcc 9. Switch to the gcc8Stdenv build environment, and we can pretend as if gcc 9 never happened.
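The `gcc8Stdenv` trick from the tweet can be sketched as a small Nix derivation. The package name, version, and inputs below are illustrative assumptions, not taken from the author's actual project; only the swapped build environment is the point:

```nix
# Minimal sketch: build against the PyTorch 1.4.0 headers with GCC 8 by
# using gcc8Stdenv instead of the default stdenv (which would be gcc 9).
with import <nixpkgs> {};

gcc8Stdenv.mkDerivation {
  pname = "torch-extension";   # hypothetical package
  version = "0.1.0";
  src = ./.;
  buildInputs = [ python3Packages.pytorch ];
}
```

Everything downstream of `mkDerivation` then compiles with GCC 8, so the incompatible-headers problem never surfaces.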
Daniël de Kok 🦀 ❄️ Retweeted
zimbatm Jan 25
direnv 2.21.0 has been released. This is a massive release!
* `.envrc` files are now loaded with `set -euo pipefail`, which is probably going to expose issues in existing scripts.
* Ctrl-C now actually works during reload in bash and zsh.
And more!
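A quick illustration of why loading `.envrc` under `set -euo pipefail` can expose issues in existing scripts: a reference to an unset variable, which previously expanded silently to the empty string, now aborts the load. The variable name below is made up for the demo:

```shell
# Without -u, an unset variable silently expands to nothing:
bash -c 'echo "PATH_add $MY_UNSET_DIR"'
# prints: PATH_add

# Under the options direnv 2.21.0 now applies, the same line is a hard error:
bash -c 'set -euo pipefail; echo "PATH_add $MY_UNSET_DIR"' \
  || echo "aborted with nonzero status"
```

The same goes for `-e` (any failing command stops the script) and `pipefail` (a failure anywhere in a pipeline fails the pipeline), so previously ignored errors in `.envrc` files become visible.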
Daniël de Kok 🦀 ❄️ Retweeted
Laurent Mazare Jan 25
Our and bindings for are now compatible with PyTorch 1.4, updated opam package/crate are available! A bunch of examples can be found on GitHub, RNNs, GAN, Reinforcement Learning...
Daniël de Kok 🦀 ❄️ Jan 25
Replying to @gideondk @agonzalezro
Keep it short, keep it sweet. The goal is to convince an interviewer to do a follow-up, not to tell your life story. I agree with that; burying things in resumés is often not really a thing in IT outside FANG-like companies.
Daniël de Kok 🦀 ❄️ Jan 25
Dear lazyweb, I mean fellow crustaceans. I am working on a crate that binds a C library, but can only reasonably unit-test it by providing a ~500KB data file. Would you find this acceptable for a tiny binding? Alternative: feature-gate those tests.
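The feature-gate alternative mentioned in the tweet could look like this in `Cargo.toml`; the feature name `model-tests` is hypothetical:

```toml
# Sketch: an opt-in feature so the tests that need the ~500KB data file
# only compile when explicitly requested.
[features]
model-tests = []
```

The tests in question would then be guarded with `#[cfg(feature = "model-tests")]`, so a plain `cargo test` skips them and `cargo test --features model-tests` runs them, keeping the data file out of the default build.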
Daniël de Kok 🦀 ❄️ Retweeted
Ed Hawkins Jan 6
Australia: you have just experienced the future.
Daniël de Kok 🦀 ❄️ Retweeted
Rutger Bregman Jan 20
For some reason, I haven't been invited to Davos this year. Was it something I said? 🤔
Daniël de Kok 🦀 ❄️ Jan 18
Replying to @zimbatm
Nix, magit, Emacs evil, direnv, tmux, fzf
Daniël de Kok 🦀 ❄️ Jan 18
0.45 dependency parsing LAS improvement on Dutch, with only half the parameters. That's what I call a productive week 🥂. Enjoy the weekend!
Daniël de Kok 🦀 ❄️ Retweeted
Saoirse Shipwreckt Jan 15
Firefox market share vs Mozilla Foundation chair salary (2.5 million/year in 2018)
Daniël de Kok 🦀 ❄️ Retweeted
Michal Strehovský Jan 9
1/7 Did you ever need to run a piece of C# code on Windows 3.11? Me neither, but I did it anyway. A thread.
Daniël de Kok 🦀 ❄️ Jan 9
Finally made the switch from Tensorflow to PyTorch. The tch-rs bindings for libtorch are really nice. Learning the libtorch API and porting the BERT model from Hugging Face Transformers to Rust was really quick:
Daniël de Kok 🦀 ❄️ Jan 8
I am very happy that we use far less gas for heating/hot water than the average of all house types. But that bar chart is weird. The deltas are probably proportional, but the bars themselves…
Daniël de Kok 🦀 ❄️ Jan 7
Replying to @kowey
They rolled out a large fleet here in Groningen as well. They are awesome for cyclists! No more burnt diesel exhaust when you are cycling behind a bus!
Daniël de Kok 🦀 ❄️ Jan 6
Replying to @o0ignition0o @dues__ @rustlang
Why do you commit Cargo.lock for library crates, given that they are ignored by Cargo when the library is used as a dependency? Documentation?
Daniël de Kok 🦀 ❄️ Jan 6
Replying to @danieldekok
The final, fine-tuned models use each layer almost equally. The interesting patterns here seem to be: for lemmatization, almost all layers are used equally; for the other tasks, use increases per layer. 4/4
Daniël de Kok 🦀 ❄️ Jan 6
Replying to @danieldekok
However, for morphological and POS tagging, it seems that for ML BERT the classifiers rely most on the initial/middle layers, whereas for BERTje they rely more uniformly on all the layers. 3/4
Daniël de Kok 🦀 ❄️ Jan 6
Replying to @danieldekok
This can give hints where information is represented in the initial encoder. Interestingly, the information seems to be distributed quite differently between ML BERT and BERTje. In both, the last layers are used most for dependencies and the initial layers for lemmatization. 2/4