distillpub
Supporting Clarity in Machine Learning.
Tweets: 116 · Following: 81 · Followers: 21,013
Tweets
distillpub retweeted
Trask Sep 26
Optimizers like SGD, Adam, and Adagrad are very mysterious things. has created which get you that *gut instinct* of how they work!!! For - learn by playing with these visualizations! (yes! play!)
distillpub retweeted
Brandon Rohrer Sep 26
Thank you to Gabriel Goh for this illustrated interactive explanation of the alphabet soup of optimization methods used in neural networks. I was just trying to wrap my head around these equations two days ago. This helped tremendously.
distillpub Aug 14
Distill Update 2018 — Things that worked well, ways we aim to improve, and some policy changes.
distillpub Jul 31
distillpub retweeted
Ben Sprecher Jul 26
is a treasure. This new article on differentiable image parameterizations is mesmerizing, beautiful, interactive, informative, fascinating, and incredibly understandable. Wow.
distillpub retweeted
Alex Mordvintsev Jul 25
This is the main thing I was working on during the last year, together with and
distillpub Jul 25
Differentiable Image Parameterizations - A new Distill article by & . A powerful, under-explored tool for neural network visualizations and art.
distillpub Jul 9
Feature-wise transformations - A new Distill article by , Ethan Perez, , Florian Strub, , Aaron Courville, Yoshua Bengio
distillpub May 29
Replying to @NospmasD
(Just to clarify, is an independent academic journal, although we're grateful for the support of many industry and academic research groups.)
distillpub retweeted
Chris Olah May 18
It always bugged me that we didn't make this diagram in the feature visualization appendix clickable. Well, now it is: I expect very few people will notice, but it's satisfying for things to be "right".
distillpub retweeted
Ludwig Schubert May 18
At we're super serious about easy reproducibility. This serious. ;-)
distillpub May 16
Why we're asking: we want examples of papers to aspire towards on different dimensions for the new Distill review process.
distillpub May 16
Replying to @agharashyam
Do you mean this paper? What aspects of it strike you as extraordinary?
distillpub May 16
Replying to @denzil_correa
Do you have specific examples of papers that do a great job of this? We're especially interested in examples that our authors could aspire towards.
distillpub May 16
What are examples of papers that do something extraordinary in an aspect of paper writing that's normally secondary? What does it even look like to really blow it out of the water in...
* reviewing related work?
* facilitating reproduction?
* diagrams?
* writing style?
* etc.
distillpub May 15
and Distill? We may have a style guide at some point that could be cited.
distillpub May 15
We're redesigning the Distill peer review process and would love community comments on our draft reviewer worksheet:
distillpub May 15
The Distill template and the source for all articles are open source: Those equations are aligned divs, with LaTeX in some and plain text in others.
distillpub retweeted
Arvind Satyanarayan May 2
As an HCI researcher, I'm heartened by & advocating for more human-centric perspectives in ML research. I'm excited to help make the premier venue for publishing these ideas and anticipate benefits to articulating them within an interactive medium
distillpub May 2
We're really excited that has accepted a position as a Distill editor! Arvind, who is about to start as a professor at MIT CSAIL, will have a special focus on articles at the intersection of ML & HCI.