@noahtren
and with deep knowledge tracing, you can do (artificial) supervised learning on your (natural) supervised learning 😛 arxiv.org/abs/1506.05908
Emmanuel Ameisen
@mlpowered
Jan 16
Getting into spaced repetition for memory thanks to @michael_nielsen and @andy_matuschak's work.
It feels like unsupervised vs supervised learning.
Normal reading is unsupervised.
Spaced repetition provides labels you get tested on at successive epochs, to minimize memory loss.
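The "labels at successive epochs" idea can be sketched as a toy review scheduler: each card's interval grows after a successful recall and resets after a miss. This is a minimal illustration of the general idea, not any specific algorithm from the linked work; the doubling factor and one-day reset are assumptions.

```python
def next_interval(interval_days, recalled, growth=2.0):
    """Toy spaced-repetition update: grow the review interval after a
    successful recall (the "label" was correct), reset to one day on a miss.
    The growth factor and reset rule are illustrative assumptions."""
    if recalled:
        return interval_days * growth
    return 1.0

# Simulate one card over four reviews: correct, correct, miss, correct.
interval = 1.0
history = []
for recalled in [True, True, False, True]:
    interval = next_interval(interval, recalled)
    history.append(interval)

print(history)  # [2.0, 4.0, 1.0, 2.0]
```

Each successful "epoch" pushes the next test further out, while a failure pulls the card back into frequent review, which is the mechanism that minimizes memory loss.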