@JanelleCShane

I’m not kidding about the all-conversations-lead-to-crochet-hats thing either. My prompt is the bit in bold. The neural net did the rest. pic.twitter.com/2DwWBSJ7ue

Janelle Shane
@JanelleCShane
Jul 6

Ravelry folks! In particular, crocheters! I have my first neural net-generated patterns posted to the LSG HAT3000 thread.
Did it work? I have no idea. Some of the crochet notation in the dataset is VERY old, and it's not consistent at all. The results look like patterns maybe? pic.twitter.com/gwIjA8YYG2

Janelle Shane
@JanelleCShane
Jul 6

Here's another batch sampled from the same model, but with the top_k truncation set to 0 instead of 40. This allows the model to use lower-probability predictions. They are noticeably lower-probability. pic.twitter.com/QRae3Vy4UM
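For readers wondering what the top_k setting does: it truncates the model's next-token distribution to the k most probable tokens before sampling, and setting it to 0 disables truncation so the long, low-probability tail stays in play. A minimal sketch in Python, assuming the usual softmax-over-logits setup; `top_k_probs` is an illustrative helper of mine, not HAT3000's actual sampling code.

```python
import math

def top_k_probs(logits, k=40):
    """Softmax over logits, but with every token outside the k most
    likely forced to probability 0. k=0 disables truncation, so the
    model can sample from its full (long, weird) tail."""
    if 0 < k < len(logits):
        cutoff = sorted(logits, reverse=True)[k - 1]
        logits = [x if x >= cutoff else float("-inf") for x in logits]
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]  # exp(-inf) == 0.0
    total = sum(exps)
    return [e / total for e in exps]

# With k=2, only the two highest-scoring tokens can ever be sampled;
# with k=0, every token keeps some nonzero probability.
print(top_k_probs([3.0, 2.0, 1.0, 0.0], k=2))
print(top_k_probs([3.0, 2.0, 1.0, 0.0], k=0))
```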

Janelle Shane
@JanelleCShane
Jul 7

It’s Hang in Wind! Humans cannot comprehend what it does. twitter.com/emersTweeting/…

Janelle Shane
@JanelleCShane
Jul 7

Another batch of patterns, this time with truncation top_k = 60, a compromise that produces weird names but more doable patterns than with truncation turned off. pic.twitter.com/c6joztWg9k

Janelle Shane
@JanelleCShane
Jul 7

If I don't give it a specific prompt, sometimes the neural net starts generating things that it remembers from its original training. I love its recovery method: twist the conversation back toward crochet hats. pic.twitter.com/Wi5N4PvJAv

Janelle Shane
@JanelleCShane
Jul 7

Meanwhile Brim Hat Pattern #1708 is revealing itself to be an all-devouring eldritch horror twitter.com/persipan/statu…

Janelle Shane
@JanelleCShane
Jul 7

And if you can wear Crochet Cap Pattern #3615, you are already lost to the mortal realms
twitter.com/manyartsofem/s…

Janelle Shane
@JanelleCShane
Jul 7

Here are some more hats from truncation top_k=60. I suspect that some of these don't result in hats? I probably should add more truncation to rein in the creativity a bit, but I'm enjoying these names so much. pic.twitter.com/OdQ38sEQmf

Janelle Shane
@JanelleCShane
Jul 7

Why do all the neural net's hats end up ballooning into universe-devouring monstrosities?
twitter.com/joannastar/sta…

Janelle Shane
@JanelleCShane
Jul 7

Decided to try a new method for inducing the neural net to generate interesting pattern titles: a conservative k=40 truncation but prompted with the phrase "Lord Vader!"
Had the desired effect on the titles, as well as unintended effects on some of the patterns themselves. pic.twitter.com/LVHsjPEktF

Janelle Shane
@JanelleCShane
Jul 7

oh no it's going to be space-devouring hyperbolic ripples again, isn't it?
all of these patterns seem to be traps
twitter.com/MiniGirlGeek/s…

Janelle Shane
@JanelleCShane
Jul 7

tried to avoid hyperbolic chaos by reducing the sampling temperature from 1.0 to 0.8.
now the neural net is playing it safer (arguably TOO safe in those 1st two patterns), but is it still going hyperbolic? pic.twitter.com/dYEeMBEJIV
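For context on the temperature knob: it divides the logits before the softmax, so values below 1.0 concentrate probability on the likeliest tokens (safer, blander hats) while values above 1.0 flatten the distribution (weirder hats). A minimal sketch, with an illustrative function of my own rather than the thread's actual code:

```python
import math

def softmax_temp(logits, temperature=1.0):
    """Divide logits by temperature, then softmax. Lower temperature
    sharpens the distribution toward the top tokens; higher flattens it."""
    scaled = [x / temperature for x in logits]
    m = max(scaled)
    exps = [math.exp(x - m) for x in scaled]
    total = sum(exps)
    return [e / total for e in exps]

# Dropping the temperature from 1.0 to 0.8 boosts the top token:
print(softmax_temp([2.0, 1.0, 0.0], 1.0)[0])  # ≈ 0.665
print(softmax_temp([2.0, 1.0, 0.0], 0.8)[0])  # ≈ 0.731
```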

Janelle Shane
@JanelleCShane
Jul 8

Update: these patterns totally are going hyperbolic. If anything, faster. twitter.com/Totesmyname/st…

Janelle Shane
@JanelleCShane
Jul 9

Apparently exploding hats are nearly inevitable. @robin_h_p explains it well in the linked thread: each small excess increase ends up blowing up because it's multiplied by later rounds. Meanwhile HAT3000 thinks it's still matching human patterns really well.
twitter.com/robin_h_p/stat…
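The blow-up mechanism is easy to illustrate numerically: a flat crocheted circle needs only a constant number of extra stitches per round (linear growth), but if each round instead works even a modest fraction more stitches than the last, the excess compounds and the stitch count grows exponentially, ruffling the fabric into a hyperbolic surface. A toy calculation, with numbers that are purely illustrative rather than taken from any actual HAT3000 pattern:

```python
def stitch_counts(start=10, rounds=20, excess_rate=0.3):
    """Stitches per round when every round works 30% more stitches
    than the previous one. Because the excess multiplies round over
    round, the counts grow exponentially instead of linearly."""
    counts = [start]
    for _ in range(rounds - 1):
        counts.append(round(counts[-1] * (1 + excess_rate)))
    return counts

# 10 stitches balloons to well over a thousand within 20 rounds,
# while a flat circle would have added only a handful per round.
print(stitch_counts())
```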

Janelle Shane
@JanelleCShane
Jul 9

I trained HAT3000 for another 85 iterations to see if it might be able to fix its explodey hat problem. (here sampled with temperature 1.0, top_k 60, and eccentric prompt text)
The one called "The End..." looks dangerous even to me. pic.twitter.com/LN4Y4L2d07

Janelle Shane
@JanelleCShane
Jul 9

Tom Boy Pocket Bamboozles did not explode! It does have tentacles though twitter.com/lbreeggem/stat…

Janelle Shane
@JanelleCShane
Jul 10

Meanwhile, Lumpy Top is another hyperbolic brain coral. @PaulaSimone calculates 1605 stitches in each of the outermost rounds. twitter.com/PaulaSimone/st…

Janelle Shane
@JanelleCShane
Jul 11

Over 3500 stitches in one row. This hyperbolic neural net crochet pattern is now eating both yarn and sanity.
twitter.com/Persipan/statu…

Janelle Shane
@JanelleCShane
Jul 11

Results from later training points (281 & 320). Prompting with eccentric text ("Nobody expected stegosaurs in spaceships." etc.) seemed to induce it to generate interesting titles. Same temp=1 & top_k=60.
I don’t know how it decided to try counting piglets instead of stitches. pic.twitter.com/5R5uyoE7kT

Janelle Shane
@JanelleCShane
Jul 21

Brim Hat Pattern #1708 is still devouring yarn. Starting to become mildly terrifying. twitter.com/persipan/statu…

Janelle Shane
@JanelleCShane
Jul 26

I exported a huge batch of unfiltered HAT3000 patterns - you can read the draft here. docs.google.com/document/d/1Fj…
Note that some of the patterns appear to be, um, explicit. HAT3000 is a descendant of GPT-2, which Saw Some Shit on the internet. Prompt suggestions welcome

Janelle Shane
@JanelleCShane
Aug 29

The larger GPT-2-774 model has been released, but I don't have a way to finetune it on crochet yet. I found, though, that when I prompt it with a title + 2 lines of an existing crochet pattern, it has seen enough crochet on the internet that it already knows what to do. pic.twitter.com/6Thj22BMgu

Janelle Shane
@JanelleCShane
Sep 4

Lumpy top looks fabulous with red edge! twitter.com/paulasimone/st…

Janelle Shane
@JanelleCShane
Sep 5

I summarized the (faintly disquieting) story of HAT3000 here:
twitter.com/JanelleCShane/…