Jake Bohall Dec 10
Server auto-scales resources for demand. G just went from 100 urls/s to 400 urls/s -- killing the server. We tried 429, but as soon as it's pulled, G comes back at 400... The GSC setting option maxes out at 10/s (meh!)... Testing a forced delay after 100 requests. Any alt suggestions?
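A minimal sketch of the throttle Jake describes, assuming a Python/Flask server: count Googlebot requests in a one-second sliding window and answer 429 with a Retry-After header once a threshold is crossed. The 100 req/s ceiling and the user-agent-only check are illustrative assumptions, not Jake's actual setup (verifying genuine Googlebot would also need a reverse-DNS lookup).

    import time
    from collections import deque

    from flask import Flask, Response, request

    app = Flask(__name__)

    MAX_HITS_PER_SECOND = 100  # assumed ceiling; tune for your server
    _recent_hits = deque()     # timestamps of recent Googlebot requests

    @app.before_request
    def throttle_googlebot():
        # Only throttle the crawler; regular visitors pass straight through.
        if "Googlebot" not in request.headers.get("User-Agent", ""):
            return None

        now = time.time()
        # Slide the window: drop timestamps older than one second.
        while _recent_hits and now - _recent_hits[0] > 1.0:
            _recent_hits.popleft()

        if len(_recent_hits) >= MAX_HITS_PER_SECOND:
            # 429 asks the crawler to back off; Retry-After hints how long.
            return Response("Too Many Requests", status=429,
                            headers={"Retry-After": "1"})

        _recent_hits.append(now)
        return None

As Jake notes, Googlebot does slow down while the 429s are being served, but ramps back up once they stop, so this only caps the rate rather than permanently lowering it.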
Jenny Halasz #BlackLivesMatter #MasksForAll Dec 12
Replying to @jakebohall @JohnMu
Old school, but have you tried crawl delay in robots?
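For reference, the directive Jenny means is a per-user-agent line in robots.txt; as John confirms below, Google has never honored it, though Bing and some other crawlers do. The 10-second value here is just an example:

    User-agent: bingbot
    Crawl-delay: 10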
JR Oakes Dec 12
I thought Google removed support for that in September? Haven't tested it though. Jake, you should use this to validate that they removed it!
Rob Woods 💻🎣🏕️ Dec 12
that was my impression too... I thought they stopped supporting it
🍌 John 🍌
We never supported crawl-delay in robots.txt -- the numbers given there tend to be unrealistic for the most part.
Jake Bohall Dec 13
Replying to @JohnMu @robdwoods and 3 others
Any chance the crawl controls in GSC could some day have a wider range than 0-10 urls/s ;)
Jenny Halasz #BlackLivesMatter #MasksForAll Dec 12
Replying to @JohnMu @robdwoods and 2 others
Whoops, guess I missed that! Thanks guys.
Dan of the Dead 🎃👻 Dec 12
I think Bing supports it... (cc ?)
JR Oakes Dec 12
Replying to @JohnMu @robdwoods and 3 others
Me: Does Google even support crawl-delay? <searches: google "crawl-delay"> I blame :-)
JR Oakes Dec 12
Replying to @JohnMu @robdwoods and 4 others
I guess, cc'ing too, 👋