Henrik Joreteg 11 Sep 18
- Sub 100kb main bundle
- 90+ Lighthouse perf score
- Data-driven sitemap.xml with relevant crawlable URLs
- PWA deployed as static site (ridiculously scalable)

Explain to me again why we should SSR and run a huge farm of node servers that renderToString?
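For readers unfamiliar with the runtime SSR being questioned above, here is a minimal sketch, assuming an Express server and react-dom/server's renderToString; the App component, HTML shell, and port are illustrative placeholders, not anything from this thread.

```ts
import express from "express";
import * as React from "react";
import { renderToString } from "react-dom/server";

// Placeholder root component standing in for a real app.
function App() {
  return React.createElement("h1", null, "Hello from the server");
}

const app = express();

app.get("*", (_req, res) => {
  // Every request pays the cost of rendering the React tree on the server.
  const html = renderToString(React.createElement(App));
  res.send(
    `<!doctype html><html><body><div id="root">${html}</div></body></html>`
  );
});

app.listen(3000);
```

The static-site approach described in the tweet does this work once at build time (or not at all), so no Node servers render per request.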
Paul Kinlan
Indexing is delayed for pure client-side sites. The Google indexer is two-pass: the first run is without JS, then a week later it's with JS (or thereabouts)
Henrik Joreteg 12 Sep 18
Replying to @Paul_Kinlan
That's a real bummer, I did not know this. 😔
Paul Kinlan 12 Sep 18
Replying to @HenrikJoreteg
turns out running browsers at scale is hard :)
Peter 12 Sep 18
Why is there such a huge time span between first and second pass?
A Web Dev in Dublin 12 Sep 18
It's 2 different 'engines'. has a talk online, I think, that can give you more.
Richard Hearne 12 Sep 18
Would be super to get some further insight/confirmation from web search side. Cc
Paul Kinlan 13 Sep 18
See the other part of the thread. The video with John in it is there.
Paul Kinlan 12 Sep 18
Replying to @HenrikJoreteg
For context for anyone on the thread:
🛠 Barry Adams ⌨️ 12 Sep 18
And here's some of my writing on the topic:
Joseph Scott 8 May 19
Is this still happening after the big Chrome update for Googlebot?
Henrik Joreteg 8 May 19
One of the Googlebot engineers commented on that. Apparently there's still a delay.