- Stateless Rendering – Googlebot’s renderer doesn’t maintain state from one page load to the next – it views every page as a first-time visitor would. Features such as service workers (think push notifications and background syncing), local and session storage (data stored in the user’s browser or managed as part of a session), Web SQL and IndexedDB (client-side database storage), cookies, and the Cache API are not supported.
- When a page mixes server-rendered and client-rendered content, Googlebot crawls and indexes the server-rendered content first, then returns “when it has free resources” to render and index the client-side content. This second pass can take days or longer.
- Server-side rendering – Code that’s rendered server-side is immediately available for Google to crawl and index, without requiring a second pass to render the client-side content. The downside of this is that it uses more server resources up front and will probably slow down the initial page load time. However, this is something you may want to consider for your key content, if it’s an option.
- Hybrid rendering – Balancing client- and server-side rendering for user experience, performance, and SEO. Use server-side rendering for critical page sections, like the initial page layout and main content, and client-side rendering for non-critical page elements.
- Dynamic rendering – This is a new Google policy: you render the page server-side for Googlebot and serve normal client-side rendering to regular visitors. This may sound like “cloaking” – it is, but Google has revised its guidelines to allow it in this situation. Dynamic rendering is ideal for large and/or rapidly changing sites, where content could be stale by the time Google takes its second pass to render it.
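In practice, dynamic rendering comes down to routing requests by user agent: known crawlers get a pre-rendered HTML snapshot, everyone else gets the normal client-side app. Here is a minimal sketch of that idea; the bot patterns, the `/snapshots` path, and the `dynamicRender` middleware name are illustrative assumptions, not an official Google specification.

```javascript
// Dynamic rendering sketch: serve a pre-rendered snapshot to crawlers,
// the normal client-side app to everyone else.
// BOT_PATTERNS and the "/snapshots" path are illustrative assumptions.
const BOT_PATTERNS = [/Googlebot/i, /bingbot/i];

function isCrawler(userAgent) {
  // Treat a missing User-Agent header as a regular visitor.
  return BOT_PATTERNS.some((re) => re.test(userAgent || ""));
}

// Express-style middleware (hypothetical; adapt to your framework).
function dynamicRender(req, res, next) {
  if (isCrawler(req.headers["user-agent"])) {
    // Crawler: serve the server-rendered snapshot of this URL.
    res.sendFile("/snapshots" + req.path + ".html");
  } else {
    // Regular visitor: fall through to the client-side app.
    next();
  }
}
```

The key design point is that both audiences receive the same content, just rendered at different times – which is why Google no longer treats this as cloaking.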
Things to watch out for:
- Lazy Loading – Lazy-loaded content (content that loads automatically as the user scrolls to a certain point on the page) may or may not be indexable, depending on how it’s implemented. Use noscript tags or structured data so the content is available without executing scripts.
- Timeouts – Slow, inefficient pages will be rendered and indexed inconsistently. Limit embedded resources as much as possible and avoid artificial delays like timed interstitials.
- Click to Load – Googlebot does not normally interact with your page, so content that depends on a user action to load will not be indexed. You can preload the content and control its visibility with CSS, or use hybrid or dynamic rendering (see above).
- Any code dependent on preservation of state – As mentioned earlier, Googlebot is a stateless crawler.
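For the lazy-loading and click-to-load cases above, one common pattern is to pair the JavaScript-driven element with a noscript fallback, so crawlers that don’t run your scripts still see the content. A minimal sketch, assuming a hypothetical `lazyImageMarkup` helper (not a library API):

```javascript
// Sketch of an indexable lazy-loaded image: the real src lives in
// data-src (picked up by a lazy-load script in the browser), while a
// <noscript> copy exposes the same image to crawlers without JS.
// lazyImageMarkup is a hypothetical helper, not a library API.
function lazyImageMarkup(src, alt) {
  return (
    `<img class="lazy" data-src="${src}" alt="${alt}">` +
    `<noscript><img src="${src}" alt="${alt}"></noscript>`
  );
}
```

The same two-copy idea applies to click-to-load sections: keep the full content in the initial HTML, hide it with CSS, and let the click merely toggle visibility rather than fetch the content.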
How to test
- Fetch as Google – Use the Fetch as Google component of Google Search Console to see the response code and the HTML before rendering.
- Mobile Friendly test – Google’s Mobile-Friendly Test has a useful new tool to check rendered HTML. Also, make sure API endpoints are crawlable – the Mobile Friendly Test will tell you if any resources are blocked.
- Rich Results test – Standalone or via Google Search Console – test your page to see which rich results can be generated by the structured data it contains.
As promised, here’s the link to the video: