How to Ensure your JavaScript is SEO Friendly

On modern websites, JavaScript and Single Page Applications (SPAs) are popular and can deliver a slick user interface while using resources efficiently. However, from an SEO standpoint, JavaScript – especially client-side JavaScript – can present serious challenges and may even prevent your page content from being indexed.

On May 10, 2018, at the I/O 2018 conference, Google’s John Mueller and Tom Greenaway gave a presentation on SEO friendly JavaScript. If you’re a developer, the entire video, posted below, is well worth 38 minutes of your time. However, if you’d rather have the short version, here are some of the key takeaways from the presentation.

How does Google crawl JavaScript?

Google rarely talks about the details of its crawling and indexing; this time, however, John and Tom made an exception and gave us a look behind the scenes at how Googlebot, Google’s crawler, processes JavaScript:

  • Stateless Rendering – Googlebot’s rendering doesn’t maintain state from one page load to the next – it views every page as a new user would see it. Things like service workers (think push notifications and background syncing), local and session storage (data stored in the user’s browser or managed as part of a session), Web SQL and IndexedDB (client-side database storage), cookies and the Cache API are not supported.
  • “Headless” Version of Chrome – Googlebot currently uses a headless (no user interface) version of Chrome 41 – a browser from 2015 – which processes JavaScript as ECMAScript 5 (ES5). It doesn’t support or understand newer JavaScript versions, coding conventions or APIs.
  • When a page mixes server-rendered and client-rendered content, Googlebot crawls and indexes the server-rendered content first, then returns “when it has free resources” to render and index the client-side content. That second pass can take days or longer.
  • Can Googlebot Crawl links in JavaScript? Yes, it certainly can, provided the links use an anchor tag with an href attribute (see the example below).
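
As a quick illustration (not markup from the talk itself), the difference looks like this – the URL and the loadWidgets() handler here are just placeholders:

    <!-- Crawlable: a real anchor element with an href Googlebot can follow -->
    <a href="/products/widgets">Widgets</a>

    <!-- Also crawlable: an anchor enhanced with JavaScript but keeping its href -->
    <a href="/products/widgets" onclick="loadWidgets(); return false;">Widgets</a>

    <!-- Not crawlable: no anchor tag and no href, only a click handler -->
    <span onclick="location.href='/products/widgets'">Widgets</span>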

4 Recommendations for implementing SEO-friendly JavaScript

  1. Server-side rendering – Code that’s rendered server-side is immediately available for Google to crawl and index, without requiring a second pass to render the client-side content. The downside of this is that it uses more server resources up front and will probably slow down the initial page load time. However, this is something you may want to consider for your key content, if it’s an option.
  2. Hybrid rendering – Balance client- and server-side rendering for user experience, performance, and SEO: use server-side rendering for critical page sections, such as the initial page layout and main content, and client-side rendering for non-critical page elements.
  3. Dynamic rendering – This is a new Google policy: you can render the page server-side for Googlebot and use normal client-side rendering for regular visitors. This may sound like “cloaking” – it is, but Google has revised its guidelines to allow it in this situation. Dynamic rendering is ideal for large and/or rapidly changing sites, where content could be stale by the time Google takes its second pass to render it. A minimal sketch of the idea follows this list.
  4. Whichever approach you use, Google recommends practicing “graceful degradation” so that visitors using older technology, or those who have JavaScript disabled, are still able to use your site as much as is feasible.
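
To make the dynamic rendering idea concrete, here is a minimal sketch of what it could look like in a Node/Express app. The renderToHtml() helper is hypothetical – imagine it backed by a headless browser or a prerendering service – so treat this as an illustration of the pattern, not the implementation described in the talk:

    const express = require('express');
    const app = express();

    // Hypothetical helper that returns fully rendered HTML for a URL,
    // e.g. by running the page in a headless browser or calling a
    // prerendering service. An assumption for this sketch, not a real library.
    const { renderToHtml } = require('./prerender');

    // Rough check for known crawler user agents.
    const BOT_UA = /googlebot|bingbot|baiduspider|yandex/i;

    app.get('*', async (req, res) => {
      const userAgent = req.headers['user-agent'] || '';

      if (BOT_UA.test(userAgent)) {
        // Crawlers get server-rendered HTML, so the content is indexable
        // without waiting for a second, client-side rendering pass.
        const html = await renderToHtml(req.originalUrl);
        res.send(html);
      } else {
        // Regular visitors get the normal SPA shell and render client-side.
        res.sendFile('index.html', { root: 'public' });
      }
    });

    app.listen(3000);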

Things to watch out for:

  • Lazy Loading – Lazy-loaded content (content which loads automatically as the user scrolls to a certain point on the page) may or may not be indexable, depending on how it’s implemented. If the content matters for search, provide a fallback in noscript tags or expose it via structured data (see the first example after this list).
  • Timeouts – Pages that load slowly or inefficiently will be rendered and indexed inconsistently. Limit embedded resources as much as possible and avoid artificial delays like timed interstitials.
  • Click to Load – Googlebot does not normally interact with your page, so content that depends on a user action to load will not be indexed. You can preload the content and control its visibility with CSS (see the second example after this list), or you can use hybrid or dynamic rendering (see above).
  • Any code dependent on preservation of state – As mentioned earlier, Googlebot is a stateless crawler.
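
For lazy loading, a lazy-loaded image can be paired with a noscript fallback so the crawler still sees a plain img tag. This is a minimal sketch with placeholder file names, assuming a lazy-loading script that swaps data-src into src as the user scrolls:

    <!-- Swapped in by the lazy-loading script when the image nears the viewport -->
    <img class="lazy" data-src="/images/product-photo.jpg" alt="Product photo">

    <!-- Fallback that Googlebot and no-JS visitors can index directly -->
    <noscript>
      <img src="/images/product-photo.jpg" alt="Product photo">
    </noscript>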
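
For click-to-load content, one option is to ship the content in the initial HTML and only toggle its visibility with CSS, so Googlebot can index it even though visitors see it only after clicking. Again a sketch, with made-up element ids and copy:

    <button onclick="document.getElementById('spec-sheet').style.display = 'block'">
      Show specifications
    </button>

    <!-- Already present in the initial HTML, so it can be crawled and indexed;
         CSS simply keeps it out of view until the user clicks. -->
    <div id="spec-sheet" style="display: none">
      <h2>Specifications</h2>
      <p>Full product specifications go here…</p>
    </div>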

How to test

  • Review best practices – Review code for implementation of known best practices for JavaScript / SEO.
  • Fetch as Google – Use the Fetch as Google component of Google Search Console to see the response code and the HTML before rendering.
  • Mobile-Friendly Test – Google’s Mobile-Friendly Test has a useful new tool for checking the rendered HTML. Also, make sure API endpoints are crawlable – the Mobile-Friendly Test will tell you if any resources are blocked.
  • JS console – Part of Chrome DevTools, the JavaScript console is invaluable for examining and debugging JavaScript code.
  • Rich Results test – Standalone or via Google Search Console – test your page to see which rich results can be generated by the structured data it contains.

As promised, here’s the link to the video:

https://youtu.be/PFwUbgvpdaQ
