Progressive Web Apps are probably one of the best moves you can make to improve your customers' conversion rate and user experience. Building a PWA on a JavaScript framework delivers the best possible user experience, but it also raises questions about how to create a crawler-friendly JS application.
With PWAs, you get native-like interactions, push notifications, and home-screen installation. But that's only part of the picture. Distribution also gets much easier: PWAs keep the standard web's linkability and shareability, and there is no app store standing between you and your users.
From a technical point of view, PWAs are a kind of Single Page Application – JavaScript apps that run within the browser environment, and therefore don’t need to be downloaded or installed.
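Under the hood, most of that app-like behavior (offline support, installability, push) hinges on a service worker. As a minimal sketch of the registration step – the file name sw.js is just an example, any script served from your origin works:

```js
// Minimal sketch: registering a service worker, the piece that powers a
// PWA's offline support and home-screen installability.
if ('serviceWorker' in navigator) {
  window.addEventListener('load', () => {
    navigator.serviceWorker
      .register('/sw.js')
      .then((registration) => {
        console.log('Service worker registered, scope:', registration.scope);
      })
      .catch((error) => {
        console.error('Service worker registration failed:', error);
      });
  });
}
```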
By its nature, a PWA is almost entirely JavaScript. Most of the time, developers pick a modern JS framework like React, Vue, or Angular to give the user the best possible experience. But the distribution model is the web, and on the web, organic search is vital.
Wait. Organic Search + JavaScript?
The SEO industry still isn’t certain whether Google treats (and ranks!) JavaScript-based websites and HTML-based sites equally. SEOs and developers are only beginning to understand how to make modern JavaScript frameworks crawlable, and you can find plenty of tutorials and blog posts on building crawler-friendly JavaScript applications. Google’s handling of modern web rendering techniques is improving – however, there are still a lot of limitations.
If you look at web development from an evolutionary point of view, you will see how big a role JavaScript has played in this process: we moved from static websites built with plain HTML to dynamic JavaScript applications.
“Google has many limitations in JavaScript execution. Other search engines don’t process JS at all. However, it doesn’t mean that we can’t build shiny, interactive websites with modern frameworks that work for the search engines too!” – Maria Cieślak – Senior Technical SEO, Elephate
JavaScript SEO challenges
We’ve asked experts from the Elephate SEO agency about their findings and recommendations regarding PWA SEO techniques.
After conducting a series of experiments pitting modern JS frameworks against Google, they presented some really interesting findings:
1. The fate of Chrome 41
Google parses and renders JS, but with heavy limitations – it uses a headless browser based on Chrome 41 (released in 2015). If you’re writing against modern ECMAScript specifications (as most devs currently are), your JS will probably need a lot of shims and fallbacks to expose the same code and link structure to Googlebot. The worst part is that it’s hard to say which elements will be successfully interpreted and indexed by Google, and which will not.
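One common way to bridge that gap is to transpile and polyfill your code down to what an old Chrome build understands. A minimal sketch of a babel.config.js, assuming @babel/preset-env and core-js are installed – the exact targets are your call:

```js
// babel.config.js – sketch: compile modern ECMAScript down to syntax an
// old Chrome build understands, and inject core-js polyfills only where
// the code actually uses newer built-ins.
module.exports = {
  presets: [
    [
      '@babel/preset-env',
      {
        targets: { chrome: '41' }, // match Googlebot's dated renderer
        useBuiltIns: 'usage',      // add polyfills per file, as needed
        corejs: 3,
      },
    ],
  ],
};
```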
Unfortunately for PWA developers, interfaces like IndexedDB and WebSQL are disabled in Googlebot’s Chrome 41-based renderer. These interfaces are most commonly used to provide your users with offline support.
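The safe pattern is to feature-detect before touching those interfaces and degrade gracefully. A sketch, where openIndexedDbStore is a hypothetical IndexedDB-backed implementation – the point is that the page still renders without it:

```js
// Sketch: fall back gracefully when IndexedDB is unavailable (as in
// Googlebot's renderer), so rendering never depends on it.
function createOfflineStore() {
  if (typeof window !== 'undefined' && 'indexedDB' in window) {
    return openIndexedDbStore(); // hypothetical IndexedDB-backed store
  }
  // Fallback: an in-memory cache satisfying the same get/set contract
  const memoryCache = new Map();
  return {
    get: (key) => Promise.resolve(memoryCache.get(key)),
    set: (key, value) => Promise.resolve(memoryCache.set(key, value)),
  };
}
```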
2. Googlebot is not a browser
The World Wide Web is huge, so Google optimizes its crawlers for performance. This is why Googlebot sometimes doesn’t visit all the pages webmasters want it to.
Most importantly, Google’s algorithms try to detect if a resource is necessary from a rendering point of view. If not, it probably won’t be fetched by Googlebot.
Elephate’s experiments show that the Google crawler doesn’t wait longer than 5 seconds for any resource to be downloaded and rendered. Moreover, it may “optimize” your app’s behavior by not respecting (still pretty common) setTimeout() calls and the like.
Here the Google Indexer just omitted the script and rendered the rest of the page. Source: Elephate blog.
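The practical takeaway: don’t gate critical content behind timers. A simplified sketch of the anti-pattern and a safer alternative, where loadDescription is a hypothetical data function:

```js
// Anti-pattern: critical content appears only after a delay, which
// Googlebot may skip entirely (it won't wait, and may ignore the timer).
setTimeout(() => {
  document.querySelector('#description').textContent = loadDescription();
}, 6000);

// Safer: render critical content in the initial execution path and
// defer only genuinely optional work.
document.querySelector('#description').textContent = loadDescription();
```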
3. Googlebot doesn’t click anything
You should make sure your menu – and its links – is present in the DOM before any menu item is clicked. If navigation relies on an onClick event to reveal or inject links, Googlebot won’t trigger it, which can prevent your website structure from being indexed.
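In Vue terms (the framework Vue Storefront, discussed below, is built with), that boils down to emitting real anchors. A simplified sketch, assuming vue-router is installed:

```js
// Sketch of a Vue 2 menu component. <router-link> renders a real
// <a href="..."> in the DOM that Googlebot can follow; the onClick-only
// variant leaves no crawlable trace of the target URL.
Vue.component('main-menu', {
  template: `
    <nav>
      <!-- Crawlable: a real link with an href in the DOM -->
      <router-link to="/category/shoes">Shoes</router-link>

      <!-- Invisible to the crawler: the URL exists only inside JS -->
      <span @click="goTo('/category/shoes')">Shoes (not indexed)</span>
    </nav>
  `,
  methods: {
    goTo(path) {
      this.$router.push(path);
    },
  },
});
```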
4. Feel the flow
The indexation flows for classical HTML and JS apps differ strongly; to index JS apps, Googlebot requires a lot more resources and time. Google has officially stated that “the rendering of JavaScript-powered websites in Google Search is deferred until Googlebot has resources available to process that content.”
Source: Google I/O ’18
These are just a few examples of the pitfalls you may encounter when working on your app’s SEO. You can learn many more details about how JavaScript apps are indexed by reading the “Ultimate guide to JavaScript SEO”.
SEO techniques in PWAs
OK, so let’s check how we can deal with these JavaScript SEO challenges, based on our own case study.
Vue Storefront is a standalone Progressive Web App storefront for eCommerce, able to connect with any eCommerce backend (e.g. Pimcore, PrestaShop or Shopware) through an API. It’s built with the Vue.js framework – and yes, it’s an SPA (Single Page Application), and yes, it works well with Google and other crawlers. You can find more info on our GitHub.
“Following some good JS SEO practices, Single Page Apps (including Progressive Web Apps) can be crawled, rendered and indexed, meaning that they can be SEO-friendly. For example, look at one of the most popular websites on the web… YouTube. Yes, YouTube is built with JavaScript (visit the website, switch JavaScript off in the browser, and see what happens) and it ranks very well.
Vue Storefront is built with universal JavaScript. It’s one of the most SEO-friendly solutions recommended by Google.” – Maria Cieślak – Senior Technical SEO, Elephate
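“Universal” means the same Vue code runs on the server first, so crawlers receive complete HTML on the initial request, and the client-side app then takes over in the browser. A bare-bones sketch using Vue 2’s vue-server-renderer:

```js
// Sketch: server-side rendering with Vue 2's vue-server-renderer.
// The crawler (and the user) gets fully rendered HTML on first load;
// the client-side app then "hydrates" it in the browser.
const Vue = require('vue');
const { createRenderer } = require('vue-server-renderer');

const app = new Vue({
  data: { product: 'Running shoes' },
  template: '<h1>{{ product }}</h1>',
});

createRenderer()
  .renderToString(app)
  .then((html) => {
    console.log(html); // => <h1 data-server-rendered="true">Running shoes</h1>
  });
```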
Win-win situation
Web development tools and techniques are changing much faster than search engines’ capability to handle all the new features. Google is still using a rendering engine dating back to 2015 to render even the most modern web applications.
There is a solution. You can separate the content served to the crawler (rendered server-side) from the super-modern CSR (client-side rendered) version served to your users. As long as both versions carry the same content, you can do that. Legally. No consequences 😉
Source: divante.co
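This approach is commonly called dynamic rendering. A hypothetical Express-style middleware sketch – renderToHtml stands in for whatever SSR pipeline or prerendering service you use, and the bot list is purely illustrative:

```js
// Sketch of dynamic rendering: bots get server-rendered HTML, while
// regular browsers fall through to the client-side app.
const BOT_PATTERN = /googlebot|bingbot|yandex|baiduspider|twitterbot/i;

function dynamicRendering(req, res, next) {
  const userAgent = req.headers['user-agent'] || '';
  if (BOT_PATTERN.test(userAgent)) {
    // renderToHtml is a hypothetical wrapper around your SSR pipeline
    return renderToHtml(req.url).then((html) => res.send(html));
  }
  next(); // regular users receive the SPA bundle as usual
}
```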