Until Google officially announces it can crawl content rendered through JavaScript, Prerender remains the best solution available

May 21, 2015 15:14 GMT

Rumors have been flying around since 2012 that Google can crawl JavaScript, but no official statement has ever been released, probably due to either poor results or the constant setbacks the search engine faces when dealing with a rapidly evolving language like JavaScript.

In the last decade, no other programming language has risen from the depths of the Web development world to the heights and success that JavaScript has. From a quirky scripting language generally used to animate menu options in the user's browser, JavaScript has grown into a server-side language, with runtimes and middleware powerful enough to outperform even some classic LAMP stack setups.

But running efficient servers is not what JavaScript is truly great at. JavaScript is at its best when used to create fully interactive, highly dynamic Web applications, rendered based on a plethora of well-designed client-side criteria. The language gives developers full control when writing Web, desktop, and mobile applications, as well as websites, that adapt to the client's input and render their pages in real time.

While highly useful, this capability creates immense problems in the realm of SEO, where search engine crawlers access a website, download the HTML page, and index it for later queries. Since search bots don't execute scripts or otherwise interact with the page, dynamically rendered websites often get ignored in search results.
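As a purely hypothetical illustration, a single-page application might serve an almost empty HTML shell and only fill it in once the browser runs a script like the one below; a crawler that downloads the page without executing JavaScript would index little more than a blank container.

    // Hypothetical client-side rendering: the crawler fetches
    // <div id="app"></div> but never sees the injected markup,
    // because it does not run this script.
    document.addEventListener('DOMContentLoaded', function () {
      var app = document.getElementById('app');
      app.innerHTML = '<h1>Latest articles</h1>' +
        '<p>This content only exists after JavaScript runs.</p>';
    });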

There's still hope for JavaScript-developed applications

The Prerender JavaScript library for Node.js is a solution for dealing with this kind of dynamic website setup, sitting at a middle point between the search engine crawlers and the Web server.

It works by pre-compiling a Web page into static HTML based on the URL route the search crawler is trying to access and then sending the search engine that simpler version of the page, which it can also cache if you so desire. If a user tries to view the website, regardless of URL and existing cache, Prerender will let them through to enjoy the site's content in its full glory.
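For the curious, a Prerender service can be spun up locally in just a few lines; the snippet below is a minimal sketch assuming the "prerender" npm package, whose exact options and default port (3000 here) may vary between versions.

    // Minimal sketch of a local Prerender service (assumes the
    // "prerender" npm package; defaults may differ by version).
    var prerender = require('prerender');
    var server = prerender();
    server.start();

    // The service then answers requests of the form
    // http://localhost:3000/http://your-site.example/some/route
    // with the fully rendered static HTML for that route.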

The technology behind Prerender is made up of two main parts: a Node.js service that gets installed on the server itself and pre-renders Web pages into HTML via PhantomJS, and a middleware library that stands between the server and the search engine crawler.

Official middleware libraries are available for servers running on Node.js (for Express), Ruby on Rails, Apache, and Nginx, while unofficial libraries are available for Grails, Java, PHP, Python, Scala, and ASP.NET.
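As an example of how the middleware side fits in, the sketch below assumes an Express application using the "prerender-node" package pointed at a Prerender service on port 3000; the host names, ports, and paths are illustrative only.

    var express = require('express');
    var prerenderNode = require('prerender-node');

    var app = express();

    // Requests identified as coming from search crawlers are proxied
    // to the Prerender service and receive static HTML; regular
    // visitors fall through to the application itself.
    app.use(prerenderNode.set('prerenderServiceUrl', 'http://localhost:3000/'));

    // Serve the normal client-side application to everyone else.
    app.use(express.static(__dirname + '/public'));

    app.listen(8080);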