AngularJS is a great framework for building websites and apps because it delivers a much faster and richer experience to users.
But there’s one problem almost every developer faces when shipping an AngularJS product: Search Engine Optimization, or SEO.
Quality SEO means getting found amidst all the noise online. When a site or app is optimized for search, it is more likely to be found by prospective users. If it is not, the developer might as well be screaming into the wind – no exposure and no users are almost guaranteed.
Right now the most common techniques for mitigating this problem are either to build a separate version of the app or website for search engines, or to pre-render pages on the server. The problem with these solutions, however, is that you then have to maintain two separate systems: one for your users and another for Google and the other search engines.
But how exactly are these snapshots created? And how can a developer be sure that their AngularJS app or website is correctly and completely indexed by Google?
In this post we present a free, self-hosted solution for generating snapshots and making sure that your AngularJS website or application is crawlable by, indexed by, and optimized for Google.
AngularJS and SEO: The Problem
Search engine crawlers were originally designed to index the static HTML content of web pages. AngularJS, however, renders most of its content in the browser with JavaScript: the initial HTML returned by the server is little more than an empty shell, so a crawler that does not execute JavaScript sees almost none of your actual content.
AngularJS and SEO: The Solution
Overcoming the indexing problem is not difficult when developers embrace what are called ‘snapshots’.
‘Snapshots’ is the term for the fully rendered content generated for search engine crawlers on the website’s backend. The idea behind snapshots is that the developer does the work that the crawler cannot, or doesn’t want to, do on its own. Optimizing and caching snapshots not only helps you get indexed, but also significantly improves the speed of indexation.
So how do you generate snapshots, and how do you work with them to make sure you are indexed?
Read on for the step-by-step guide.
Step 1: Generate Snapshots
The first step is to generate the snapshots themselves.
To do this we need access to a snapshot server based on a headless browser such as PhantomJS or ZombieJS. In this example we will use the open source middleware Prerender that already packages PhantomJS and is ready to handle our special crawler requests and serve HTML snapshots.
In order to reduce the time it takes to generate snapshots, a cache can be employed. Snapshots are cached on a Redis server the first time they are requested, and then re-cached once a day (note: this interval can be configured to suit your needs) to make sure the content stays up to date. As a result, a static snapshot is always instantly available to be served to the crawler.
Step 2: Server Installation
In this example we will use an Apache server run on Ubuntu 14.04.2 LTS.
There are five sub-steps to work through here.
1 – Install NPM and NodeJS
sudo apt-get update
sudo apt-get install nodejs npm
2 – Install Forever
npm install forever -g
3 – Install and Start Prerender.io
git clone https://github.com/prerender/prerender.git
Make sure the server starts on 4001 and that PhantomJS is on 4002.
You can edit prerender/server.js if you want to change the port.
Return to the Prerender folder and start the server using forever – this will keep the server running continuously in the background.
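Put together, this step might look like the following sketch (the PORT value matches this guide’s setup, and the assumption that server.js reads the PORT environment variable should be verified against your Prerender version):

```shell
# Fetch dependencies and launch Prerender in the background
# (assumes Node, npm and forever are installed as in the previous steps)
cd prerender
npm install
export PORT=4001   # assumption: server.js reads the PORT environment variable
forever start server.js
```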
4 – Install Redis server
Add the Dotdeb repositories to your APT sources. To do this, create a new list file in /etc/apt/sources.list.d/ and fill it with the following content:
deb http://packages.dotdeb.org squeeze all
Then you need to authenticate these repositories using their public key:
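This is typically done like so (key URL as published by the Dotdeb project; verify it before trusting it):

```shell
# Fetch the Dotdeb signing key and add it to APT's trusted keys
wget -q https://www.dotdeb.org/dotdeb.gpg -O- | sudo apt-key add -
```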
Next, install Redis using apt-get:
sudo apt-get update
sudo apt-get install redis-server
Then start the Redis service:
sudo service redis_6379 start
You should then check that Redis is responding:
redis-cli ping
You will get “PONG” if everything is ok.
5 – Make Prerender use the Redis server to cache snapshots
Prerender has an open source module, Prerender-Redis-Cache, that makes it easy to perform this task.
In your local Prerender project (the folder containing server.js) run:
npm install prerender-redis-cache --save
Then add these two lines in prerender/server.js :
server.use(require('prerender-redis-cache'));
process.env.PAGE_TTL = 3600 * 24 * 5; // cache lifetime in seconds; change to 0 if you never want snapshots to expire
Restart Prerender:
forever restart server.js
And if you want to clear the whole Redis cache you can use:
redis-cli FLUSHALL
Step 3: Server Configuration
Now we will redirect crawlers to the local Prerender server using a simple .htaccess file.
This .htaccess file contains all the redirect configuration. Note that the .htaccess file needs to be in the same directory as your main AngularJS index.html file.
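As a sketch, assuming mod_rewrite and mod_proxy are enabled and Prerender listens on port 4001 as configured earlier, the rules look like this:

```apache
<IfModule mod_rewrite.c>
  RewriteEngine On
  # Requests carrying _escaped_fragment_ come from crawlers:
  # proxy them to the local Prerender server for an HTML snapshot
  RewriteCond %{QUERY_STRING} _escaped_fragment_
  RewriteRule ^(.*)$ http://localhost:4001/http://%{HTTP_HOST}/$1 [P,L]
</IfModule>
```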
You have now finished all server side installation tasks, so it’s now time to configure the AngularJS App.
Step 4: App Configuration
First, open your AngularJS index.html file and:
- make sure you have <base href="/"> before </head>
- add <meta name="fragment" content="!"> between <head></head> (by adding this tag to the page www.example.com, the crawler will temporarily map this URL to www.example.com?_escaped_fragment_= and will request that version from your server)
Second, activate HTML5 mode.
In your config.js file add:

var app = angular.module('app');
app.config(['$locationProvider', function ($locationProvider) {
  $locationProvider.html5Mode(true);
}]);

This will tell your application to use the HTML5 URL format.
Third, you need to manage the meta tags.
To improve the SEO of your app or website you need a unique title and description for each page. An AngularJS module called AngularJS-View-Head already exists to solve this problem. It will help us change the HTML title and head elements on a per-view basis.
How do you work this in practice?
Start by installing the module using Bower:

bower install angularjs-viewhead

Next, declare the module as a dependency of your application:

var app = angular.module('app', ['viewhead']);

This makes the module's directives available in your HTML templates.
Finally add the meta tags inside your template.
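For example, a per-view title can be set right in the view's template with the module's view-title directive (the title text here is just a placeholder):

```html
<view-title>About Us | Example App</view-title>
```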
Step 5: Test Prerender Server
If you’ve followed all of the steps, everything should be working. But better safe than sorry: it’s time to test.
Compare the source of one of your pages with and without _escaped_fragment_ in the URL.
You can check specific routes in your browser and compare the two versions.
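From the command line the same comparison can be made with curl (the domain and route below are placeholders for your own):

```shell
# What a normal visitor receives: the bare AngularJS shell
curl http://www.example.com/about
# What a crawler receives: the prerendered snapshot
curl "http://www.example.com/about?_escaped_fragment_="
```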
Step 6: Add a Sitemap
The final step in your AngularJS SEO strategy is to develop a sitemap.
To help search engines crawl your app or website and make sure pages are indexed quickly, you should create a sitemap covering all of your routes. This sounds difficult, but with a proper build process it can be automated using a tool like Grunt.
(Heads up: we’ll be publishing another post soon explaining just how to automate a sitemap using Grunt!)
Making sure search engines – and, through search, users – can find your app or website is essential. The strategy presented in this post is quick to implement, scalable, and easy to maintain, and it should help you make the connections you need and win the users you want.