3 Essential Tools to Boost your React App’s SEO

Creating an isomorphic React app is not always necessary, especially for more application-style apps (versus blogs or e-commerce sites).

If our app is something like a game, a functional program, or a social networking type site, we might not need to go to the trouble of setting up Server Side Rendering.

If this is the case, our site’s purpose is not to glean as many Search Engine clicks as possible.

However, we do still want it to be as discoverable as possible, so let’s focus on improving our app’s SEO without building an isomorphic app (no SSR).

So how do we improve our app’s Search Engine ranking already!? This is the moment you’ve been waiting for.

The reason you’re here: to learn about three solutions to React’s inherent SEO problems:

1. React Router
2. React Helmet
3. Fetch as Google

SOLUTION: Tool #1, React Router

Used by most React developers, React Router is a library for handling the routing of a React app.

Along with it come two important ways of handling routing: HashRouter and BrowserRouter.

Without going into too many specifics: HashRouter is more backward compatible, but historically, it has not given Google a new URI to index as a new page (or view) for each route.

Google seems to be making some improvements here, indexing some hash URLs, but it has also said not to use hashes in URLs for pages you want Google to index.
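To see why hash routes historically looked like a single page to crawlers, note that the part after the `#` isn’t part of the URL path at all. A quick sketch using the standard `URL` API (the domain here is made up for illustration):

```javascript
// Two "pages" of a hash-routed app share one path: the part after
// '#' is a fragment, which browsers don't even send to the server.
const dogs = new URL('https://example.com/#/dogs');
const cats = new URL('https://example.com/#/cats');

console.log(dogs.pathname); // '/' — same for both routes
console.log(dogs.hash);     // '#/dogs'
console.log(cats.hash);     // '#/cats'

// A BrowserRouter-style URL puts the route in the real path instead:
const browserStyle = new URL('https://example.com/dogs');
console.log(browserStyle.pathname); // '/dogs'
```

From a crawler’s point of view, the two hash routes are the same resource, while the BrowserRouter-style URLs are distinct pages.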

So try not to! In my demo app, the component I render using HashRouter is the following:

```jsx
import React, { Component } from 'react';
import { HashRouter } from 'react-router-dom';
import HashLinks from './hash-links';

class UsingHashRouter extends Component {
  render() {
    return (
      <div>
        <h1>Hash Router</h1>
        <HashRouter>
          <HashLinks />
        </HashRouter>
      </div>
    );
  }
}

export default UsingHashRouter;
```

Notice, all we did was import { HashRouter } from React Router.

Then, inside the HashLinks component, we use Switch and Route (also from React Router) to render the components based on the URL:

```jsx
import React, { Component } from 'react';
import { Switch, Route, Link } from 'react-router-dom';
import Dogs from './dogs';
import Cats from './cats';

class HashLinks extends Component {
  render() {
    return (
      <div>
        <Link to="/dogs/">View Dogs</Link> | <Link to="/cats/">View Cats</Link>
        <Switch>
          <Route path='/dogs' component={Dogs} />
          <Route path='/cats' component={Cats} />
        </Switch>
      </div>
    );
  }
}

export default HashLinks;
```

BrowserRouter, while less backward compatible (it doesn’t support IE 9 and lower), creates a new URI for each Route, without the hash tag.

Furthermore, it still allows Google to index “rich snippets” using HTML element IDs.

It’s very similar to HashRouter: anywhere that HashRouter is used, replace it with BrowserRouter:

```jsx
import React, { Component } from 'react';
import { BrowserRouter } from 'react-router-dom';
import BrowserRouterLinks from './browser-router-links';

class UsingBrowserRouter extends Component {
  render() {
    return (
      <div>
        <h1>Browser Router</h1>
        <BrowserRouter>
          <BrowserRouterLinks />
        </BrowserRouter>
      </div>
    );
  }
}

export default UsingBrowserRouter;
```

And inside the BrowserRouterLinks component:

```jsx
import React, { Component } from 'react';
import { Switch, Route, Link } from 'react-router-dom';
import Dogs from './dogs';
import Cats from './cats';

class BrowserRouterLinks extends Component {
  render() {
    return (
      <div>
        <Link to="/routing/browser-router/dogs/">View Dogs</Link> | <Link to="/routing/browser-router/cats/">View Cats</Link>
        <Switch>
          <Route path='/routing/browser-router/dogs/' component={Dogs} />
          <Route path='/routing/browser-router/cats/' component={Cats} />
        </Switch>
      </div>
    );
  }
}

export default BrowserRouterLinks;
```

SOLUTION: Tool #2, React Helmet

Most React developers use React Router, but fewer know about or use React Helmet.

This is a library that allows us to set the HTML metadata in the document head from any given component.

The head tags we are most interested in here look like this:

```html
<head>
  <title>This is Where You Set The Page Title</title>
  <meta name="description" content="This is an example of a meta description. This will often show up in search results, though many search engines generate their own.">
</head>
```

The above code will produce the following SERP (Search Engine Results Page) listing:

[Image: An example SERP listing produced by the above code.]

By default, React has no way to set metadata or title tags in the head for each component.

Thus, without changing anything, my demo app’s two SERP listings come out virtually IDENTICAL (except for “cats” in the URL and slightly different content).

Since Google does its best (pretty good!) to generate these from the content of your pages, the content may change.

But the “title” (in blue in the SERP listing) will not change, and the “description” tag is still important for SEO purposes.

Thus, to change the content inside the <head> tags, we will use React Helmet.

With this npm package, we can set our own <title> and <meta> tags and control how our SERP listing looks.

How to Install React Helmet

So let’s get this running. First things first, install React Helmet via npm:

```shell
npm install react-helmet
```

Then, it’s as simple as adding our desired tags inside the <Helmet> component in our exported JSX.

Here’s an example:

```jsx
import React from 'react';
import { Helmet } from 'react-helmet';

const App = () => (
  <div>
    <Helmet>
      <title>Here's the Title!</title>
      <meta name="description" content="This is what you want to show as the page content in the Google SERP Listing" />
    </Helmet>
    <h1>My Amazing React SEO Page</h1>
    <p>Hello World!</p>
    <ChildComponent/>
  </div>
)
```

Don’t forget to import { Helmet } at the top of the file! Now, behind the scenes, Helmet sets our title and description tags.

We just need to put it inside the JSX we export, and Helmet puts the tags inside the head for us.

SOLUTION: Tool #3, Fetch as Google

Many times with React, there will be issues with the Googlebot fetching our pages.

If our JavaScript takes too long to load (e.g. asynchronous calls with large data sets), or if we don’t have our polyfills set up correctly, the Googlebot will not be able to render the content, because it can’t even run the JS.


Why is this? The Googlebot uses a different version of the V8 engine from Google’s very own Chrome browser (whaaaat?), meaning that just because your JavaScript runs in the browser doesn’t mean the Googlebot will be able to run it.
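To make the polyfill point concrete: an older JS engine simply lacks newer built-ins, and code that assumes them dies before rendering anything. A minimal runtime feature-detection sketch (the feature list here is illustrative, not exhaustive):

```javascript
// Check for a few built-ins that an older engine (like an old
// googlebot renderer) may lack; each entry is just an example.
function missingFeatures() {
  const checks = {
    'Array.prototype.includes': typeof Array.prototype.includes === 'function',
    'Object.entries': typeof Object.entries === 'function',
    'Promise.prototype.finally': typeof Promise.prototype.finally === 'function',
    'fetch': typeof globalThis.fetch === 'function',
  };
  // Return the names of every feature the current engine is missing.
  return Object.keys(checks).filter((name) => !checks[name]);
}

// In a modern engine this list should be empty or nearly so; in an
// old one, each missing name is a candidate for a polyfill.
console.log(missingFeatures());
```

Each missing name is something you’d want a polyfill for (e.g. via core-js) before assuming the crawler can execute your bundle.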

Disappointed? Me too, but never fear: there is a solution!

Fetch As Google

Enter the secret weapon of SEO experts (chuckles devilishly). OK, SEO experts need a lot more than this to be classified as true masterminds, but that is beside the point… This handy little tool “[…] enables you to test how Google crawls or renders a URL on your site.”

To get set up with Fetch as Google, we click “ADD A PROPERTY” and enter our URL.

This will take us to a new page that gives us a file to add to our site.

This needs to be at the root, so the /public/ directory is a great spot for it (depending on how your app’s hierarchy is set up).

Once we’ve uploaded the file and Google knows we own the site, we have access to a plethora of tools, one of which is (you guessed it) Fetch as Google. Here, we simply add the URI that we want to test and click “FETCH AND RENDER”.

After clicking the checkbox, we are taken to a page that shows the difference between how the user sees the page and how Google sees it.

Cool, huh? We can use this for any URL in our app, so once you’ve “connected” your site, you’re free to start testing! Here, there are basically three scenarios:

1. Neither the left box nor the right box has any content (uh-oh)
2. The left box has less/different content than the right box (close, but no cigar)
3. They both have the same content (great! …kind of)

If we get either of the first two, we should do our best to ensure that Google sees what we see when we open that URI in a browser ourselves.

However, even if we are seeing the same thing in both boxes, it might not be what we see when we open our own browser window.

On my demo app, I’ve created an example of how data loaded via asynchronous calls may not be visible to the Googlebot.

Actually, all I did was render nothing before the component mounts; once it does, it loads one message immediately and sets ten timeouts at different intervals.

This is to show what happens if our data takes longer to load (anywhere from 500ms to 10 sec).

Here’s the code if you’re curious:

```jsx
class IncrementalLoading extends Component {
  constructor() {
    super();
    this.state = {
      message1: '', message2: '', message3: '', message4: '', message5: '',
      message6: '', message7: '', message8: '', message9: '', message10: '',
      message11: '',
    };
  }

  componentDidMount() {
    this.setState({ message1: 'Message 1 (immediately after component mounts): Googlebot will always crawl' });
    setTimeout(() => { this.setState({ message2: 'Message 2 (500ms): Googlebot will almost certainly crawl' }); }, 500);
    setTimeout(() => { this.setState({ message3: 'Message 3 (2 sec): Googlebot will probably crawl' }); }, 2000);
    setTimeout(() => { this.setState({ message4: 'Message 4 (3 sec): Googlebot less likely to crawl' }); }, 3000);
    setTimeout(() => { this.setState({ message5: 'Message 5 (4 sec): Googlebot may or may not crawl' }); }, 4000);
    setTimeout(() => { this.setState({ message6: 'Message 6 (5 sec): Googlebot MIGHT crawl' }); }, 5000);
    setTimeout(() => { this.setState({ message7: 'Message 7 (6 sec): Googlebot probably will NOT crawl' }); }, 6000);
    setTimeout(() => { this.setState({ message8: 'Message 8 (7 sec): Googlebot almost certainly will NOT crawl' }); }, 7000);
    setTimeout(() => { this.setState({ message9: 'Message 9 (8 sec): Googlebot definitely will NOT crawl' }); }, 8000);
    setTimeout(() => { this.setState({ message10: 'Message 10 (9 sec): Googlebot is on to the next page by now.' }); }, 9000);
    setTimeout(() => { this.setState({ message11: "Message 11 (10 sec): If you're seeing this, you're definitely not Google." }); }, 10000);
  }

  render() {
    return (
      <div>
        <h1>Data Loaded</h1>
        <h2>Simulates data that takes differing amounts of time to fetch.</h2>
        <h4>{ this.state.message1 }</h4>
        <h4>{ this.state.message2 }</h4>
        <h4>{ this.state.message3 }</h4>
        <h4>{ this.state.message4 }</h4>
        <h4>{ this.state.message5 }</h4>
        <h4>{ this.state.message6 }</h4>
        <h4>{ this.state.message7 }</h4>
        <h4>{ this.state.message8 }</h4>
        <h4>{ this.state.message9 }</h4>
        <h4>{ this.state.message10 }</h4>
        <h4>{ this.state.message11 }</h4>
      </div>
    );
  }
}
```

Below, you can see a rendering of my own page: as you can see, there are 11 messages on my page, but only 6 of them were crawled by Google.

If Google can’t see them, that means they will not be indexed, so they will not show up in our search results.

(Shoot, man.) There are other (slightly more complicated) ways around this, one of which is Server Side Rendering, but I will not get into that right now.

For now, the important thing to remember is that, when doing anything asynchronous, we must be careful about what is loaded when, since how long it takes has a big effect on how the Googlebot sees (or doesn’t see) our content.
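The timing effect can be modeled in plain JavaScript: given a render budget, only content scheduled before the cutoff gets crawled. (The ~5-second budget below is my own illustrative assumption based on the demo’s results; Google publishes no exact number.)

```javascript
// Given each message's setTimeout delay (ms), return the ones a
// crawler would capture if it snapshots the page after `budgetMs`.
function crawledMessages(delays, budgetMs) {
  return delays.filter((delayMs) => delayMs <= budgetMs);
}

// Delays from the IncrementalLoading component (message 1 is at 0ms).
const delays = [0, 500, 2000, 3000, 4000, 5000, 6000, 7000, 8000, 9000, 10000];

// With an assumed ~5s budget, only the first six messages survive,
// matching the six messages Fetch as Google rendered in the demo.
console.log(crawledMessages(delays, 5000).length); // 6
```

A real crawler’s budget varies, but the lesson holds: the later your content arrives, the less likely it is to be indexed.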

Without SSR, we can still make this better.

Wherever possible, we need to load all important-to-SEO content first, before any asynchronous calls are made.

For example, on my SEO Demo App’s cute animals page, I render the titles of the pictures before I load the pictures themselves, so when we look at how Google fetches this page, we see that the titles are ALL loaded, even though the data for messages 7 and onward is NOT loaded.

This way, Google will index the titles, hopefully gleaning more link-juice from the keywords in the titles.
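One way to structure this “SEO-critical content first” idea is to separate data known at render time from data that arrives asynchronously. A sketch, not my demo app’s actual code — the names and paths here are made up for illustration:

```javascript
// Titles are bundled with the app, so they can render immediately.
const animals = [
  { title: 'Sleepy Kitten' },
  { title: 'Happy Puppy' },
];

// Render the crawlable titles first, without waiting on anything...
function renderTitles(list) {
  return list.map((animal) => `<h3>${animal.title}</h3>`).join('\n');
}

// ...and fill in images whenever the async fetch resolves.
// (Stand-in for a real network call; the /img/ path is hypothetical.)
function fetchImage(animal) {
  return Promise.resolve({ ...animal, img: `/img/${animal.title}.jpg` });
}

console.log(renderTitles(animals)); // titles exist before any fetch

Promise.all(animals.map(fetchImage)).then((withImages) => {
  // Images arrive later; the crawlable titles were never blocked on them.
  console.log(withImages.length);
});
```

The design point: nothing crawl-worthy waits on the network, so even a crawler with a short render budget sees the keywords.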

That way, people can find our cute animals! (awwwww…)

What Next?

If your app does depend heavily on SEO, and you’re ready to take it to the next level, try out SSR (Server Side Rendering) to create an isomorphic app.

There is far more to discuss in the way of SEO, but for right now, we have stuck to the three tools we’ve used:

1. React Router: Gives us a unique URL for each of the views in our app.
2. React Helmet: Allows us to set title, description, and other head tags.
3. Fetch as Google: Helps troubleshoot Google’s ability to view our content.

Just using these three essential tools, you will be able to boost your app’s ranking in Google and other search engines.

I find Search Engine Optimization challenging and fun.

It’s akin to “buying” real estate: The more you work on it, the more traffic to your site you “own”.

It’s challenging at times, but chip away at it, and you’ll eventually get the results you’re looking for!
