SEO Analysis Explanation

This page explains each SEO item in the SEO analysis we carried out.


Total pages on the site

We use both a crawling tool (Screaming Frog) and the Google index to find out how many pages are on the site.

It is common for the number of pages indexed by Google to exceed the number of pages we found using the crawling tool.

This is because we deploy the crawling tool from the home page only, whereas Google enters the site from multiple locations, not only the home page. In addition, Google can also find pages on the site by following backlinks from other sites.

What we like to see is a small difference between the two numbers. If the difference is large, it indicates that the linking structure of the site might be broken somewhere, because there are pages that the crawl could not reach.
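If you want to do a quick check yourself, the comparison can be as simple as diffing two URL lists. The sketch below assumes you have exported the crawled URLs (for example from Screaming Frog) and the indexed URLs (for example collected from site: searches or Search Console) into two plain-text files; the file names are placeholders.

```python
# Minimal sketch: compare crawled URLs to indexed URLs.
# "crawled_urls.txt" and "indexed_urls.txt" are hypothetical export files,
# one URL per line.

def load_urls(path):
    """Read a file of URLs into a lightly normalized set."""
    with open(path) as f:
        return {line.strip().rstrip("/").lower() for line in f if line.strip()}

crawled = load_urls("crawled_urls.txt")
indexed = load_urls("indexed_urls.txt")

# Pages Google knows about that the crawl never reached may point to a
# broken linking structure.
unreachable = indexed - crawled
print(f"Crawled: {len(crawled)}  Indexed: {len(indexed)}")
print(f"Indexed but not reached by the crawl: {len(unreachable)}")
for url in sorted(unreachable)[:20]:
    print(" ", url)
```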

General site architecture

In this section we take a high-level look at how the site is structured.

Is the home page 302 redirected?

Some sites redirect the root domain URL to the actual home page using a 302 (or sometimes a 303) redirect instead of a 301 redirect.

This prevents the backlink juice accumulated by the root domain URL from being passed to the home page.

For example: www.phillips66.com

The domain is 302 redirected to http://www.phillips66.com/EN/Pages/index.aspx, which means the backlink juice gained by the root domain URL is not passed to the home page.

As you can see from the image below (Majestic data), 83,297 links are pointing to the domain!

phillips66 back link
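If you want to verify the redirect type yourself, a quick check with Python's requests library is enough: it follows the redirect chain from the root domain URL and prints each hop's status code. This is a rough sketch, not part of the original analysis, and the redirect behaviour of the example domain may have changed since this was written.

```python
# Sketch: inspect the redirect chain of the root domain URL.
import requests

resp = requests.get("http://www.phillips66.com/", allow_redirects=True, timeout=10)

# resp.history holds the intermediate redirect responses in order.
for hop in resp.history:
    print(hop.status_code, hop.url, "->", hop.headers.get("Location"))
print("Final:", resp.status_code, resp.url)
```

A 301 in the chain passes the accumulated backlink juice to the destination; a 302 or 303, as in the example above, does not.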

Number of pages with excess links (more than 100)

Traditionally, Google would only process about 100 links per page, which is why 100 became the benchmark for how many links a page should have.

Matt Cutts has clarified over the years that this has changed and Google can now process more than 100 links per page (see videos below). However, from a linking perspective, more links on a page dilute the link juice, which means that even if you have optimized your site for internal linking, having far too many links on a page is still a negative move.
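To get a rough count yourself, you can fetch each page and count its anchor tags; anything well above the old 100-link benchmark is worth a closer look. The sketch below uses requests and BeautifulSoup, and the URL list and threshold are just placeholders.

```python
# Sketch: flag pages that carry more than 100 links.
import requests
from bs4 import BeautifulSoup

LINK_LIMIT = 100
pages = [                      # hypothetical list, e.g. from a crawl export
    "https://www.example.com/",
    "https://www.example.com/products/",
]

for url in pages:
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    links = [a["href"] for a in soup.find_all("a", href=True)]
    if len(links) > LINK_LIMIT:
        print(f"{url}: {len(links)} links (over the {LINK_LIMIT}-link benchmark)")
```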


Do all pages have a unique URL?

The URL is one of the ranking factors, so having a unique, keyword-containing URL will definitely help your site rank higher.

Example of a non-unique URL: www.examplesite.com/page/1294

Do site URLs appear static?

Even if a site has unique URLs for each page, depending on how the site is built, additional dynamic parameters may be added to the URL. In some cases this could cause duplicate content.

If non-static, how many appear in the Google index?

Google has gotten better and better at handling duplicate content and has stopped ranking it. This does not mean Google is ignoring duplicate content, though. By using search operators, we can find out whether there is duplicate content caused by dynamic URLs.

The example below shows that, using a search operator, I was able to find these two dynamic URLs on the Honeywell site, and they are essentially duplicate content.

http://www.asia.security.honeywell.com/Pages/Results.aspx?k=NVR&amp&r=companies%3D%22AQlIb25leXdlbGwJY29tcGFuaWVzAQFeASQ%3D%22+AND+companies%3D%22AQNVUFMJY29tcGFuaWVzAQFeASQ%3D%22&start1=1&IsPost=TRUE

http://www.asia.security.honeywell.com/Pages/Results.aspx?k=NVR&=amp&IsPost=TRUE&dupid=331379857679804393&r=companies%3D%22AQNBQUMJY29tcGFuaWVzAQFeASQ%3D%22

dynamic URL
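One way to see how many distinct pages actually sit behind a pile of dynamic URLs is to strip the parameters that do not change the content. The sketch below is illustrative only: the example URLs and the assumption that only the k parameter is meaningful are placeholders, but the same idea applies to the Honeywell URLs above.

```python
# Sketch: collapse dynamic URLs onto a canonical form by keeping only the
# parameters that actually change the content (assumed to be "k" here).
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

MEANINGFUL = {"k"}  # assumption: only the search keyword changes the page

def canonicalize(url):
    parts = urlsplit(url)
    kept = [(n, v) for n, v in parse_qsl(parts.query) if n in MEANINGFUL]
    return urlunsplit((parts.scheme, parts.netloc, parts.path, urlencode(kept), ""))

urls = [
    "http://www.example.com/Pages/Results.aspx?k=NVR&start1=1&IsPost=TRUE",
    "http://www.example.com/Pages/Results.aspx?k=NVR&dupid=12345&IsPost=TRUE",
]

canonical = {canonicalize(u) for u in urls}
print(f"{len(urls)} dynamic URLs collapse to {len(canonical)} canonical page(s):")
for u in sorted(canonical):
    print(" ", u)
```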

Any sub-domains?

Sub-domains can be used for many kinds of purposes. For example, you might have hired a developer who set up a development site on a sub-domain (something like “developer.example.com”) without applying a noindex tag or other SEO measures to prevent the site from being indexed by Google. Now not only has Google indexed duplicate content (if the development site happens to be a copy of your live site), but your private development site is also open to the public.

This is only one of the many reasons why it is good to know what sub-domains exist.
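If you do find a development sub-domain, a quick check for a noindex directive (either in the X-Robots-Tag response header or in a robots meta tag) tells you whether Google is allowed to index it. The sketch below uses requests and BeautifulSoup; “developer.example.com” is the placeholder name from the paragraph above.

```python
# Sketch: check a sub-domain for a noindex directive.
import requests
from bs4 import BeautifulSoup

url = "https://developer.example.com/"
resp = requests.get(url, timeout=10)

header = resp.headers.get("X-Robots-Tag", "")
meta = BeautifulSoup(resp.text, "html.parser").find("meta", attrs={"name": "robots"})
meta_content = meta.get("content", "") if meta else ""

if "noindex" in header.lower() or "noindex" in meta_content.lower():
    print(f"{url} is marked noindex")
else:
    print(f"{url} has no noindex directive and could end up in Google's index")
```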

Navigation

Navigation links are some of the most important links a site can have. They can make or break your SEO efforts.

Are site links coded in HTML, and not JavaScript?

Although Google can now execute more JavaScript than before, it is still ideal to have hard-coded HTML links rather than JavaScript (or Flash or iframe) based links.
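A quick way to gauge this is to fetch the raw HTML (with no JavaScript executed, which roughly mirrors what a non-rendering crawler sees) and count anchors with a real href. If a page with a visible menu yields very few such links, the navigation is probably being built by JavaScript. This is only a sketch, and the URL is a placeholder.

```python
# Sketch: count hard-coded HTML links in the raw (un-rendered) source.
import requests
from bs4 import BeautifulSoup

url = "https://www.example.com/"
soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

hard_links = [
    a["href"] for a in soup.find_all("a", href=True)
    if not a["href"].startswith(("javascript:", "#"))
]
print(f"{len(hard_links)} hard-coded links found in the raw source of {url}")
```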

Is main navigation text-based?

Sometimes a site uses image-based navigation links. This is better than JavaScript-based navigation, but text-based is best.

Breadcrumb navigation?

Not only does this help users know where they are, search engines also utilize this information.

breadcrumb navigation

Duplicate Content

We investigate three common types of duplicate content.

1) http:// vs https://

The example below shows how http://floridaparrotrescue.com/ has two identical versions of the same page.

http:// version

https:// version

2) www. vs non-www.

Ryder used truck site with www. version

Ryder used truck site without www. version

3) UPPER CASE vs. lower case

A mixture of upper case and lower case URLs will not only confuse users, it will also dilute backlinks and could cause a large number of 404 issues.

upper case and lower case URL
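To check all three patterns in one go, you can request the common variants of a page and see where each one ends up. Ideally every variant 301-redirects to a single canonical URL; if several variants return 200 at different final URLs, the site is serving duplicate content. The domain and path below are placeholders.

```python
# Sketch: test http/https, www/non-www and upper/lower case variants.
import requests

variants = [
    "http://example.com/page",
    "https://example.com/page",
    "http://www.example.com/page",
    "https://www.example.com/page",
    "https://www.example.com/PAGE",
]

for url in variants:
    resp = requests.get(url, allow_redirects=True, timeout=10)
    chain = " -> ".join(str(hop.status_code) for hop in resp.history) or "no redirect"
    print(f"{url}: {chain}, final {resp.status_code} at {resp.url}")
```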