A website audit is normally performed after a site’s design and on-page elements are complete. Before starting off-page work, a website audit is a must, so that a site’s errors do not get indexed in Google.
This audit helps you identify all of your site’s errors: what you are doing right, what needs improvement, and any issues that may be hurting your website. Whether you are auditing your own site, handling a client’s site, or simply checking for website-related issues, a website audit helps you find and resolve them.
Crawling Errors
The first step of a website audit is to crawl your website to diagnose all of its issues. For this you can use tools such as Screaming Frog’s SEO Spider and Xenu’s Link Sleuth. Let us learn more about these crawling tools.
Screaming Frog’s SEO Spider: A powerful tool that crawls your site and quickly analyses all the data from an SEO perspective. It gives you information about internal links, 404 errors, title tags and descriptions, meta keywords, site response time, etc. The tool is free for the first 500 URLs; to remove this limit you can buy a licence for £99 per annum (excluding VAT). It gives you an overall report as a CSV file, or you can export reports individually.
It is available for all major platforms!
Xenu’s Link Sleuth: A completely free tool that analyses your website for broken links, images, frames, plug-ins, backgrounds, local image maps, style sheets, scripts and Java applets. It gives you a full list of URLs with titles and descriptions and generates reports.
Features:
Simple, no-frills user-interface
Can re-check broken links (useful for temporary network errors)
Simple report format, can also be emailed
Executable file smaller than 1MB
Supports SSL websites (“https://”)
Partial testing of ftp, gopher and mail URLs
Detects and reports redirected URLs
Site Map
Drawback: It has no Mac version
Google and Bing Webmaster Tools: Another great place to find all your site’s errors is Google Webmaster Tools (Bing has an equivalent). If you haven’t registered with these tools, do it now!
The actual analysis starts here.
Accessibility
Check whether your site is being indexed properly in search engines. If not, the cause may be your site’s robots.txt file, which might not be configured properly or might not exist at all. Let’s make sure that your site is not facing the following accessibility issues.
1. Robots.txt
The robots.txt file is used to restrict search engines’ access when crawling your website. Manually check your site for a robots.txt file and make sure it is not blocking search engine access. You can view this file by appending /robots.txt to your domain, e.g. http://www.example.com/robots.txt. You can also find the status of your site’s robots.txt in Google Webmaster Tools.
The example below blocks all search engines from crawling your site.
User-agent: *
Disallow: /
A default robots.txt file that allows all search engines to crawl your site:
User-agent: *
Disallow:
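If you want to verify programmatically that your robots.txt is not blocking crawlers, Python’s standard library ships a parser for exactly this. Here is a minimal sketch, assuming Python 3 and the example.com placeholder domain used above:

from urllib.robotparser import RobotFileParser

# Point the parser at the live robots.txt file
rp = RobotFileParser()
rp.set_url("http://www.example.com/robots.txt")
rp.read()

# True means the rules allow any crawler ("*") to fetch the home page
print(rp.can_fetch("*", "http://www.example.com/"))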
2. Robots Meta Tags
Robots meta tags are applied to a particular page of a site to tell search engines whether they are allowed to index that page or follow its links.
Eg: <meta name="robots" content="noindex, nofollow">
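To spot-check which robots meta tag a page actually serves, you can fetch the page and search its HTML. A minimal sketch, assuming Python 3 and a hypothetical URL; a regular expression is enough for a quick audit, though a real HTML parser is more robust:

import re
import urllib.request

# Hypothetical page to audit
html = urllib.request.urlopen("http://www.example.com/").read().decode("utf-8", errors="replace")

# Look for a robots meta tag anywhere in the page
match = re.search(r'<meta[^>]*name=["\']robots["\'][^>]*>', html, re.IGNORECASE)
print(match.group(0) if match else "No robots meta tag (search engines default to index, follow)")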
3. HTTP Status Code
Check for site URLs that return errors such as 4XX and 5XX HTTP status codes (including soft 404 errors). These URLs cannot be accessed by search engines or by users.
Find the URLs that no longer exist on your website and set up redirects to relevant URLs. Make sure each redirect is a 301 and not a 302, a meta refresh, or a JavaScript-based redirect, because a 301 redirect passes almost all of the link juice to the destination URL.
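To check status codes in bulk, a short script will do. Here is a minimal sketch, assuming Python 3 with the requests library installed and hypothetical URLs; a HEAD request keeps the check lightweight, and allow_redirects=False exposes 301s and 302s directly:

import requests

urls = [
    "http://www.example.com/",
    "http://www.example.com/old-page",  # hypothetical URL that may redirect or 404
]

for url in urls:
    try:
        resp = requests.head(url, allow_redirects=False, timeout=10)
        # For 3XX responses, the Location header shows the redirect target
        print(url, resp.status_code, resp.headers.get("Location", ""))
    except requests.RequestException as exc:
        print(url, "request failed:", exc)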
4. XML Site Map
An XML sitemap is generated to give search engines a road map of your site. A sitemap should be built according to the sitemap protocol; see sitemaps.org for the details.
After generating an XML sitemap, submit it to Google Webmaster Tools so that all the pages in the sitemap get indexed.
Update your sitemap frequently and make sure all your pages are listed in it.
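One quick way to confirm all your pages are listed is to count the URLs in the sitemap itself. A minimal sketch, assuming Python 3 and a hypothetical sitemap location, using the XML namespace defined by the sitemap protocol:

import urllib.request
import xml.etree.ElementTree as ET

SITEMAP_URL = "http://www.example.com/sitemap.xml"  # hypothetical location

with urllib.request.urlopen(SITEMAP_URL) as resp:
    tree = ET.parse(resp)

# Namespace defined by the sitemap protocol (sitemaps.org)
ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
urls = [loc.text for loc in tree.getroot().findall("sm:url/sm:loc", ns)]
print(len(urls), "URLs listed in the sitemap")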
Site Architecture
1. Hierarchy
The categories of your site must be laid out in a logical flow that makes the site easy for users to understand and navigate; it should not confuse them.
2. Landing pages
Each landing page must be set up with the same care as the home page, and relevant internal linking must be in place to pass link juice.
3. Number of category pages
Avoid displaying a huge subcategory list; keep only as many categories listed as users actually need.
4. Pagination/Faceted Navigation
Pagination on your site may cause duplicate content issues. Make sure you use the nofollow attribute on pagination links, e.g. <a href="?page=2" rel="nofollow">2</a>, to avoid duplicate results in search engines.
Technical Issues
1. Proper use of 301s
Check whether all redirects are done with a 301 instead of a 302. However, there are limits to how 301 redirects should be used; for more information, see Matt Cutts’ video explaining “Is there a limit to how many 301 (Permanent) redirects I can do on a site?”
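You can verify both the redirect type and the length of a redirect chain with a short script. A minimal sketch, assuming Python 3 with the requests library and a hypothetical old URL; every entry in resp.history is one hop, so a long history signals a chain worth flattening:

import requests

resp = requests.get("http://www.example.com/old-page", allow_redirects=True, timeout=10)

# Each hop should be a 301; a 302 here is worth fixing
for hop in resp.history:
    print(hop.status_code, "->", hop.headers.get("Location", ""))

print("final:", resp.status_code, resp.url)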
2. Use of JavaScript
No matter how good your site architecture is, it won’t matter if your site has navigational elements that are not accessible to search engines.
Even though search engine bots are getting smarter and more intelligent, it is better to avoid building navigational elements in Flash or JavaScript.
3. Use of iframes
Do not pull your main content from iframes; such content may not rank well in search engines.
Site Performance
Most users have a very limited attention span when sitting in front of a site. If your site takes too long to load, they will leave, which increases your website’s bounce rate.
In a similar way, search engine crawlers have very little time to spend crawling your site. A site that loads fast is crawled more frequently by search engines than a website that takes too long to load.
Tools such as Google PageSpeed, YSlow and Pingdom’s Full Page Test can check your site’s loading time and performance. Below is a site speed excerpt for my site Dailyseotips.in.
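For a rough in-house check alongside those tools, you can time a full page download yourself. A minimal sketch, assuming Python 3 with the requests library and a hypothetical URL; note that this measures server response and transfer time only, not browser rendering:

import time
import requests

url = "http://www.example.com/"  # hypothetical URL
start = time.perf_counter()
resp = requests.get(url, timeout=30)
elapsed = time.perf_counter() - start

print(resp.status_code, len(resp.content), "bytes in", round(elapsed, 2), "seconds")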
Indexability
To check how many of your site’s pages have been indexed during a website audit, use the popular “site:” command (e.g. site:example.com), which shows a website’s indexed pages in Google. Almost all major search engines accept this command.
Here you can see the number of pages indexed in the Google search engine. However, this might differ from the actual total number of pages on your site; there may be many pages that are still not indexed. If you have a sitemap that has listed all of your site’s links since the development stage, you will know how many pages are still pending indexing.
This indexing scenario can play out in three ways.
The number of indexed pages is nearly equal to the total number of pages in the sitemap: your site is performing well, and search engines are successfully crawling and indexing all your website’s pages.
The indexing count is very low compared to the actual count: some of your web pages are not being crawled by search engines. You need to figure out the actual cause of the issue, including whether or not your site has been penalised by search engines.
The indexing count is high compared to the actual count: your site is generating duplicate URLs, with the same content available on multiple URLs. If you suspect a duplicate content issue, Google’s “site:” command also helps confirm those suspicions.
Simply append “&start=990” to the end of the search URL, e.g. https://www.google.com/search?q=site:example.com&start=990.
Then look for the warning message at the bottom of the page where Google says it has omitted some entries very similar to the results already displayed.
Also check whether all the important pages of your site have been indexed. If not, try to get them indexed, or investigate whether a penalty is the cause.
Brand Searches
When you search for your website’s name in Google, your site should be the first search result. Whether you search by site name or brand name, your site must come up in the top position. If your site is not showing up at all, it’s time to investigate penalties.
Penalties
If you suspect your site has been penalised, look for the root cause of the penalty. Google generally sends webmasters a message explaining why a site was penalised, so check the messages in your Webmaster Tools account to see whether you have received one. If you have received a message from Google, then your investigation is almost done.
Fix the issues that caused the penalty and send a reconsideration request to Google. For more information, check Google’s support page on Reconsideration Requests.