
Why Does Google Hate Me? 11 Simple Site Checks To Get You Back In The Good Books

Do you feel as though your SEO efforts just aren’t paying off? Do you spend all your time creating great content, building links and fine-tuning your on-page SEO only to feel as though your website is invisible to Google?

Don’t panic. In this article, I’m going to go through some simple steps you can take to work out whether your website has technical problems that might mean Google isn’t indexing it, or crawl errors that are stopping Google accessing parts of your site. We’ll also explore whether parts of your site could be viewed as spammy by Google – for example, if you have duplicate content.

It may be that the first clue you’ve had that there’s a problem is a massive drop in your rankings on Google. Help is at hand!

1. Is Google Seeing Your Website?

The first step is to establish whether Google is even seeing your website. There can be several reasons why it might not be, which I’ll touch on below.

Go to Google.com or Google.co.uk and type site:yourdomain.com into the search bar. Make sure you don’t add http:// or www. This will bring up a list of all the pages from your website that Google has indexed. In the screenshot below, you can see that I have run a search for my own website – site:seo-plus.co.uk – for which Google has returned 187 indexed pages.

How many results does your search return? Does the number of indexed pages reflect the number of pages on your website? Is there a big discrepancy?

If a page hasn’t been indexed by Google, then to all intents and purposes, it doesn’t exist for the search engine and you won’t get traffic to that page via organic searches.

[Screenshot: results of the site:seo-plus.co.uk search, showing 187 indexed pages]

If you run the site:domain.com search and realise that a huge number of pages haven’t been indexed, there are several steps you can take to find out why.
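If you’re not sure how many pages your site actually has, counting the URLs in your XML sitemap gives you a quick baseline to compare against the site: result. Here’s a minimal sketch using only the Python standard library – it assumes your sitemap lives at /sitemap.xml and is a plain list of URLs rather than a sitemap index:

import urllib.request
import xml.etree.ElementTree as ET

SITEMAP_URL = "http://example.com/sitemap.xml"  # placeholder - use your own sitemap URL
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

with urllib.request.urlopen(SITEMAP_URL) as response:
    tree = ET.parse(response)

# Each <url><loc> entry in the sitemap is one page you expect Google to index.
urls = [loc.text for loc in tree.getroot().findall("sm:url/sm:loc", NS)]
print(len(urls), "URLs listed in the sitemap")

If the sitemap lists far more URLs than the site: search returns, some of your pages aren’t being indexed.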

2. Have You Had A Google Penalty?

First, go to your Google Webmaster Tools and click on Search Traffic > Manual Actions. Is there a penalty listed there? Problems highlighted under Manual Actions can be site-wide or affect specific pages, and might include transgressions such as ‘unnatural’ paid-for web links or cloaked text. In other words, SEO techniques that would probably be viewed as ‘black hat’.

You can read more about Manual Actions in Google’s Webmaster Tools help documentation. If any problems do appear under Manual Actions, it is important that you take steps to fix them.

Once you have fixed the problem(s) identified on the Manual Actions page, you can ask Google to reconsider indexing your website by submitting a Reconsideration Request.

3. Could You Be Restricting Google’s Access To Your Site?

It sometimes happens that people unintentionally block the search engines from indexing their site, either through the robots.txt file or in the settings of the site (especially if it’s a WordPress site).

To check whether you might have blocked the search engine robots from crawling your site, visit yourdomain.co.uk/robots.txt in your browser’s address bar. If the following text appears, search engines can’t crawl any of your site:

User-agent: *

Disallow: /

Another issue to look at is whether you have blocked Google from crawling a specific directory of your website. Again, you can find this out by visiting yourdomain.co.uk/robots.txt. If you see something like the text below, it means Google is unable to crawl the named directory:

User-agent: Googlebot

Disallow: /products/
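For comparison, a robots.txt that blocks nothing looks like the example below – an empty Disallow line means no part of the site is off-limits to well-behaved crawlers:

User-agent: *

Disallow: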

Google Webmaster Tools also lets you see your robots.txt information. Simply log in and click on Crawl > Robots.txt Tester. This will show the same information as visiting yourdomain.co.uk/robots.txt directly.

In addition, you can use Google Webmaster Tools to check whether there are any crawl errors that are preventing search engines from crawling your site. Go to Crawl > Crawl Errors to see this information.
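If you have a long robots.txt file and want to test specific URLs against it, Python’s built-in robotparser module will apply standard robots.txt rules for you. A minimal sketch – the domain and page URLs are placeholders for your own:

import urllib.robotparser

rp = urllib.robotparser.RobotFileParser()
rp.set_url("http://example.com/robots.txt")  # placeholder - use your own domain
rp.read()

# Check whether Googlebot is allowed to fetch a few key pages.
for url in ["http://example.com/", "http://example.com/products/"]:
    print(url, "->", "allowed" if rp.can_fetch("Googlebot", url) else "BLOCKED")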

You can see whether you have marked a page as ‘noindex’ by checking the source code of each page. To do this, right click on the web page in question and select ‘View page source’ from the pop-up menu. This will bring up the source code for that particular page. If the following text appears anywhere in the source code, search engines will not be able to index the page:

<meta name="robots" content="noindex">
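Viewing the source works fine for a handful of pages, but if you have dozens to check, a short script can flag any page carrying a noindex robots meta tag. A rough sketch using only Python’s standard library (the list of pages is a placeholder):

from html.parser import HTMLParser
import urllib.request

class RobotsMetaFinder(HTMLParser):
    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        # Flag the page if a <meta name="robots"> tag contains "noindex".
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "robots":
            if "noindex" in attrs.get("content", "").lower():
                self.noindex = True

# Placeholder list - substitute the pages you want to check.
for url in ["http://example.com/", "http://example.com/about/"]:
    finder = RobotsMetaFinder()
    with urllib.request.urlopen(url) as response:
        finder.feed(response.read().decode("utf-8", errors="replace"))
    print(url, "-> noindex" if finder.noindex else "-> indexable")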


4. Ticked The ‘Search Engine Visibility’ Box In WordPress?

If you have a WordPress website, especially if it was developed by a web designer on your behalf, it’s worth checking whether the settings discourage search engines from indexing your website. You can find this information in your Dashboard under Settings > Reading. If the Search Engine Visibility check box is ticked, make sure you untick it.

[Screenshot: the Search Engine Visibility checkbox under Settings > Reading]

Web developers sometimes tick this box while a site is under construction, so that half-finished pages don’t get indexed, and then forget to untick it when the site goes live.

5. Do You Have Too Many Pages Indexed by Google?

If, when you run the site:yourdomain.com search, you discover that you have thousands of irrelevant pages indexed by Google, this could be harming your rankings too.

The most common causes of an inflated index are a hacked site or a plug-in that is generating new pages with no content.

If your website has been hacked, it may be apparent when you run the site:yourdomain.com search: you may see page after page about Cialis or Viagra, or some other spam content, all using your domain. If you do, it means Google has indexed the spam pages and there’s probably already a black mark against your site.

There are some useful guides online about what to do if your website has been hacked – Google’s own ‘help for hacked sites’ resources are a good starting point.

If it becomes apparent that a plug-in is generating empty pages with identical meta data, disable the plug-in, then either use your robots.txt file to block access to those pages or delete them from your site. If you delete them, use a 301 redirect to send anyone landing on the old URL to a useful live page instead.
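How you create a 301 redirect depends on your server. On an Apache server, for example, a single line in your .htaccess file does the job (both paths here are placeholders):

Redirect 301 /old-empty-page/ http://example.com/useful-live-page/

WordPress users can achieve the same thing with a redirection plug-in rather than editing .htaccess by hand.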

6. Do You Have Duplicate Content?

In its quest to provide a good user experience, Google is clamping down on duplicate content on websites. This is because it doesn’t want to return pages in its search results that all say the same thing.

To check whether there is duplicate content anywhere on your website or, indeed, if it has been copied to another website, you can use a free tool called Copyscape (www.copyscape.com). Simply enter each web page address (unfortunately Copyscape won’t search your whole site in one go) and Copyscape will tell you if there’s a copy of it anywhere on the web.

You might also want to try Copyscape’s new tool called Siteliner (www.siteliner.com), which will scan 250 pages of your website using its free tool or up to 25,000 pages using the paid-for Siteliner Premium service.

One problem that Siteliner highlights is that, if you use categories and tags to organise your blog posts, they could be leading Google to see lots of duplicate content on your site. Why is this?

Let’s imagine we’ve written a post called example.com/website-SEO-tips – this is the main URL, the one you want Google to index. However, to enhance the user experience, we have used a category (SEO) and tags (websites, SEO tips, Google) to give website visitors other ways to find this content and related articles. Therefore, the same article could be found using several URLs, e.g.:

  • http://example.com/website-SEO-tips (the main URL)
  • http://example.com/blog/page/3 (the third page of your blog if you’ve posted a lot of articles since this one)
  • http://example.com/category/SEO (a category archive – one of these for each category assigned)
  • http://example.com/tag/websites (a tag archive)
  • http://example.com/tag/SEOtips
  • http://example.com/tag/Google
  • http://example.com/author/yourname (an author archive)

If your posts are searchable by date, you may even have date archives too, adding to the number of URLs. Even in the example above, there are, in Google’s eyes, seven URLs all serving the same content. The dilemma is that you want people to be able to follow those URLs when they’re on your site – they help with navigation and let people find related articles. However, you do not want Google to index every one of them.
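The standard mechanism for resolving this dilemma is a robots meta tag with the value ‘noindex, follow’ on the archive pages. It tells Google not to index the archive page itself while still following the links on it, so your visitors’ navigation is unaffected:

<meta name="robots" content="noindex, follow">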

If you have a WordPress website and use the Yoast WordPress SEO plugin, you can change the settings to tell Google not to index your categories, tags or attachments. Simply click on the Yoast plugin on your Dashboard (this may appear as SEO in the left-hand column), then choose Titles & Meta. This will bring up a screen with various tabs – click on Taxonomies.

[Screenshot: the Taxonomies tab in Yoast’s Titles & Metas settings]

As you can see in the image above, you can tick Meta Robots: noindex, follow to stop the search engines indexing your category and tag archives while still following the links on them. The one thing to consider before you do this is whether any of your tag or category pages are driving a high amount of traffic to your website. If they are, you may want a more selective approach, such as choosing to ‘noindex’ only certain tags.

To keep a particular tag or category indexed while using the Yoast WordPress SEO plugin, click on Posts > Categories or Posts > Tags in the back-end of your site, then go into the individual tag or category and choose the Always index option.

[Screenshot: the Always index option for an individual tag in Yoast]

If you are not using WordPress, you may need to add the following code to the head of each tag or category page that you don’t want Google to index:

<meta name="robots" content="noindex">


The head of your page’s source code would then look something like this:

<html>
<head>
<title>…</title>
<meta name="robots" content="noindex">
</head>


Because every website platform is edited differently, you may need to search online for advice on ‘noindexing’ tags and categories that is specific to the platform you used to create your site.

7. Have Large Numbers of Pages Recently Been De-indexed?

If you decide to ‘noindex’ your tags and categories, you would expect to see the number of indexed pages for your website drop the next time you run the site:yourdomain.com search. This would also happen if you’ve recently deleted lots of pages from your website.

However, sometimes people notice a massive drop in their indexed pages without any explanation. This may mean that Google has deindexed some pages, having decided for whatever reason that the pages don’t deserve to be indexed any more.

You can check whether a large quantity of pages has been deindexed recently by logging in to your Google Webmaster Tools and clicking on Google Index > Index Status. If there is a sudden drop in the number of indexed pages on the graph, and you have not deleted or deindexed any pages recently, this could indicate a problem.

8. Could Your Website Have Been Affected By Google’s Algorithm Changes?

On Tuesday 21st April 2015, Google rolled out its latest algorithm update to prioritise mobile-friendly websites in mobile searches. This could affect thousands of small businesses with non-responsive websites, which may see a significant hit to their website traffic and Google rankings.

Google is constantly refining its algorithms to return the best search results; in fact, algorithm changes can be one of the most common reasons for deindexed pages on your website.

To check whether your website may have been affected by the latest algorithm changes, you can try using a tool like Barracuda’s Panguin Tool:  http://www.barracuda-digital.co.uk/panguin-tool/ – this will let you see at a glance whether your website traffic went up or down after an algorithm change.

9. Do You Have Spammy or Unnatural Back-links?

Unnatural, spammy backlinks can result in a significant drop in your Google rankings. While high-quality backlinks from reputable, high-authority websites are still an essential part of any website’s SEO efforts, low-quality backlinks can tarnish how Google views your site. Unnatural, spammy backlinks include:

  • Paid links
  • Interlinking between controlled domains
  • Site-wide links (such as footer links)
  • Links from low quality directories
  • Links from low quality websites
  • Reciprocal linking
  • Overuse of keyword-rich anchor text
  • Unnatural link schemes

If Google has detected unnatural links pointing to your website, you may receive a warning via your Google Webmaster Tools.

Unfortunately, there is no single tool that will establish whether you have ‘bad’ backlinks pointing to your website, so it’s worth combining the reports from several backlink checkers. The Search Traffic > Links to Your Site report in your Google Webmaster Tools is a good place to start.

It’s a good idea to collect all the backlinks these tools reveal in an Excel document so that you can begin to explore which ones might be harmful. You then need to contact the websites you’ve identified to request that they remove the backlink. Often people will ignore your request but, before you can move towards getting links disavowed, Google will want to see that you have taken action to get the links removed.
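Most backlink tools will export their findings as CSV files. If you end up with several exports, a few lines of Python can merge them into one de-duplicated list before you start working through it in Excel – the file names and the ‘url’ column here are placeholders, so adjust them to match your own exports:

import csv

seen = set()
with open("all-backlinks.csv", "w", newline="") as out:
    writer = csv.writer(out)
    writer.writerow(["url"])
    # Placeholder file names - substitute your own tool exports.
    for filename in ["tool1-export.csv", "tool2-export.csv"]:
        with open(filename, newline="") as f:
            for row in csv.DictReader(f):
                url = (row.get("url") or "").strip().lower()
                if url and url not in seen:
                    seen.add(url)
                    writer.writerow([url])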

Unfortunately, spammy websites may not actively advertise their contact details. You may be able to find information by using the Whois Lookup tool – whois.domaintools.com. Simply type in the domain that you want to contact and see whether any useful contact details are revealed.

Once you have submitted link removal requests (keep them polite and list any links you would like removed), you can move on to submitting a ‘Disavow’ request to Google. To do this, go to www.google.com/webmasters/tools/disavow-links-main and click on ‘Disavow links’. You will then be walked through the process.
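The disavow file itself is just a plain text file with one entry per line: lines beginning with # are comments, domain: entries disavow every link from that domain, and a bare URL disavows a single page. For example (the domains here are obviously placeholders):

# Removal requested 01/03/2015 - no response
domain:spammy-directory-example.com
# One bad page rather than a whole site
http://low-quality-example.com/links.html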

Koozai are currently offering a very helpful whitepaper on backlink analysis and disavowal. You can find the link here: www.koozai.com/resources/whitepapers/the-complete-guide-to-backlink-analysis-and-removal.

10. Is Your Website Too Slow?

Since 2010, Google has been using page speed as a ranking factor. If your website pages are slow to load, this could be impacting on your rankings.

You can test your page speeds using a free tool such as http://tools.pingdom.com/fpt/ or Google’s PageSpeed Insights, which you’ll find in Google Webmaster Tools under Other resources > PageSpeed Insights. The latter will identify how your page speeds could be improved and steps you can take to fix your website’s performance.
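If you want a rough, repeatable number to track over time, you can also time the server response yourself. Bear in mind this only measures how quickly the HTML arrives – not the full page render with images, CSS and scripts – so treat it as a baseline alongside the tools above. A minimal sketch:

import time
import urllib.request

URL = "http://example.com/"  # placeholder - use your own page

start = time.perf_counter()
with urllib.request.urlopen(URL) as response:
    body = response.read()
elapsed = time.perf_counter() - start

print(len(body), "bytes fetched in", round(elapsed, 2), "seconds")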

11. Is Your Website Mobile Friendly?

As more than 50% of Google searches now take place on mobile phones, Google is keen to list mobile-friendly websites in its search engine results pages for mobile searches. As mentioned above, the latest algorithm, rolled out on Tuesday 21st April 2015 and dubbed ‘mobilegeddon’, has been designed to boost the rankings of relevant websites with a mobile-friendly label, while downgrading sites that don’t offer a mobile friendly experience.

If you have noticed a sudden drop in your rankings, you may need to look at whether your site is mobile friendly.

You can see whether Google has given your website a mobile-friendly label by taking the mobile-friendly test at: https://www.google.com/webmasters/tools/mobile-friendly/

If you run a search for your own business in Google using a mobile phone, you should also be able to see the mobile-friendly label if Google has given it to you.

If your website isn’t mobile friendly, you may need to consider the following issues:

  • Is the design responsive, i.e. does it resize and adapt to the device on which it’s being viewed? (See the viewport tag example after this list.)
  • Is the text large enough to read?
  • Are the call to action buttons big enough and easy to use?
  • Can you touch the links and navigation menus easily without accidentally going to the wrong page?
  • Does the website load up quickly enough on a mobile device?
  • Does the website offer a comparable experience to someone viewing it on a desktop or laptop?
  • If you don’t have a responsive website, do you have a mobile version of the site?
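If the answer to that first question is no, the usual starting point for a responsive design is the viewport meta tag in the head of every page. It tells mobile browsers to scale the layout to the width of the device rather than rendering a shrunken desktop page:

<meta name="viewport" content="width=device-width, initial-scale=1">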

This blog has only just scratched the surface of some of the technical issues that can affect your SEO efforts. If it feels like Google just isn’t giving you enough love, it’s definitely time to take action and get back in the search engine’s good books.

Remember too that creating fresh, high quality content aimed at your target audience will go a long way towards improving your visibility on all the major search engines.


Hazel Jarrett is an SEO specialist simplifying SEO to help you rank at the top of Google. With a passion for SEO and a reputation for getting results, she can be found at www.seo-plus.co.uk, on Twitter (@SEOPlusMore), Facebook (facebook.com/SEOplus), Google+ (google.com/+SEO-PlusCoUk) or LinkedIn (uk.linkedin.com/in/hazeljarrett).
