Last Updated: 11/12/2015
Bookmark the SEO Cheat Sheet I use every day to assess websites…
So you have just set up a brand new site, or are keen to optimise an existing one. Conducting a (regular) full website audit ensures that your site consistently performs well in the search engines, uncovers any risky areas that could get the site penalised and makes sure it complies with the latest search engine guidelines.
The On-Page SEO Cheat Sheet below covers the important areas that will help your site rank optimally for relevant search terms and ensure you offer a heightened user experience. Work your way down the list, checking off as you action each item…
Set Up & Configure Google Analytics
Setting up this powerful software is one thing, configuring it to your needs with conversion tracking, internal search tracking, filters and custom dashboards is quite another.
If you are serious about improving the exposure and performance of your site, installing an Analytics package such as Google Analytics (free) should be one of the first things you do. This allows you to track a number of things, including (but not limited to):
- Traffic numbers, time on site, pages per visit and bounce rate
- Mobile traffic figures and devices
- What keywords are bringing in traffic
- What social networks are referring traffic
- What pages are performing well and which aren’t
- How responsive your site is and how quickly it serves up pages
- What elements your audience interact with most/least
- What products/pages/keywords lead to sales
When setting up a new Analytics profile there are a few things you’ll want to ensure you do:
- Filter out your own IP address(es) so you are not influencing site traffic with your own views
- Enable Demographic / Interest data tracking
- Configure any goals you want to track that impact your business
- Set up e-Commerce tracking if you’re an online store
- Set up internal site search tracking if your site has a search bar
- Link Analytics to Webmaster Tools
- Set up any appropriate custom dashboards and reports
- Set up any relevant automatic alerts
Set Up Google Search Console & Bing Webmaster Tools
No website should be without Google Search Console (formerly Webmaster Tools), which helps you identify technical issues with your site and sort out any problems that could be limiting exposure.
- Submit an XML Sitemap
- See basic search query data
- Obtain a snapshot of search engine crawl errors
- Configure various aspects of your site
- Block URLs
- Crawl brand new pages
- See what external links are pointing to your site
- Identify issues with Meta Data
Link: Google Webmaster Tools (opens in new window)
Link: Bing Webmaster Tools (opens in new window)
More information on Google Webmaster Tools here: 5 Free Google Tools Every SEO Should Use
Keyword Research
Defining keywords and hubs of keywords, and knowing how to incorporate them into content and site elements, is make-or-break for any SEO campaign.
- Use Google’s Keyword tool, WordTracker’s Keyword Question Tool as well as niche forum/question sites to find inspiration and real-world appeal for keywords.
- Google uses LSI (Latent Semantic Indexing) or “co-occurrence” to connect related words. Use a variety of terms to not only attract a wider audience but also make your content look natural.
- Important keywords can be used in page title, meta description, header tags, image alt tags, copy, URLs, anchor text, navigation labels and breadcrumb trails.
Internal Site Search Tracking
When Google took away keyword-level data in Analytics it was a huge blow for SEO as we knew it. Understanding what users type into your search bar can surface problem areas, popular products/content and more.
If you have on-site search functionality, set up internal site search tracking through Google Analytics to discover what your customers are looking for while on your site and whether they are impressed with what they find.
HTTPS & SSL
Google recently stated that secure sites will receive a small rankings boost. While it is early days, the likes of Wikipedia and Twitter are making moves towards HTTPS and the smart money is on those who follow suit.
Stop Blocking CSS & JS Files
At the end of July 2015, webmasters received warnings to remove robots.txt rules blocking CSS and JavaScript files. Google wants access to everything it needs to render your pages.
Domain Registration Length
Leasing your domain and server space for longer periods suggests to Google you are in it for the long haul which sends a trust signal.
Hiding your site registration information suggests you have something to hide and are potentially a spammer.
Grammar & Spelling
Quality content starts at correct spelling and grammar.
Affiliate Links, Pop-Ups & Excessive Ads
Compromising the user experience by throwing copious distractions in their face is a big no-no in Google's eyes.
Contact Page, T&Cs & Privacy Pages
These show Google that you are legitimate.
Server Speed, Reliability & Uptime
One of the most important ranking factors is how fast your site content loads (on all devices) and that it Loads. Every. Time.
Because Google takes user experience very seriously when ranking a site, a quick response time and reliability are important. You can see your average response times through Google Analytics (Content > Site Speed > Overview). As each site and industry is different it is difficult to pin down an ideal load time, but through Analytics you can spot change-induced fluctuations, and ultimately you should constantly aim to cut load times. The following are key areas:
- Upgrade your server – you get what you pay for, and server location matters (i.e. don't purchase hosting in the USA if you are a UK-based site)
- Cut down image file size (use lower quality images where possible)
- Externalise scripts where possible
- Minimise redirects and avoid linking to pages which redirect elsewhere
- Reduce/remove redundant page code (including commenting)
- Remove broken links
- Specify image dimensions so that if an image doesn’t load quickly, the rest of the page can still be rendered properly
- Use appropriate image sizes for the task (e.g. don’t use full-size images with on-the-fly resizing for a thumbnail as the full-size image will still need to be loaded)
- Check out this post on SEOmoz for even more ideas
Keep an eye on server reliability as sites that are consistently unavailable when search bots go to crawl them can hinder your SEO efforts. Use an “Uptime Monitor” tool to alert you by email if your site goes down – reassess your hosting package if the site is regularly going down.
The search engines are smart enough nowadays to understand when webmasters are trying to manipulate them and will penalise you accordingly.
Custom 404 Page
404s happen, they just do. What’s your contingency plan? Get a helpful, customised 404 page in place (y’know, just in case).
Unless you upload a custom 404 page, users who hit a broken link will be presented with the standard (and unhelpful) 404 error page, which could deter them from exploring your site further. It is advisable to design and upload your own 404 page that reassures visitors and gets them back on track. A good 404 error page should have the following elements:
- An apology and clear explanation why they are not seeing what they expected
- Consistent & familiar styling/navigation
- Helpful links to main site pages
- Internal search functionality
- A polite request and easy way to report the issue
Distilled have an excellent example of a 404 error page.
Alternatively you could do something a little different and add this little widget from notfound.org which puts your 404 page to good use by displaying a random missing child in your area – so cool! Check it out!
- Design the page and name it “notfound.html” or something similar.
- Upload the file to the root directory
- In the .htaccess file add the following code: ErrorDocument 404 /notfound.html
XML Sitemap
Every site should have an XML sitemap in place. Is yours up-to-date, error-free and findable by the search engines?
An up-to-date and complete XML Sitemap is critical to help the search engines find and index new/deep pages within your website. If your CMS or back-end system doesn’t automatically generate an XML sitemap on a regular basis it is recommended that you use a free Sitemap Generator Tool. This tool automatically crawls and generates the sitemap for you ready to upload to your root directory. Don’t forget to notify the search engines of its existence through Google/Bing Webmaster Tools.
- Particularly useful for e-Commerce websites that are continually evolving and updating product lines
- Google Webmaster Tools: Optimization > Sitemaps > Add/Test Sitemap
- Bing Webmaster Tools: Configure My Site > Sitemaps > Submit a Sitemap
- “Number of pages indexed” is a good indicator of site quality. If only a small portion of your total pages are indexed you should assess the quality of content on your individual pages or whether so many pages should indeed exist.
- XML Sitemaps should be updated and re-uploaded on a (at least) monthly basis depending on how often your site changes.
- Immediately update the sitemap after a navigation shake-up or site migration.
Note: The free version of the Sitemap Generator Tool only generates up to 500 URLs. If you have a large site, consider using the paid version or find an alternative.
#ProTip: Don’t know if a sitemap already exists? Conduct a Google search: site:[sitedomain] filetype:xml
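If you prefer to roll your own rather than rely on a third-party generator, the sitemap format itself is simple XML. A minimal sketch in Python using the standard library (the URLs and dates below are placeholders, not real pages):

```python
import xml.etree.ElementTree as ET

def build_sitemap(pages):
    """Build a minimal XML sitemap from (url, lastmod) pairs."""
    urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for loc, lastmod in pages:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
        ET.SubElement(url, "lastmod").text = lastmod  # W3C date format: YYYY-MM-DD
    return '<?xml version="1.0" encoding="UTF-8"?>' + ET.tostring(urlset, encoding="unicode")

# Example pages (placeholders) - save the result as sitemap.xml in your root directory
sitemap = build_sitemap([
    ("https://example.com/", "2015-11-12"),
    ("https://example.com/products/blue-widget", "2015-11-01"),
])
```

Once uploaded, submit the file through Google/Bing Webmaster Tools as described above.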
Video Sitemap
If you have videos on your site – whether YouTube embeds or locally-hosted – you should have a video sitemap to enable video thumbnail rich snippets.
If you utilise videos on your website (more on this later) you should consider uploading a video sitemap that tells the search engines about the videos on your site. Whether you embed YouTube videos or host unique custom videos, you can obtain a video thumbnail rich snippet in the SERPs which commands a much greater CTR (click-through rate). I have yet to find a free Video Sitemap Generator tool, but by following the instructions in the Video SEO Cheat Sheet you can easily produce your own – please be aware this can take some time if you have a lot of videos!
Robots.txt
An improperly configured robots.txt file can be disastrous to your SEO efforts. Check yours for issues and cross-reference with documentation on best practices if necessary.
The robots file is, among other things, used to block URLs from the search engine spiders and give other crawling instructions. Check that your robots.txt file is not blocking important pages. Remember:
- Don’t use robots.txt files to hide content or sensitive information
- Spiders (particularly those with a malicious intent) can choose to ignore robots files
- Even blocked pages can be indexed to a limited degree
- Robots.txt is publicly available and anyone can see which sections you are blocking
- You don't have to have a robots.txt file, but if you do it must be stored in the root directory of your site (e.g. example.com/robots.txt)
- For more information check out robotstxt.org.
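To sanity-check that important pages aren't blocked, you can test URL paths against your rules programmatically. A sketch using Python's standard urllib.robotparser (the rules and URLs here are hypothetical – in practice, fetch your live robots.txt):

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt - in practice, use the file at example.com/robots.txt
robots_txt = """\
User-agent: *
Disallow: /admin/
Disallow: /tmp/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# Important pages should be crawlable; private areas should not
admin_blocked = not rp.can_fetch("*", "https://example.com/admin/settings")
product_allowed = rp.can_fetch("*", "https://example.com/products/blue-widget")
```

Running a check like this over your key landing pages takes seconds and catches the classic mistake of a leftover site-wide Disallow from a staging environment.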
Site Navigation
Does your navigation make sense? Are the labels optimised for users or search engines? Does your hierarchy keep important pages at the fore, and do you surface other content?
- Best practices state that all pages within a site should be reachable in 3 clicks
- Optimise navigation labels but not at the expense of usability and understandability
- Use expandable/hover-over menus if you have a lot of sub-menu items
- Sequence menu items in order of importance – not only for link equity prioritisation but because users expect it in that order
Footer & Site-wide Links
Audit your footer and sidebar links. Do they all really need to be there, or are you just trying really hard to get a page ranked better? The only links that need to be there are things like "contact" or "about", which visitors might need to access at any given moment.
- Site-wide (footer) links should be reserved for general pages that need to be accessible from every page on the site (e.g. Contact, About, Terms and Conditions, Delivery, etc.)
- Don't use site-wide footer links just to build up internal link numbers for keyword pages – this over-optimisation is considered spammy and not helpful to the user
- Be wary of adding "nofollow" to unimportant site-wide links in an attempt to conserve "link juice": Google now discounts the equity of nofollowed links rather than redistributing it, so a lean footer beats nofollow sculpting. Keep the About & Contact pages crawlable as they contain information important to search.
Social Media Integration
Share buttons are a given but what about Pinterest button overlays for images? What about “Click To Tweet” links? What about Facebook comment functionality? Think about what your visitors expect to see to help them get your content out there.
Much of the online world has a social media profile of some sort, compared to the comparatively few who own a website or blog. For this reason, social shares/citations/mentions level the playing field as ranking metrics in a way that links alone cannot. There has been much debate in online marketing circles over whether social signals affect rankings, but it is safe to assume that if they aren't already contributing to your ranking, they will in the future. Make it as easy as possible for people to share your content and engage socially with your brand.
Profile Links
- Add links to your social media profiles to help build followers & fans.
- Be clear as to what fans can expect by following you
- Social media can be a great customer service tool
- Displaying embedded feeds can be a great way of showing off your activity and attracting more fans
- Adding share buttons to appropriate pages helps build brand and content awareness, leading to more traffic. Each network provides official code snippets for implementing its own buttons.
- Include share buttons on key content pages and product pages
#Pro Tip: Include a compelling call-to-action for individual buttons to encourage shares
Google+ Integration
Connecting a Google+ Page to your website and vice versa can give you a very rich-looking SERP for branded searches. There is also the notion that an authoritative Google+ presence can help boost your search rankings.
- Connect individual authors on Google+ to their content with the rel=author tag
- Connect company pages on Google+ to their website with the rel=publisher tag
Google My Business
If you have a local service area or audience, claiming and optimising a Google My Business account should be near the top of your list. See how competitors are using theirs.
- Google+ Local is the SINGLE most important business directory in existence
- A Google local listing is a way of obtaining an extra 1st page listing (in addition to organic listings)
- More and more searches are conducted on mobiles – Google+ Local is mobile ready and hugely accessible
- According to Google, 97% of consumers search for local businesses online while 1 in 3 of all searches have local intent.
Learn more and improve your Google+ Local presence with this Local SEO Cheat Sheet.
Blog
A blog can bring a huge amount of traffic to the site if utilised properly. Establish what purpose the blog serves and attribute metrics to discover whether the blog is performing.
Content coming soon…
CDN (Content Delivery Network)
Targeting an overseas audience from a single website? Distance causes sites to load more slowly; CDNs host cached copies of your site closer to your target countries for quicker responses and better rankings as a result.
Content coming soon…
Mobile / Responsive Site
Whether you opt for a responsive site or a mobile-specific site, having an online presence that renders properly and serves users on a myriad of devices is paramount – so much so that Google recently released an algorithm update focused on the topic.
Content coming soon…
ccTLDs (Country Code Top Level Domains)
This refers to the [.co.uk], [.it] or [.de] part of your site’s URL and should relate to the country you are looking to rank in. While there is little you can do retrospectively, a strong international strategy should be mapped out prior to launching overseas.
Content coming soon…
Search Engine Penalties
Ensure you don’t have any page-level or site-wide penalties, either algorithmically or manually applied by checking Google Search Console for notices or cross-referencing any extended dips in traffic with Moz’s Algo Change History.
Penalties can be enforced for the following reasons:
- Buying links from other sites
- Large scale reciprocal link arrangements
- Over-optimised anchor text or (very) limited spread of anchor text
- Links from irrelevant or wrong-language sites
- Engaging in blog networks and link farms
- Unnatural guest posting for the sole intention of getting a link
- Unnatural link influx (too many links, too fast)
- Hidden links, hidden content, hidden anything
- Excessive duplicate content
- Thin content page(s) for the sole purpose of ranking for keywords
- Scraped content
- Cloaking so Google sees one thing, and users see another
- Spun & poor quality content (grammar and spelling mistakes)
- Comment & forum spam
- Doorway pages
- Keyword stuffing
- Poor housekeeping – too many broken links, missing content, 404s
- Site hack
- Poor mobile experience
- Slow site and/or poor experience (as judged by Google’s quality raters)
- Bad ad-to-content ratio (especially above the fold)
- Historic domain penalty or redirecting a penalised site to yours
Page Titles
Page titles are the #1 on-page SEO element, so make sure keywords are used strategically throughout all pages.
The page title is one of the most effective and easiest ways to rank for a search term. Make sure important keywords are near the beginning and keep to roughly 70 characters (including spaces) so titles aren't truncated in the SERPs. You may also find the following tips useful:
- Numerics in page titles draw the eye and command a better click through rate
- Branding is important to Google so use your company name as a prefix/suffix if possible
Meta Descriptions
Meta descriptions form the first step on the path to conversion. Make sure your compelling (and unique) descriptions speak to the searcher.
The meta-description tag will not directly influence your rankings but can help influence your click-through rate in the SERPs. As such, it is important to incorporate the following lines of thought:
- Appeal to the searcher by constructing a compelling description that suggests the page/site will answer their query or need
- Include keywords that the searcher will likely use so search terms are emboldened and the description stands out
- Where possible, include a call-to-action
- Remember to stick to 160 characters maximum (including spaces)
- Accurately describe what visitors can expect from your page; otherwise they will simply bounce off, which negatively impacts user signals
- Make sure every meta-description is unique throughout the site
- Google won’t always display your hand-crafted description in the SERPs if it feels a snippet of on-page content fits the search query better
- If a page is shared on social media, it is likely that the meta description will form the default description in the post
- Avoid unescaped double quotation marks in your description, as they will prematurely close the content attribute (as illustrated below). Use single quotes or the &quot; entity instead
<meta name="description" content="blah blah blah "quote" blah blah">
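The length and quoting rules above are easy to enforce programmatically. A rough sketch of a helper (meta_description_tag is a hypothetical name, not a standard function) that escapes double quotes and trims to 160 characters at a word boundary:

```python
from html import escape

def meta_description_tag(text, max_len=160):
    """Build a <meta> description tag, escaping quotes and trimming to max_len."""
    if len(text) > max_len:
        # cut at the last full word, then signal the truncation
        text = text[:max_len].rsplit(" ", 1)[0].rstrip() + "..."
    # escape(..., quote=True) turns " into &quot; so the attribute isn't cut short
    return '<meta name="description" content="%s">' % escape(text, quote=True)

tag = meta_description_tag('Say "hello" to our new range of widgets')
```

A check like this run over every page in your CMS export quickly surfaces over-long or quote-breaking descriptions.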
Headers / H Tags
One of the most visible elements on a web page, headers should be optimised with keywords and audience in mind.
Headers and sub-headers are used to break up content and make it more readable. As they stand out from the text they should be used to engage and entice the reader. However, text strings contained within header tags are considered by the search engines as more important than standard text on a page and as such, you should also look to insert keywords where appropriate.
Optimised URL Structure
URLs are one of the first things a search bot “reads” on your site. Have you used keywords appropriately and does it look clean and tidy?
URLs should be clean, contain keywords and be short where possible to enhance their 'sharability' (on social media, via email, etc.), their effectiveness in SEO and their click-appeal in the SERPs.
Using the .htaccess file, re-write URLs so they follow the guidelines below:
- Try keeping important keywords nearer the start
- Add relevant keywords where appropriate so they stand out more in the SERPs (don’t spam!)
- Where possible, don't bury products in endless sub-folders
- Make URLs understandable by both humans and search engines
- Re-write URLs that contain dynamic, system-generated rubbish (e.g. "/product_details.jsp?PRODUCTID=0937489324345244324562&FOLDERID=9823764891236")
- Use hyphens rather than underscores to separate words – underscores look like spaces when a URL is underlined, and Google treats hyphens (but not underscores) as word separators
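The guidelines above can be captured in a small "slugify" routine when generating re-written URLs. A sketch (the function name and example titles are illustrative):

```python
import re

def slugify(title):
    """Turn a page/product title into a clean, hyphenated URL slug."""
    slug = title.lower()
    # replace any run of non-alphanumeric characters (spaces, punctuation,
    # underscores, accented characters, etc.) with a single hyphen
    slug = re.sub(r"[^a-z0-9]+", "-", slug)
    return slug.strip("-")

# e.g. /products/blue-widget-2015-model rather than
# /product_details.jsp?PRODUCTID=0937489324345244324562
```

Most CMSs have a setting or plugin that does exactly this; the point is to know what "clean" should look like so you can verify the output.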
Image Alt Tags
Google has made big leaps in “reading” images and identifying items and text within them but it doesn’t hurt to give them a helping hand. Also don’t forget what alt tags were originally intended for!
'Alternative text' or alt tags were introduced so that screen readers could "read" an image, or so the browser could display substitute text if for some reason the image didn't load. Within SEO, however, alt tags represent another instance where we can insert keyworded text. Every image should have an alt attribute assigned and, where appropriate, this should contain keywords and phrases that you are looking to rank for or obtain traffic through Image Search with.
Thin Content
Each page should have a unique purpose, a reason to exist. If it doesn't, it's simply there to manipulate the search engines, which is not going to do you any favours.
"Thin" content refers to pages that either lack any content or lack the kind of content that fulfils a purpose or offers value. Often these pages will be disregarded by the search engines, which is why it is important to ensure each and every page has a reason to exist, with useful content supporting its existence. For example, a product page should offer more than simply a line or two of description from the manufacturer. Consider including the following:
- Interesting insights from the retailer
- Visual and stimulating content such as multiple/interactive product views and videos
- Highlight relevant stats and facts to show how the product enriches your life
- Include reviews from happy customers and press mentions
- Demonstrate benefits, not just features
- State unique selling propositions – why should they buy this product…and why should they buy it from you?
- Don’t underestimate persuasive copywriting but…
Content Variety
Search engines primarily "read" text, though they are getting better at understanding video and image content. Additionally, do you cater for your audience's needs? What content types resonate with them? Rich, beautiful content is half the battle, as Rand Fishkin would say.
Content doesn't just mean words; content comes in many forms, and the more variations you use the more memorable your content becomes, because everyone consumes information in different ways. Consider supplementing your written content with the following:
- Stats/Data visualisation/Infographics
- Embedded Games/Songs
- Downloads (e.g. PDF)
Internal Linking & Anchor Text
Internal links help surface deeper content (or key products), so no matter how old or deep it is, if it is top notch make sure your visitors can find it with appropriate internal linking. Just make sure your anchor text begs to be clicked.
Linking to internal pages is important for several reasons:
- It helps conduct the flow of traffic through the site which improves on-page metrics
- It helps search engine crawlers find and index more pages
- It cascades link equity from more authoritative top-level pages to the deeper conversion and specific-content pages
- Don’t go mad on internal linking. Aim for no more than 100 per page and avoid linking to the same page multiple times
#Pro Tip: Don’t hide awesome content away. If you have produced a great article/blog post/video etc. link to it from a high-level page with an intriguing anchor text/image to help people find it!
- Optimised anchor text is no longer the most important factor for links (announced by Google back in April 2012) as they can determine the theme of destination page from other factors – Vary your anchor text to look more natural, with an emphasis on brand terms.
Pro Tip: Use intriguing language for anchor text to make visitors WANT to click on the link. E.g. instead of …[law jobs]… use a phrase to pique the reader’s interest such as …[earn upwards of £75,000 in these cool law vacancies]…
Flash
Flash is a no-no for so many reasons. Is your site using it to convey a message? Think of alternative ways to deliver that message.
The use of Flash poses several problems:
- Flash is not supported by Apple products (iPhone, iPad etc.)
- Search engines find it hard to read and index Flash content
- Some companies block access to Flash sites and elements as it can pose a security risk
- Flash sites look great but should be avoided if you want to rank well. Consider investing in re-development if necessary
- If you must use Flash elements, make sure non-compatible devices can display a HTML alternative
Schema / Microdata
Schema and microdata are playing an increasingly important part in today’s SEO. Have you marked-up all appropriate elements such as videos, addresses, reviews, products etc?
- Schema is supported by the major search engines and is becoming increasingly important in describing your various content/site elements
- Schema can help a site achieve rich snippets in the SERPs
- Important Schema types include Authorship, products, reviews and videos – see schema.org for details
- Implementing Schema can be tedious – schema-creator.org can generate the code for you which saves a lot of time
- Implementing Schema should be as fundamental as meta data
For an excellent resource on Schema, how you can implement it and how it can benefit your site check out SEO Gadget’s excellent Guide to Rich Snippets.
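Besides inline microdata, Schema.org vocabulary can also be expressed as JSON-LD, which is often easier to generate programmatically from your product database. A sketch in Python (the product details are entirely made up):

```python
import json

# Hypothetical product data - swap in fields from your own catalogue
product = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Blue Widget",
    "description": "A sturdy blue widget for everyday use.",
    "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": "4.6",
        "reviewCount": "89",
    },
}

# Embed this block in the page <head> or <body>
snippet = '<script type="application/ld+json">%s</script>' % json.dumps(product, indent=2)
```

Whichever format you use, run the result through Google's Structured Data Testing Tool (listed in the tools section below) to confirm it parses.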
Duplicate Content
When I say duplicate content, I'm not talking word-for-word; I'm talking about content that has no unique value – in as much as it is the same as a thousand other pieces of content out there on other sites, or even within your own site. Every page should have a real purpose; if not, get rid.
Pages containing content that can be found on multiple other sites and that offers no additional value can be ignored by Google, and site-wide duplicate content can even be penalised – particularly in the wake of the Panda algorithm update. eCommerce websites are particularly guilty of this, with each retailer offering up the same "cookie-cutter" manufacturer's product description. Your content pages are an opportunity to showcase your expertise in your products/service, establish the voice of your company and act as a virtual salesman on your behalf. Differentiate yourself from your competitors, give your customers the benefit of your knowledge and close more sales.
- Worried your site contains duplicate content? Copy and paste a chunk into Google and see how much of it returns in bold (remember, conjunctive words such as "and", "to", "of" and "in" will not be highlighted)
- Worried about people stealing your unique content? Employ tools such as Copyscape to guard your content.
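For a rough in-house check across your own pages, a shingle-based similarity score can flag near-duplicate copy before Google does. A simplistic sketch (three-word shingles with Jaccard similarity; dedicated tools are far more sophisticated):

```python
def shingles(text, n=3):
    """Return the set of n-word sequences ("shingles") in a text."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def similarity(a, b):
    """Jaccard similarity of two texts' shingle sets: 1.0 = identical wording."""
    sa, sb = shingles(a), shingles(b)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

desc = "A sturdy blue widget suitable for everyday use around the home"
```

Comparing every product description against the manufacturer's stock copy with a check like this quickly shows which pages are still "cookie-cutter".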
Duplicate Pages & URL Variations
Determining if your site has duplicate pages or multiple URL variations for the same page(s) can help you eradicate duplicate content problems that can lead to penalties.
Multiple variations of the same page with different URLs should be avoided because:
- It can trigger duplicate content penalties
- Inbound links spread across multiple pages dilutes the equity of any given page
- It can be confusing for the user
- Updates to one page may not update the other leading to conflicting information
Common examples can often be seen on the homepage, with variations such as:
- http://example.com
- http://www.example.com
- http://www.example.com/index.html
…all existing simultaneously.
Additionally, site/navigation restructures, product updates, URL optimisation and ineffective CMS handling can lead to page duplication (for example, an old product URL and a newly optimised URL both serving the same content).
Canonical tags, or preferably 301 redirects should be implemented to point users and bots to the correct page in order to capitalise on link equity and page authority.
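Before adding redirects, it helps to pin down what the single canonical URL for each page actually is. A simplistic normalisation sketch using Python's urllib (the rules here – strip www, collapse index pages, drop all query strings – are illustrative and too aggressive for many real sites):

```python
from urllib.parse import urlsplit, urlunsplit

def canonicalise(url):
    """Reduce common homepage URL variations to a single canonical form."""
    scheme, netloc, path, query, fragment = urlsplit(url)
    netloc = netloc.lower()
    if netloc.startswith("www."):
        netloc = netloc[4:]                       # strip the www. prefix
    if path in ("", "/index.html", "/index.php"):
        path = "/"                                # collapse index-page variants
    # dropping ALL query strings is too blunt for e-commerce sites - in
    # practice you would whitelist parameters that change the content
    return urlunsplit((scheme, netloc, path, "", ""))

variants = [
    "http://example.com",
    "http://www.example.com/",
    "http://www.example.com/index.html",
    "http://example.com/?utm_source=newsletter",
]
```

If a routine like this maps two live URLs to the same canonical form, one of them needs a canonical tag or 301 redirect.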
Remove W3C Errors
I've yet to come across a site that didn't have at least some coding errors. Conducting a quick check can highlight syntax errors letting your SEO efforts down.
Entering your site's URL into validator.w3.org will generate a list of validation errors and warnings based on the W3C's best practice guidelines. Most of these will require an expert level of understanding of web coding, but the service makes it as easy as possible to identify errors for a quick fix.
Broken Links
Broken links are frustrating to users and a confidence downer for Google. No one wants that, so fix any broken links.
There is debate over whether broken internal and external links actually hurt SEO directly, but they certainly hinder the user experience. Broken links are those that point to a page that no longer exists or has been moved, so the user hits a 404 error page. Oftentimes these 404 pages offer no guidance on how to get back on track or find the resource/page, which is incredibly frustrating. Use a broken link checker tool such as Xenu Link Sleuth (free) or Screaming Frog (free, limited) to find all instances of broken links. Both of these tools will crawl your site and identify broken links quickly, with information on which page each broken link exists, the anchor text used and the broken link's URL. Using this information you can quickly edit or rid the site of broken links.
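If you'd rather script a quick check yourself, extracting every link from a page and comparing it against your known URL set takes only a few lines. A toy sketch with Python's standard html.parser (the page snippet and known_pages set are made up; real crawlers like the tools above also follow links and check HTTP status codes):

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect the href of every <a> tag encountered."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href":
                    self.links.append(value)

# Hypothetical site: the internal pages that actually exist
known_pages = {"/", "/about", "/contact"}

parser = LinkExtractor()
parser.feed('<p><a href="/about">About us</a> and <a href="/old-page">an old page</a></p>')
broken = [link for link in parser.links if link not in known_pages]
```

Each entry in `broken` is a candidate for editing, removing or 301-redirecting.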
SEO Audit Tools
I want to leave you with the list of tools I use to help me conduct SEO audits. I have these saved in a folder on my favourites bar for easy access and get many of them running the moment I start an audit. I have categorised them based on how invaluable they are and have noted what exactly I use them for:
All The Time
- Who.is – Who is hosting the site and when does their hosting contract run out?
- BuiltWith – What technology is powering the site in question and is it appropriate? What tools are they using which we can leverage?
- Google’s Page Speed Insights (alternatives: Pingdom | GMetrix) – Figuring out if a site page is fast/slow and what can be done to speed it up
- Google’s Mobile-Friendly Testing tool – Does Google see the site as mobile friendly (especially for important pages)
- Google’s Structured Data Testing Tool – Figuring out what site elements are marked up with Schema (and comparing that to competitors)
- Uptime Robot – Understanding how reliable the current hosting is over time and whether site outages are common
- Webhosting Hero – Is the site hosted in the same country as its target audience (and is the site connected up to CDNs if it has international audiences)?
- Open Site Explorer
- W3C Markup Validation Service