Search Engine Optimisation (SEO) Agency UK

We’re Gorilla Marketing, and we make it simpler and quicker to reach your SEO goals. Get your free SEO marketing audit and take the guesswork out of your journey.

FREE SEO AUDIT

Who We Are:

Your Guide Through The Digital Jungle

Gorilla Marketing have an in-house team of experienced SEO & PPC experts who thrive on helping businesses improve their visibility in the search engines.

Beyond our services and the results that we achieve, we also pride ourselves on our communicative approach to project deliverables and on helping our clients understand the value that we provide.

Our Mission

To drive the growth and success of our clients’ businesses, keeping them at the centre of everything that we do.

Our Values

To ensure that every campaign that is entrusted to us is managed with integrity and loyalty, with every step measured and communicated to the best of our abilities. 

 

Our Clients

Gorilla Marketing predominantly work with established businesses with an annual turnover between £1m and £25m. We have achieved outstanding success across multiple sectors and industries.

Meet the Gorilla Marketing Troop

With over 30 trillion web pages in the Google index, getting your business found by potential customers can feel like being lost in the jungle. Meet the troop who thrive in that environment.

Kyle Clifford

Operations Director

Charlotte Woodend

Client Success

John Carey

SEO Manager

CJ Hamand

SEO Manager

David Galvin

SEO Manager

Liam Blackledge

SEO Manager

Paul Anderson

SEO Manager

Sam Etherington

SEO Manager

Jordan Bush

PPC Manager

David Lawes

PPC Manager

Alex Black

PPC Manager

Gemma Lutwyche

Head of Content

Kirsty Macdougall

Content Manager

Michelle Meyer

Content Manager

Jenny Oldham

Content Manager

Tracy Mitchell

Office Manager

What Is SEO / Search Engine Optimisation?

SEO stands for “search engine optimisation”. As the name suggests, it is the process of making improvements to a website, its content and its backlinks to improve visibility in the search engines. It stands to reason that the better positioned your website is in the SERPs (search engine results pages), the more likely you are to get the attention of a prospective customer. First, however, we need to understand the ‘how’ and ‘why’ of SEO before deciding whether it is the right strategy for you.

How Do Search Engines Work?

Mainstream search engines such as Google, Bing, Yahoo, Baidu & Yandex use bots to crawl web pages on the internet. These bots feed the information they find back to a central index, including the content, images, hyperlinks, videos and other page data.

This data is then sent to an algorithm, which analyses all of this information as well as numerous other ‘off site’ factors.

When a search query is made, the search engine consults the findings of this algorithm and presents the results in the order it deems most relevant. These findings are called the ‘search results’ or the SERP (search engine results page).

Search results for the phrase ‘SEO Manchester‘. Source – Google

Understanding The Primary Search Engine Results

Understanding the SERPs is the first step in deciding whether SEO is a viable marketing strategy for you. In the simplest terms, the search results on major search engines are typically broken into two primary categories, each with additional subcategories (depending on the nature of the search). The first primary search result type, and the most obvious one, is the natural result. Also called the organic results, this is where you will see all the web pages in the index that the search engine has determined are relevant to your query.

The other primary type of search result is the sponsored search result. Usually found at the top of the page, with a discreet indication that it is an advert, these web pages have paid to appear for the search query. The order of sponsored search results is often determined by factors such as what the website is willing to pay per click (PPC), the landing page quality score and the advert click-through rate (CTR).

Secondary Search Engine Results

The secondary search results display content that may answer a query more accurately than just a list of hyperlinks. These queries can be related to:

  • Images
  • Videos
  • Maps
  • Shopping
  • Flights
  • Finance
  • Books

 

Additionally, search engines are able to pull information from websites through schema markup to further enhance the SERPs and make them more relevant to the query.
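For example, marking up a page’s FAQs with schema.org structured data can let questions and answers appear directly beneath its listing. A minimal sketch in JSON-LD (the question and answer text here is purely illustrative):

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "FAQPage",
  "mainEntity": [{
    "@type": "Question",
    "name": "What is SEO?",
    "acceptedAnswer": {
      "@type": "Answer",
      "text": "SEO is the process of improving a website's visibility in the search engines."
    }
  }]
}
</script>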

Secondary organic search results for the phrase ‘SEO‘. Source – Google

Search Engine Optimisation & Why It's Important for Marketing

Search engine optimisation is the most effective type of marketing, bar none. Saying this, however, doesn’t detract from other viable channels such as television, radio and social media. What makes SEO unique is that it positions you in front of a customer at the exact moment they are looking for the product or service that you provide.

With this in mind, there are potential downsides to SEO that need to be considered before starting a campaign. Firstly, it needs to be understood that SEO is not an overnight solution to generating more sales. The process can take months or even years to generate a return on investment, although after about four months it should be clear if your strategy is working. 

The Three Core Pillars of SEO

Whichever type of SEO campaign you pursue – whether it be local, national, or international – you’ll have to acknowledge the three core pillars of SEO.

These pillars include on-page SEO (things like keyword density and headings), off-page SEO (which includes practices such as link building), and technical SEO (page load time, server optimisation, site architecture, and more). These core pillars apply to all search engines, although each search engine weights them differently.

Analysing the silo structure of the Gorilla Marketing website.

On-Page SEO & Content

On-page optimisation (also known as on-site SEO) is mainly related to the content you write and how this interacts with search engine web crawlers. However, it also concerns other elements of your web pages that you have control over, and that your readers see, like meta descriptions and title tags.

In a sense, on-page SEO is quite literally anything that appears on your site that can be improved to increase your rankings. For the most part, it is the elements of your web pages that your audience will come into contact with upon visiting your site. In contrast, off-site SEO relates to elements that promote your page externally, such as backlinks.

To give you a better idea of what on-page SEO relates to specifically, here are the main factors you should consider when organising your on-site optimisation: 

 

  • Keywords – The phrases that you would like your website to appear for in the search results.
  • Content Ratios – The ratio of content to code on a web page.
  • Page Title & Heading Structure – The headings on the page, outlining what the page is about.
  • Page Silos – The structure of the content on the website, further elaborating on a topic or area of expertise.
  • Internal Linking – Self-referencing other content on the website to further elaborate on a topic, often done between page silos.

Keywords

Keywords are the primary means of informing Google crawlers of your page’s relevancy to the topic. Although Google’s algorithm has progressed massively over the years and introduced many other ranking factors along the way, keywords still define relevancy.

Keywords are a means for search engine web crawlers to index web pages that are relevant to specific search queries. They could be viewed as a few words that describe the product you sell or the service you provide.

For example, if you have a page about SEO in Liverpool, you might target terms like ‘SEO Agency Liverpool‘, ‘SEO Company Liverpool‘ and ‘SEO Services Liverpool‘. These key phrases, and the frequency with which they are used, signal to Google what your page is about.

A keyword needs to be relevant to your business and also match words that web users commonly put into the search engines. To satisfy user search intent, the content on your page needs to match exactly what the keywords used suggest it does.

In years gone by, the power of on-page keywords led many content creators to keyword stuff, meaning they would overuse phrases in their copy with the hope that it would take the top ranking spot on the SERPs.

Over the years, Google has introduced updates to penalise and even ban websites that engage in keyword stuffing, and it is now an obsolete way to make your site rank.

You also have to think about your readers here. A page stuffed full of keywords is unlikely to read well, so they’ll probably bounce pretty quickly. In turn, this will cause Google to rank your site lower (if everyone is bouncing so quickly, there’s obviously little of value to readers). 

A new Google algorithm update – the Helpful Content Update – launched in July 2022 and prioritises readability more than ever. In essence, content written for search engines rather than human readers is far less likely to perform as well as natural, readable content. Long story short, keyword stuffing just isn’t worth it.

In summary, keywords need to be used smartly and sparingly to get the most from them, and that starts with an in-depth keyword research process. Specifically, you need to fit keywords into your pages in the following ways:

 

 

  • URL – If you can naturally fit your main keyword into your web page’s URL, definitely do it. Although this is not essential, it helps with your site’s visibility. However, if the URL looks clunky or reads unnaturally with the keyword included, take it out. Also, avoid stop words such as ‘and’, ‘the’ and ‘for’.
  • Title – Naturally embed your keyword somewhere in the title, ideally near the start. Google and other search engines are known to rank pages higher when they mention the keyword at the earliest possible chance.
  • Meta Description – Fit your keyword naturally into the meta description, as this forms part of what web users will see on search results pages. A search engine will often show keywords in meta descriptions in bold, which attracts users to your page. As always, it’s important to do this in a natural, non-spammy way.
  • Meta Title – Fitting a keyword into the meta title is a good way of conveying your page’s topic to the web crawlers. Like the meta description, it also gives web users a good idea of what your article covers. Using the main keyword in your meta title is one of Google’s best practices, so it’s definitely worth doing.
  • The First Paragraph – You need to convince the search engine bots from the get-go that your content is relevant, so include your main keyword somewhere in the first paragraph – for best results, within the first 100 words.
  • Alt Tags and Image File Names – Search engine crawlers cannot process the contents of an image, so they need descriptive image file names and alt tags to assess the relevancy of the image.
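To tie these placements together, here is a hedged sketch of a page optimised for the hypothetical keyword ‘SEO Liverpool’ (all names, copy and file names below are illustrative, not a live page):

<!-- URL with the keyword in the slug, no stop words: https://www.yourwebsite.com/seo-liverpool/ -->
<head>
  <!-- Meta title: main keyword near the start -->
  <title>SEO Liverpool | Your Agency Name</title>
  <!-- Meta description: keyword worked in naturally -->
  <meta name="description" content="Our SEO Liverpool team helps local businesses climb the search results.">
</head>
<body>
  <h1>SEO Liverpool</h1>
  <!-- First paragraph: keyword within the first 100 words -->
  <p>Looking for SEO in Liverpool? Our Liverpool SEO services are built around your business...</p>
  <!-- Descriptive image file name and alt tag -->
  <img src="seo-liverpool-campaign-results.jpg" alt="Results graph for an SEO campaign in Liverpool">
</body>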

 

When conducting keyword research, you may find that web users search for your services slightly differently from how you’d naturally write about them. Perhaps they use specific words that are outside of your normal vocabulary. In that case, you’d have to slightly tweak how you talk about your products to rank well in the SERPs.

Keyword research can bring up many variations of your main keyword that people are searching for, so including these in your content can broaden your reach. 

You can also choose different types of keywords to target a particular audience. For example, those looking to buy a product have a different intent than those simply looking for information. So, look for keywords with commercial phrases such as ‘buy’ or ‘for sale’ if you are trying to sell your product. For informational articles, look for question-based keywords like ‘how to’, ‘what is’, and so on. 

Types of Keywords

Broadly speaking, there are two types of keywords: short-tail and long-tail keywords. Short-tail keywords – sometimes referred to as ‘head terms’ – are broad topic keywords containing no more than three words. Short-tail keywords such as ‘travel insurance’ are the most competitive type of keyword, as they are searched for most frequently.

Long-tail keywords consist of phrases that are between three and five words. They are more specific phrases, such as ‘senior travel insurance’, and are used to target niche search results. Long-tail keywords are searched for less frequently than short-tail ones.

Keywords can further be categorised based on how they apply to a user’s search intentions. The main search intents include the following:

 

Informational – Informational keywords are question words. They are searched by web users who are looking to find a specific answer to a query – not to buy a product or sign up to your website.

Informational search queries normally start with ‘what’, ‘how’, or ‘why’. For example, ‘how to tie a tie’. If your company sells ties, this is definitely a blog post you’d want to cover. By being able to provide an answer for these terms, your site could also rank for informational searches.

Although this doesn’t directly lead to conversions, it does increase brand awareness. Once a web user recognises your brand as a useful source of information, they may turn to your site for products somewhere down the line.

Informational articles are essential for helping any site rank well. This type of content contributes to your E-A-T (Expertise, Authoritativeness and Trustworthiness), leading Google, and your readers, to turn to you as a trusted source of information.

 

Commercial – Commercial intent keywords can lead to more sales. Generally, these keywords are used by searchers to find out more information on a specific product. They may be looking for product reviews, top 10 lists, or product comparison articles. The search may look something like ‘best Chromebook 2022’ or ‘top 10 Chromebooks’.

These search types don’t necessarily mean that site conversions are guaranteed, but it does suggest that the web user has commercial intent. They are looking to find out which product to buy. By creating content that targets commercial keywords, you could set the site visitor up for conversion by recommending your product within the article.

 

Navigational – Navigational keywords relate to searches made by users who already have a result in mind. These searchers know the page they are looking for, such as YouTube, eBay, or Amazon.

Navigational keywords can also include such things as ‘directions to X’, ‘cost of X’, and ‘X prices’. These searches suggest that the web user is already familiar with the brand, but wants to find out something specific about it.

Unlike the other types of keywords, there is little you can do about navigational keywords from a webmaster’s point of view. Unless you are a big name brand already, there is little chance you’ll be able to achieve steady navigational keyword traffic.

However, as your brand grows more recognisable (thanks to a successful search engine optimisation campaign!) you may start to receive hits via navigational searches.

 

Transactional – These keywords are all about conversions. A transactional keyword means that the searcher is ready to buy the product, such as ‘Hire SEO Agency in Newcastle‘, ‘SEO Services in Newcastle‘, or ‘Newcastle SEO Expert‘. The web user may have already completed a commercial search and is now ready to take action on a website and buy, reserve, or subscribe.

Keyword Cannibalisation

Cannibalisation is when your website has two or more similar articles competing for the same keywords. It’s called ‘cannibalisation’ because it leads these web pages to compete with one another, harming the chances of either ranking.

Cannibalisation occurs when you produce two pages on your site that feature similar content, or you optimise them for the same keyword. Even if the keyword(s) are different, stark similarity can also lead to cannibalisation. For example, the keywords ‘comprehensive guide to SEO’ and ‘an explanation of SEO’, although clearly different, may still lead to cannibalisation.

It should definitely be avoided, as Google will only display 1-2 pages from each site in response to each search query. Don’t wrongly assume that the more you write on the same topic, the higher your website’s chances of ranking.

Not only does cannibalisation lead you to compete with yourself, but it also affects the quality of your backlinks and clickthrough rate. Cannibalisation dilutes the power of each, leading both of your pages to rank lower.

To judge whether your site is suffering from cannibalisation, simply type your domain name into Google, followed by the relevant keywords used. If you notice that both pages appear far down in the list of results, then your site is most likely being affected by cannibalisation.

To solve your cannibalisation, you have several options:

  • You could set the page you deem the most important as the canonical page (see the sketch after this list).
  • You could merge the two (or more) pages together into one article. If it’s possible to combine the two articles into one piece without reducing the overall quality of the content, then you should definitely do it.
  • You could outright delete the weaker of the two articles. If there’s no way of merging the two, you should remove one of the articles and focus on optimising the strongest piece further.
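For the first option, the canonical designation is a single tag placed in the <head> of the weaker page, pointing at the page you want to rank (the URL below is hypothetical):

<link rel="canonical" href="https://www.yourwebsite.com/seo-guide/">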

Content Ratios

Although it remains unclear how much your content-to-HTML ratio affects your ranking, it definitely impacts many aspects of user experience. Mainly, poor content ratios can affect page speed, both for your site users and for the search engine web crawlers.

Although it may not be an explicit ranking factor, you’ll find that the vast majority of high-ranking pages have much more content than they do HTML code.

Content ratios relate to how much HTML code you have on a web page compared to the level of content. HTML is the invisible code that supports the page. It instructs web browsers on how to display the page. The more HTML on the page, the more ‘bloated’ the website becomes, which slows things down for the user.

Having more text than code fulfils the point of any website – to provide information targeted at humans! Less code also means browsers spend less time figuring out how to display your content. Finally, with less HTML information to read, crawler bots can read your page quicker.

For each of the above reasons, it’s important to have a lot less HTML than content. Ideally, you should aim for a 70% text-to-HTML ratio. To find out your content/code ratios, you could use an online code-to-text ratio checker. Alternatively, you could consider the amount of code you have on your page and compare this with your content level manually.
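As a loose illustration (the markup is hypothetical), the same sentence can carry very different amounts of code – the lean version improves the text-to-HTML ratio:

<!-- Bloated: wrapper elements and inline styles add code without adding content -->
<div style="font-family:Arial;color:#333333"><div><span>We audit your on-page SEO.</span></div></div>

<!-- Lean: one semantic element, with styling kept in an external stylesheet -->
<p>We audit your on-page SEO.</p>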

Page and Heading Structure

The main component of on-page SEO is the actual content, but another influential component is how you present this content. Laying out your page with clear headings and structure is key to search engine optimisation. It firstly makes it more digestible for site visitors, and secondly makes it more comprehensible to search engine robots.

Subheadings give the reader a natural break between segments. This is enough time to process the information they’ve just read and also prepare for the section they’re about to read.

Generally, web users don’t want to read massive blocks of text with no sign of indent or topic breakdown. Whenever they happen upon a page that has no sense of structure, they’re more likely to bounce than to stay and decipher the text.

Web users are looking for quick answers that are easy to read, not long and complicated answers. When a search engine notices a high bounce rate, it will penalise the page by bringing its rankings down.

Each header acts as an anchor for the text that follows. Not only does this allow for comfortable reading, but each heading announces the slight shift in the topic to the bots.
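A hedged sketch of what a crawler-friendly heading hierarchy might look like (the headings themselves are hypothetical):

<h1>A Guide to Search Engine Optimisation</h1> <!-- one main heading per page -->
<h2>On-Page SEO</h2>                           <!-- a major section -->
<h3>Keywords</h3>                              <!-- a subsection within it -->
<h2>Off-Page SEO</h2>                          <!-- the next major section -->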

Page Silos

Siloing your website means organising your pages to make it simple for search engine crawl bots and web visitors to navigate your site. Also known as content clusters, silos are present on most websites today. This style of website architecture effectively groups together relevant pages and lists them in a logical order.

Rather than having your pages listed in a random order (known as a flat structure), page silos keep your content logically organised. Typically, each silo will begin with a landing page that sets the tone for the content that follows.

To create page silos, you need to first identify the main focuses of your website. If, for example, you are a search marketing company, your main topics would be SEO, PPC and so on. Each of these general topics will serve as head silos, under which you’ll file web pages that are relevant to them.

You should be able to file each piece of your content under one of these silos, either through direct relevance or loose relevance. If you find yourself with many web pages that you are unable to categorise, see if you can find a common theme between them to create a new silo. ‘About Us’ and ‘Contact Information’ webpages can be filed individually in the root folder.

Based on the search marketing example provided above, the address of each silo should look like this:

  • https://gorilla.marketing/seo/
  • https://gorilla.marketing/seo/sheffield/
  • https://gorilla.marketing/seo/hull/

Once you’ve successfully reorganised each existing page, continue to add fresh content to each silo over time. If you notice one silo to be particularly lacking, focus on creating new content for it. As you’ve already established the theme of the silo, creating more direct and relevant content should be easy.

Internal Linking

Once you’ve organised all of your content into appropriate silos, start looking for internal link opportunities between individual web pages. Although you’ve just taken the time to separate all of your content, it’s still important to maintain links between content found in separate silos and between every page of the same silo.

Linking every page within a silo to every other page in that same silo is a great way to build topical authority and suggest to Google that you are the expert in this field. 

Also, internal links take site visitors from one part of your website to another. They are different from external links as they stay exclusively within the confines of your site. For example, if in an article on SEO you happen to mention algorithm updates, you should provide an internal link to one of your web pages that is related to Google updates.

This way, you’re helping search engine bots understand the links between and within specific subjects. Bots crawl between web pages via links, meaning you’re speeding up the crawl process by providing links for the bot to follow. You’re also increasing the dwell time of your site visitors by advertising other content that they may also be interested in. 
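In practice, an internal link is an ordinary anchor element pointing at another page on your own domain. A minimal sketch, reusing one of the silo URLs above (the anchor text is hypothetical):

<!-- Internal link from a blog post into the SEO silo, with descriptive anchor text -->
<a href="https://gorilla.marketing/seo/">our guide to SEO services</a>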

Off-Page SEO (Backlinks)

Off-page SEO relates mostly to backlinks – but what is a backlink? A backlink is essentially a link from one webpage to another. You may have also heard them referred to as one-way links, incoming links, or inbound links.

Off-page SEO is mainly the act of getting as many reputable inbound links pointing towards your website as possible. But why? They hold a huge amount of importance in the world of SEO because search engines view them as votes of confidence from other websites.

If a respected website thinks highly enough of your content to reference it in their text, then search engines have strong reason to believe that your content is trustworthy. The more quality backlinks you attain, the higher you’ll appear in the Google search results.

Backlinks and off-site SEO are a lot harder to control than on-site SEO. However, getting the right backlinks defines where you appear in search rankings, as Google has confirmed that it remains the second most important ranking factor in 2022.

There are many different backlink types that you’ll encounter, and getting the balance right is crucial.

Press release backlinks

Press release backlinks have long been a source of controversy in the world of search engine optimisation. This doubt is mainly due to the fact they were widely misused during the black hat era of SEO. A press release, in a nutshell, is the creation of an article detailing a new product, service, or feature of a business that gets sent for publication on a PR site.

This became a popular off-page SEO strategy before search engines clamped down on spam the way they do today. Webmasters started sending off poorly written, keyword-heavy press releases to newswire services, with backlinks awkwardly fitted throughout. Sometimes, press releases were published that didn’t even announce anything new.

Once Google caught onto this, it classed press release backlinks as ‘black hat’ and made all associated backlinks ‘nofollow’ links. This meant that press release backlinks effectively stopped providing websites with authority and link power.

Although standard press releases no longer hold any direct SEO value, a well-written piece by a PR professional can still pay off through outreach: they may be able to turn a press release into something newsworthy that a professional journalist will want to pick up.

If the story gets picked up by a journalist, they are likely to include a follow link in their article, which could feature on a prominent news site.

Therefore, although the process is long-winded, and there’s no guarantee of attaining a backlink from it, carrying out press releases is still a valid off-page SEO approach.

Guest post backlinks

Guest post backlinks are one of the easiest backlink types to attain. Guest posting involves writing content for another website as a guest blogger, with a link to your website either at the start of the post (as part of an author bio) or placed throughout the copy.

The best place to embed backlinks in a guest post would be throughout the content itself. This way, you’ll be able to use targeted keywords as the anchor text. 

To get the most out of a guest post, you have to put every bit of effort into making it a valued piece of content. Readers aren’t going to be interested in other things you’ve written if the guest post falls flat. 

To find guest posting opportunities, you could simply search for ‘guest posting sites’ on Google. This will typically throw up the likes of Mashable and Hubspot, which are both valid guest posting opportunities. 

Alternatively, whenever you happen upon a website you like, look for such phrases as ‘write for us’, ‘contribute’, or even ‘guest post’. This way, you could confidently reach out to the webmaster and request the opportunity to write for them.

Reciprocal link building

Reciprocal links involve two websites that link to each other. This link mirroring may come about naturally, but is more likely the result of communication and agreement between the webmasters of each respective page. To get the most out of reciprocal links, the links need to lead from and to the same pages.

Reciprocal links – also known as network links – work well as part of an SEO strategy, both for you and the other site you are linking to, as it allows you to effectively trade traffic.

This partnership-forming approach to SEO was much more popular in the early 2000s than it is today, but it remains a valid way to drive traffic and gain link juice. The decline in popularity is down to Google clamping down on link spam to encourage more natural link building.

In some ways, focusing too much on reciprocal link building can cause you problems. If the majority of your backlinks are from reciprocal agreements, then this would look unnatural and forced in the eyes of web crawlers, especially if a number of these reciprocal links lead to low-quality, poorly optimised websites.

Therefore, we’d recommend spending time on other approaches to SEO and only initiating a reciprocal link agreement when it feels necessary and comes about organically.

Whenever the opportunity to create a reciprocal link with another site comes up, ask yourself the following:

  • Is the site good quality? If the website is poorly put together, unorganised, and sharing false information, then it’s highly unlikely that setting up link reciprocation will bring you any benefits. As a webmaster, you should be able to judge the quality of a website just by looking at it.
  • Does the website add value to your site? Relevancy should always be near the top of your considerations, as gaining backlinks from irrelevant sites can be very damaging to your credibility. Make sure the content and anchor text relate directly to your content.
  • Is the website a competitor? Just as you should never link to a competitor’s site, you should avoid setting up an alliance with your rival, too. The end goal of any SEO strategy is to beat the competition and reach the top of the SERPs. By establishing reciprocal links with the enemy, you could be helping them beat you to the top.

 

To give you an idea of who to build reciprocal links with, the best options include online directories and sites that specialise in creating statistics.

Iframe backlinks

An iframe (inline frame) is a type of backlink where part of the webpage appears on the linking site. The most common example of this is embedding a YouTube video within the HTML of the parent site. This video can be played on the parent site without the visitor having to visit the linked site.
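A typical YouTube embed looks like this (the video ID and dimensions are placeholders):

<!-- The video plays here on the parent page; the visitor never has to leave for youtube.com -->
<iframe src="https://www.youtube.com/embed/VIDEO_ID" width="560" height="315" allowfullscreen></iframe>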

Although most commonly used for videos, iframes can also be used to include analytics, graphs, interactive content, and advertisements.

The question of how much search engine optimisation iframe backlinks provide is up for debate. Although they are still technically backlinks, those who come across them don’t visit your site. With standard backlinks, the web user is forced to click on the link to view the content, but with iframe links, the content can be viewed on the parent site.

Therefore, the clickthrough rate associated with iframe links is generally smaller. However, a link to your website is still provided, which surely provides your site with some level of link juice, as long as the parent site is reputable. Google will still view this type of link as confirmation of your authority, too, so may rank you higher because of it.

Sponsored links

Sponsored links are a recent addition to Google’s approved link designations. Announced in September 2019, they allow websites to increase their brand awareness safely by buying links. Although paid links normally go against Google’s guidelines, this designation was introduced to help new companies get their name out there, find an audience, and boost conversions.

It’s important to note that sponsored links do not help with SEO the same way that other backlinks do. They are generally nofollow links, so they do not give your page any link juice, and nor do they act as a vote towards your site’s authority. By focusing your SEO campaign merely on sponsored links, you are very unlikely to increase your site rankings.

Sponsored links – sometimes referred to as paid listings, partner ads, and paid links – merely act as a way for new customers to find your website and become familiar with your content and services. A sponsored link will look something like this:

 

<a href="https://www.yourwebsite.com" rel="sponsored">example link</a>

 

There are two different types of sponsored links: sponsored placement and paid search.

Sponsored placement – This type of link may appear in the likes of a ‘top 10…’ product list as an affiliated link. You are essentially paying another website to promote your product as one of their top choices. They are useful for promoting your product and convincing web users of its value.

Paid search – Paid search is when you pay for a link to your site to appear in search results related to a chosen keyword. In Google Ads (formerly AdWords), companies can bid to rank for certain keywords related to their topic. The highest bidder doesn’t necessarily get the top spot – Google will also judge your site to see if it offers a good visitor experience.

The chosen ads will then appear for specific search results, based on keywords, location, and even device used. Paid searches will appear with a small box icon reading ‘AD’.

Web 2.0 backlinks

Web 2.0 relates to any online platform that encourages user-generated content. Although it includes social media, web 2.0 is not limited to Facebook, Twitter, and Instagram.

Web 2.0 tools include user-contribution-based encyclopaedias, such as Wikipedia, blogging sites like WordPress, Podcast Alley and other podcasting networks, reading logs like Goodreads, and learning technologies, such as Google Docs.

On all of these platforms, you could set up your account with the sole intent of promoting your website. Although it will be you that’s generating the website-referencing content, these web 2.0 tools have high domain authority. Creating web 2.0 backlinks can give your site significant amounts of link power.

These sites are also free to sign up for, making this backlinking opportunity highly cost-effective.

To get the most from web 2.0 backlinks, follow these steps:

  1. Sign up for a 2.0 site – Most web 2.0 tools are as simple to sign up for as a standard social media account. We’d recommend creating a new e-mail account just for your web 2.0 projects. This will help keep this part of your SEO strategy separate and organised.
  2. Optimise the URL – Although the URL will appear as a sub-domain of the 2.0 tool, it’s important to still aim to make this relevant to your product. Try to naturally include your primary keyword within. However, aim to make each URL for each web 2.0 tool you sign up for different. This will allow the backlinks to appear more natural.
  3. Cover the basics first – If you’re signing up for the likes of WordPress, aim to cover the basics first. By this, we mean your contact info, about, privacy policy, and terms of use. This increases your page’s overall trustworthiness.
  4. Write content – This part takes the most effort! You should aim to complete around 20 pieces of individual and original content for your new web 2.0 account. This content, of course, cannot be copied and pasted from your other site and needs to be 100% original to avoid plagiarism. You need to establish your new site as a useful resource.
  5. Embed backlinks – You should wait until you’ve published a decent amount of content and built up somewhat of an audience before you start sticking backlinks into the content. As always, it’s important to make your link-building process appear as natural as possible. Gaining many backlinks to a newly-published piece of content on your low-ranking site will always look suspicious; drip-feeding links over time will appear much more natural, even if you are gaining them through non-organic sources such as guest posts.
  6. Keep publishing content – The best approach to web 2.0 backlinks would be to steadily build up content and backlinks until Google indexes your site. After which, you should set up a posting schedule to post regularly. Avoid including backlinks to your site in every piece, as Google could interpret this as spam.

 

Overall, make sure that your web 2.0 content offers value to the reader. It’s worth including a few links to other websites, too, to make it appear more convincing to the audience as well as Google’s bots.

You should still put just as much effort into your web 2.0 content as you would for your site’s regular blog posts. The aim is to get your 2.0 content ranking and authority from Google’s perspective to increase the backlink power. As a rule of thumb, don’t post anything on your web 2.0 profiles that you wouldn’t on your website.

Bookmarking websites

Social bookmarking sites include the likes of Reddit, StumbleUpon, Digg, and Pinterest. They’re referred to as social bookmarking sites because users link to, or ‘bookmark’, specific pages of the internet within these sites, with the intent of sharing what they’ve found with others.

In terms of SEO, you could use these social bookmarking sites as a vehicle to get your website indexed while granting you several additional benefits along the way. Search engines regularly crawl the major social bookmarking sites – particularly the likes of Reddit and Pinterest – and follow the links they find along the way.  

Not only does it help with indexing, but most search engines count social bookmarking as a regular backlink. Furthermore, getting your site referenced on the likes of Pinterest is an easy way to drive traffic to your site.

Social media backlinks

There’s a common misconception that social media backlinks don’t count for anything. Whenever your site organically gets mentioned on the likes of Facebook, Twitter, Instagram, or LinkedIn, it counts as a backlink, just as it would on any other website. Social media platforms are websites after all!

There are many ways that you can submit a link to your website on your social media accounts to act as a backlink. On Facebook or Twitter, for example, you could add a backlink to your bio section. We’d recommend making this section of your bio public so that anyone who happens upon your profile can see this link and visit your website.

Whenever you post a status, tweet, or share something on one of your social media accounts, you could embed a link back to your website within the post. Additionally, whenever you post a video on the likes of LinkedIn, be sure to add a link to your site within the description.

Lastly, when setting up a profile on the likes of Pinterest or Instagram, there’s the option to add a profile link. This appears separate from your bio and sticks out to site visitors.

Local citation backlinks

Local citation backlinks are essentially backlinks that provide specific information regarding your location. They arguably play the most important role in any local SEO campaign, as they help web users in your area find your business. The data displayed in a local citation backlink is your NAP info – an acronym for Name, Address, and Phone number.

Besides NAP data, local citation backlinks can also include the following:

  • Business opening hours
  • Images (which can be a massive draw factor for web searchers)
  • E-mail address or social media account links
  • A short business description
  • Driving directions

 

A direct link to your website is also included. This type of local citation backlink is, of course, the most valuable for SEO. By getting your address information out there, you’ll be able to appear in location-based search queries.

For example, if you own a coffee shop in X location and somebody types something like ‘best coffee shops in X’ or ‘coffee shops near me’, your business will likely show up – provided you have enough local citation backlinks.

There are two main types of local citation backlinks: structured citations and unstructured citations.

  • Structured citations – A structured citation appears as an address normally would: starting with the street address, then the town, region, and postcode, each on its own line. Below this, you’ll likely find the phone number, web address, and map link (see the sketch after this list).
  • Unstructured citations – Unstructured citations are mentions of a business address found within a body of text. These are normally found in blog posts and don’t tend to provide as much information as structured citations.
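A structured citation might be marked up like this (the NAP details below are entirely hypothetical):

<address>
  Gorilla Marketing<br>
  123 Example Street<br>
  Manchester<br>
  M1 2AB<br>
  Tel: 0161 000 0000
</address>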

 

To gain quality local citation backlinks, you need to submit your business’s NAP data to several different data aggregators. For UK-based businesses, you need to get your site listed on such directories as Yelp, Yell, 192.com, FreeIndex, Cylex UK, UK Small Business Directory, Bing Places, and, of course, Google My Business.

It is free to get your business listed by many of the above directories. Additionally, you should make sure your NAP data is accurate on your social media accounts, as well as on map applications such as Google Maps.

Unlike other backlink types, collecting local citation backlinks for your site is not only about boosting your webpage authority. Although it does boost your trustworthiness, local citations primarily work to convince Google that your business actually exists.

If Google and other search engines can find the same NAP data linked to your business across various websites, then they can trust that your business exists. The more local citation backlinks a search engine can find, the more likely it is to promote a website for location-based searches.

Image link building

Image link building is the process of creating quality images for the web that other sites will likely use in their content. Every time another site uses one of your images, a backlink is established to your site. A lot of start-up companies may not have the money or resources to create their own graphics, so they tap images from other sites for the time being.

Image link building is one of the most complicated backlink tactics to carry out. Not only do you have to create your own images, but they also have to be of a good enough quality for other sites to want to use them. However, it is a link-building approach that offers a huge payoff. Once your image gets used once, it will likely get used a multitude of times.

Once you’ve created a graphic, using it for search engine optimisation is easy. All you have to do is create an original URL for the image, which can be done on imgur.com. This will give your image an original title.

Besides images, there are several types of graphics you could produce to aid in creating a backlink profile. These include:

  • Product photos – If you’re selling a product that other websites stock, producing high-res images is definitely worth doing to secure backlinks.
  • Infographics – Making tables, charts, and other informational graphs that display up-to-date and topical information is invaluable for a website’s backlink campaign. You may regularly have to create new graphs (it’s rare to find a graph that remains evergreen) but the number of attainable backlinks definitely makes it worthwhile.
  • Maps – Maps displaying specific information may take a lot of time to create, but are something that other sites will definitely want to use.

 

Honest sites will give your website attribution for using the image, but sometimes websites may only reference the image URL, with no mention of your website. This is not good for your search engine optimisation, as it provides no way for web users to find your website and gives you no link authority.

Whenever you come across unauthorised uses of your image, you should reach out to the website and request a proper citation. You could monitor the usage of your image by reverse image searching.

This can be done by uploading the image to Google or by pasting its URL into the search bar. Take time to sift through the sites that use your graphic, ensuring that each is providing the correct citation.

Backlink Velocity

Backlink velocity relates to the speed at which backlinks that lead to your site are created. Typically, search engines measure backlink velocity based on the number of links you attain per month. Common sense may lead you to believe that a higher backlink velocity means higher and quickly attained rankings, but this is not always the case.

Building backlinks at too fast a rate is generally treated with suspicion by search engines, particularly if your website’s backlink velocity picks up suddenly – like going from 100 backlinks one day to 1,000 the next. This looks like you’ve implemented some form of shady SEO technique, such as buying backlinks in bulk.

To maintain the trust of search engines, your backlink velocity needs to increase naturally. Acquiring roughly the same number of backlinks each week makes your site look a lot more trustworthy. A slow increase – for example, going from 10 backlinks acquired one week to 15 the next – is also a believable velocity in the eyes of the search engine.

Anchor Text Ratios

Anchor text is the text featured in an article that hyperlinks are attached to – the clickable, blue, and underlined part of the piece. It is used for both external and internal links.

The words you choose have an impact on the power of the link. They explain to both the bots and the site visitors the hyperlink’s relevance to the topic at hand. Anchor text can be viewed as the context for the link.

Many rookie content creators underestimate the power of the anchor text and choose words at random, which limits their chances of getting into the SERPs. There are several different anchor text types, and getting the ratios right is key to getting your content ranking.

Before getting into ratios, it would be valuable to identify the main anchor text types. There are many types of basic anchor texts that you’ll encounter regularly.

 

Exact match keyword anchor text

  • Keyword – This anchor text type matches one of your main keywords or phrases exactly. Exact matches hold the most anchor text power. An exact match keyword might be ‘PPC advertising’.

 

Key phrases mixed into anchor text

Key phrases mixed into anchor text refer to any hyperlink that features all or part of your keyword phrase, mixed in with other words.

  • Partial keyword – Partial keywords feature only part of the full phrase – for example, only 1-2 words of the full keyword. An example would be ‘PPC strategies’, part of the full keyword ‘PPC strategies for Amazon sellers’.
  • Page title – This anchor text uses the official title of the article referenced, word-for-word. This could be an exact match keyword if your title features your main keyword, which it should. For example, ‘a complete guide to PPC advertising’.
  • Keyword plus – A keyword plus anchor text features one of your main keywords plus additional words. These additional words may or may not be secondary keywords. For example, ‘PPC advertising techniques’.
  • The brand plus keyword – This keyword type features both the name of your brand plus a keyword. For example, ‘Gorilla Marketing PPC advertising’.

 

Brand, URL, and natural anchor text

This relates to anchor texts that feature reference to your brand name.

  • Brand – Brand anchor texts fit the hyperlink on the name of your brand, for example ‘Gorilla Marketing’.
  • Full URL – One of the less attractive approaches to anchor text, full URL hyperlinks literally contain the full address of the page. This can appear very clunky, so it is not the most common way to link to a page. A full URL would be something like ‘https://gorilla.marketing/seo/’.
  • Home page address – This type of anchor text features the URL for the homepage of the referenced site. For example, yoursite.com. It differs from full URL anchor text in that the latter references the address of a specific page, as opposed to just the home page. Using our example above, we’d get ‘gorilla.marketing’.
  • Generic – This anchor text type doesn’t mention the brand name or any keyword phrases. Perhaps the most common examples would be ‘click here’ or ‘learn more’. Generic anchor texts seemingly feature no reference to the main keywords of the article linked. They are natural in the sense that they don’t appear forced for SEO purposes in the eyes of web crawlers.
  • URL with www. – Similar to full URL, this anchor text type starts with ‘www.’ instead of ‘http://’.
  • Image – The anchor is an image instead of text, with no alt text attached for crawlers to read.

 

In order to succeed at SEO, you need to get the ratios between these anchor text types accurate. The ratios differ depending on what page of your site these backlinks lead to.

 

Homepage anchor text ratios

Homepages are all about boosting your brand and gaining trust from both site visitors and browser crawl bots. Making sure that your homepage anchor texts reflect this purpose is important.

Given that your homepage is unlikely to answer specific search queries, it doesn’t have to be as keyword-heavy as your inner pages. This is also reflected in the anchor texts.

The ratios for backlinks that lead to your homepage should be close to the following:

  • 85% of your backlink anchor texts should fall into the ‘brand, URL, and natural anchor text’ category.
  • 10% of your backlink anchor texts should fall into the ‘Key phrases mixed into anchor text’ category.
  • 5% of your backlink anchor texts should fall into the ‘exact match keyword anchor text’ category.

 

Inner page anchor text ratios

Most of your SEO campaign involves the inner pages of your site. These are the pages that should be keyword-rich and aim to answer specific search queries. For this reason, your inner page anchor text ratios look very different to those of your homepage.

They should predominantly feature keywords mixed with other non-keywords. Receiving backlinks with anchor texts that only feature your main keywords will be flagged as suspicious by Google. It is highly unlikely that every website that provides a backlink to your website will naturally use your keyword.

Backlink inner page anchor text ratios should be close to the following:

  • 35-45% of backlink anchor texts should fit into the ‘brand, URL, and natural anchor text’ category.
  • 50-60% of backlink anchor texts should fall into the ‘key phrases mixed into anchor text’ category.
  • No more than 10% of backlink anchor texts should fall into the ‘exact match keyword anchor text’ category.

Nofollow / Follow Links

Nofollow and follow links both bring your page traffic but differ in terms of SEO. Follow links – or dofollow links, as they are sometimes referred to – will help you climb up the search results, while nofollow links don’t.

 

Nofollow

Nofollow links, first introduced by Google in 2005, appear with rel="nofollow" in the HTML tag. Nofollow links can still be clicked on by web users, meaning that they still bring traffic to your site. Despite their HTML, nofollow links look exactly the same on the page as dofollow links do.

However, this HTML attribute defines how Google and other search engines interpret the link, telling web crawlers not to follow it. This means that nofollow links don’t help with indexing or pass the link juice that dofollow links do.
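Reusing the earlier example link, the nofollow version looks like this:

<a href="https://www.yourwebsite.com" rel="nofollow">example link</a>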

This HTML tag was originally introduced to curb comment spam. Before, webmasters could fill the comment sections found on other websites and blogs with links back to their own websites. Through this, they could get pages indexed by Google and gain some ranking power.

After the link type was introduced, all comment links automatically became nofollows.

Given that they don’t really give you any SEO power, you’re probably wondering if it’s worth including nofollow links in your SEO strategy. As long as you are using nofollow links responsibly, they cannot harm your site. If you’re looking to merely drive traffic, then strategically using nofollow links could help promote your content.

You should still avoid spamming comment sections with nofollow links. Although search engine bots ignore nofollow links, spam is not appreciated. Instances of it will likely result in your site getting penalised.

 

Dofollow

Dofollow links are backlinks that Google’s crawlers follow whenever they encounter them. A dofollow link can help your page get ranked in the SERPs. This link type works as a sign of approval from another site. If this other site is reputable, the dofollow link will bring your page lots of link juice and bump up its position in the search results.

On top of this, dofollow links also bring you an audience. Therefore, these links are of greater value in terms of search engine optimisation.

Technical SEO

Technical SEO refers to the technical aspects of your web page and the improvements that can be made. Effectively a subcategory of on-page SEO, technical SEO relates to such ranking factors as page speed, website response codes, and the mobile-friendliness of your site pages.

Getting your technical SEO right is pivotal to the success of your webpage. For search engines to effectively crawl your website, your web pages first need to be working and technically up-to-date. You want to make it as easy as possible for Google and Bing to assess the relevancy of your site, and you can do this by keeping the technical side of your site in good order.

Unlike off-page SEO and other types of on-page SEO, technical SEO improves user experience for site visitors. First and foremost, getting your website to work well should be for the sake of your users. Fast load times and easy accessibility on mobile devices will encourage users to return to your site and increase your conversion rates.

If Google recognises the technical soundness of your web page, it will want to recommend it to web users – and it does this by ranking your page higher. Here are the main technical SEO ranking factors you need to consider:

Website Response Codes

Making sure that your website’s response codes – sometimes referred to as ‘header response codes’, ‘hypertext transfer protocol status codes’ or simply ‘HTTP status codes’ – are accurate is an essential, and very technical, aspect of SEO. To check the response codes of your site’s URLs, you could use a server header checker.

After you’ve run your site through a website response checker, all working pages should be returning a 200 code, which means ‘OK’. At the same time, all pages on your site that no longer exist should be returning a 404 code, which means ‘Page Not Found’.
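For instance, the status line a header checker reads back might look like this (illustrative output):

HTTP/1.1 200 OK          (a working page)
HTTP/1.1 404 Not Found   (a page that no longer exists)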

There are several types of code that you could get back from a server header checker. These codes include:

  • 1xx codes – Codes that start with 1 are informational and are generally irrelevant to SEO.
  • 2xx codes – Codes that start with 2 are ‘Success Status Codes’, and mean that the page is loading and working fine from the site visitor’s point of view.
  • 3xx codes – ‘Redirection Status Codes’ start with a 3. These codes communicate to Google that a page features a redirection, and also whether this redirection is temporary or permanent.
  • 4xx codes – These codes are known as ‘Client Error Status Codes’ and are displayed whenever a webpage is not loading properly or not loading at all.
  • 5xx codes – Codes that start with a 5 are titled ‘Server Error Status Codes’ and are used to communicate that the page is experiencing issues at the server level.

 

Making sure that all these codes are accurate is necessary for SEO, as an inaccurate code will be treated with suspicion by Google. If a page is fully functioning but returns a 404 code, Google may assume that the page is duplicated and penalise your site for it. Inaccurate codes could also lead Google to believe that the rest of your site is broken and therefore unrankable.

Robots.txt

Including a robots.txt file at the root of your website helps search engine bots when crawling your site. If there are pages of your website that you don’t want to rank, such as a policy page, then disallowing them in robots.txt will inform the robots not to analyse the webpages in question.
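A minimal robots.txt for that purpose might look like this (the disallowed path is hypothetical):

User-agent: *
Disallow: /privacy-policy/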

There are many reasons why writing your robots.txt rules wisely is useful for SEO, including that it focuses crawlers only on the webpages that matter.

As a robot crawls over your website, it only has a limited capacity for how much content it can analyse, or how much time it is willing to spend on your site. This ‘crawl budget’ is based on your site’s scale and reputation. By using robots.txt rules, you are managing your crawl budget wisely and not wasting the robot’s energies.

It also means that you can stop crawlers from visiting pages that you have not yet fully optimised, or that have SEO-related issues that may impact your site’s overall reputation. You could disallow a page while you fix its SEO problems and then remove the rule once you’ve readied the content.

It is important to note that blocking a page in robots.txt does not stop it from being indexed. This means that the page in question could still appear in search results. However, the search engine won’t be able to display meta descriptions or any information about what’s on the page, as it does not have access to the actual content.

Page Speed

Internet users don’t have the patience for websites that lag, and neither do search engines. Google knows that a web page that takes longer than three seconds to load has a significant impact on user experience.

Search engines are highly unlikely to rank a page that loads particularly slowly. Both Google and Bing treat page speed as a direct ranking factor. Besides being massively impactful on how well your site ranks, page speed is the factor that most impacts user experience. Slow load times lead to high bounce rates and lower dwell times.

Page load time can be affected by several factors, including unoptimised or bulky code, caching issues, large media files, and render-blocking scripts.

Given the increasing focus on page load times, Google has created its PageSpeed Insights tool to help you gauge your page loading times to the millisecond. Specifically, it looks at such factors as Time to First Byte (TTFB) and First Input Delay (FID).

If you find your pages to be lagging, you could speed up your website by carrying out the following:

  • Reduce HTTP requests by keeping plugins and scripts to a minimum
  • Opt for fast hosting
  • Choose a fast domain name system (DNS) provider
  • Keep image file sizes small, but don’t pixelate
  • Use tools like GNU Gzip to compress your web pages (see the sketch after this list)
  • Combine your CSS (Cascading Style Sheets) into a single external stylesheet, rather than using multiple files or inline CSS
  • Shorten your site’s code by removing unnecessary spaces, indentations, or line breaks from your CSS, HTML, and JavaScript. For more information on this, check out Google’s Minify Resources page.
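How you enable compression depends on your server. As a minimal sketch, assuming an Apache server with the mod_deflate module enabled, the following .htaccess rules gzip the most common text-based file types:

<IfModule mod_deflate.c>
  AddOutputFilterByType DEFLATE text/html text/css application/javascript
</IfModule>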

Sitemaps

Creating a sitemap for your website helps the search engine bots navigate around your site. Sitemaps come in the form of an Extensible Markup Language (XML) file and are essentially a long list of all the URLs on your website.

This map is generally categorised into subcategories such as pages, tags and posts. Valuable data such as ‘last edit date’ and image info can also be found in a sitemap.

A sitemap is an easy way for a webmaster to inform Google and Bing about the sections and layout of the site, and to ensure that nothing gets left out of the crawl. It is not an essential part of SEO; webmasters who are confident in the comprehensible layout of their website often forgo this stage. However, creating an accurate sitemap won’t do your site any harm.

Your sitemap should only feature web pages that you deem to be important. When creating a sitemap, you don’t have to include pages that you have disallowed in robots.txt. You also don’t have to include any pages you don’t think are useful to the crawl, even if they aren’t disallowed.

To create an XML sitemap, there are a number of online tools you could use, including All in One SEO.
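For illustration, a minimal XML sitemap listing a single page (the URL and date are placeholders) looks like this:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://yourwebsite.com/</loc>
    <lastmod>2022-06-01</lastmod>
  </url>
</urlset>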

Images

Not only can you optimise your text content so that your webpages appear in search results – you can optimise your images, too. Images get indexed during the crawling stage, so it’s important to make the data attached to them as descriptive as possible. This helps search engine bots file your images more accurately and better understand what your website is about.

An alt attribute – basically a short description of what the image shows – can be added to each image. This helps search engine bots out, but it’s also beneficial for your site’s users. Connection issues or deleted files could mean that your visitors don’t get to see the images provided, so a short description will give them an idea of what’s missing.
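As a simple illustration – the file name and wording here are hypothetical – an image with a descriptive alt attribute looks like this:

<img src="/images/blue-trail-running-shoes.jpg" alt="Pair of blue trail running shoes on a muddy woodland path" />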

In the file name, alt text, and caption, you can be clever with keywords and naturally include a few in your description. Just like with standard content, you should avoid keyword stuffing, and try to be as natural as possible with your image descriptions. Still, you should aim to provide as much information as possible.

By accurately describing your images and naturally including keywords, your images will more likely appear on image search, increasing your overall traffic.

Redirects

Redirects can be useful in many instances, including:

  • Whenever you find a broken URL
  • When you move your page to a new location
  • If you’re rebranding your URL
  • When you otherwise need to delete a page (if, for example, you no longer sell the product that formerly occupied the page, or you’ve changed the date in the URL)

 

When you’re deleting or removing the content found on a page, you need to provide a redirect in order to avoid creating multiple dead ends on your website.

The link to the page still exists and will remain indexed by Google. It may be included as a backlink on other websites or may have been posted on social media in the past. For this reason, web users may still land on the former page.

In order to keep your site optimised, you need to put in the redirect correctly. In most cases, you’ll need to use a 301 redirect. This code tells Google that the redirect link put in place is permanent, and, as the webmaster, you have no intention of reactivating the page or reversing the changes made.

When Google reads a 301 code, it will shift all the link power and SEO to the new page, meaning that the new page will take the ranking place of the former.

If you’re only planning on redirecting traffic temporarily, and you have plans for the former page in the future, then you should use a 302 code. This code represents a temporary redirection and will tell Google only to move some of the link power to the page it redirects to.
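How you implement a redirect depends on your server or CMS. As a minimal sketch, assuming an Apache server and hypothetical paths, the .htaccess rules for a permanent and a temporary redirect look like this:

Redirect 301 /old-product/ https://yourwebsite.com/new-product/
Redirect 302 /summer-sale/ https://yourwebsite.com/offers/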

Hreflang Tags

Hreflang tags are an essential part of technical SEO for websites that are available in multiple languages or that target various countries around the world. They help search engines index the correct version of each page and ensure that visitors land on the version written for them.

The aim of adding hreflang tags to your pages is to make the search engine understand that there are similarities between the two pages, but that they are not duplicates.

Hreflang tags also help you avoid being penalised for duplication if you have both a British English and an American English version of the same page. Although there are obviously differences between the two variants, large stretches of text may end up being identical, simply because most words are spelt the same in both.

Hreflang tags should look like this:

<link rel="alternate" href="https://yourwebsite.com" hreflang="en-gb" />

This example tells Google that the language found on the page will be British English. One mistake commonly made by people inserting hreflang tags is to put ‘uk’ instead of ‘gb’ – the correct country code for the United Kingdom is ‘gb’. Hreflang tags can be added to the HTTP response header, the HTML <head>, or your XML sitemap.
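Note that hreflang tags work as a set: each language version of a page should reference itself and every alternative. As a minimal sketch with hypothetical URLs, a page with British and American English versions would carry:

<link rel="alternate" href="https://yourwebsite.com/en-gb/" hreflang="en-gb" />
<link rel="alternate" href="https://yourwebsite.com/en-us/" hreflang="en-us" />
<link rel="alternate" href="https://yourwebsite.com/" hreflang="x-default" />

The x-default entry tells the search engine which version to show searchers who match neither locale.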

Canonical URLs

If you’ve got two pages with similar content, you could make one of the pages the ‘canonical URL’. Search engines can penalise websites for duplication, even when it’s just between two pages. Similar content found on another page will confuse the search engine during the crawling phase and impact how well each page can rank.

Of course, you should make each page of your website unique from the other in order to increase the ranking chances of each. Multiple instances of duplication, however small, will lead Google to consider your content to be of a low quality.

Even if you haven’t copied and pasted content onto a different article, your website could still experience duplication issues. This can be caused by the general layout of your site or another technical issue.

You can check for duplication issues by running your site through Screaming Frog SEO Spider or Moz Analytics. Such tools will be able to identify hidden instances of duplication.

Whenever you do find an instance, you could set the most important webpage as the canonical URL. This way, Google will know which one to focus on, and no longer consider the similarities as an instance of duplication.

To set a page as canonical, you could add a rel="canonical" link tag to the <head> of each duplicate page, pointing at the preferred URL. Alternatively, you could signal the canonical page by 301 redirecting visitors towards it whenever they visit one of the similar webpages, or by providing internal links to it throughout the other articles.
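As a minimal sketch – the URL is a placeholder – the tag sits in the <head> of each duplicate page and points at the preferred version:

<link rel="canonical" href="https://yourwebsite.com/preferred-page/" />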

When constructing a sitemap, you should only include canonical webpages, so that the Google crawlers only visit these pages. Even if you haven’t explicitly set one of the included URLs as canonical, the search engine bots will treat every URL in the sitemap as a proposed canonical page.

Schema

Schema mark-up essentially creates a more informative micro-description – known as a rich result – that appears in the search results. Interestingly, schema is not actually considered to be a ranking factor by the main search engines. However, it is still likely to increase your clickthrough rate by making your listing more attractive, especially if the competition doesn’t include schema.

Schema structured data most commonly comes in the form of a star rating, which is particularly helpful if your website is offering a product or service. In this case, even if your competition is also using schema, then you can get the upper hand if you have a higher star rating than them.

Schema mark-up can also come in the form of recipes or instructions, which is helpful if you run a cooking or DIY website. It can also come in the form of detailed answers. This proves to web users that your site has the knowledge they’re looking for without even clicking on your website.

Schema mark-up not only makes your web pages more technically attractive to web users but also helps search engines better comprehend what it is you are providing.

This is particularly important if the main keyword you use is a homonym, for example, ‘glasses’. This word could imply either drinking glasses or eyewear. Schema can help differentiate between such instances.

The best way to add schema mark-up to your website is through Google’s Structured Data Mark-up Helper. Here, you can select the type of data you want to show in the schema, for example, your business or product. It’ll then ask you to paste a link to the relevant article and select the relevant information you want the schema to show.
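You can also write the mark-up by hand, most commonly as JSON-LD placed in the page’s <head>. Below is a minimal sketch for a hypothetical product with a star rating; the names and figures are placeholders:

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Trail Running Shoe",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.6",
    "reviewCount": "128"
  }
}
</script>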

The best part of schema is that there is a common language amongst all the main search engines. That means if you can get your web page showing schema on Google, then you can also get it to appear on Bing, Yahoo!, and other search engines.

Mobile-friendliness

As of 2021, 64% of all search queries were made on mobile devices, according to Sistrix. The way mobile users phrase their searches also shapes which keywords are worth targeting. In some instances, website traffic will be made up solely of mobile users. Failing to make your site mobile-friendly therefore means losing out on a huge part of the online market.

To gauge how much of your audience are mobile users, check out the Audience » Mobile section of Google Analytics. Here, you’ll find a percentage breakdown of your mobile, tablet, and desktop users. You can also adjust the date range to see how much your mobile audience has grown over the space of a year.

You need to make all of the features of your site – navigation, drop-down menus, page load times, layout, etc. – work to the same level across all devices. Upon entering a site that is designed exclusively for desktops, the limitations are immediately obvious to the user. Page proportions will appear off and navigation will feel clunky, leading them to bounce.

Not only is making your site mobile-friendly essential to establishing a dedicated userbase, but it’s also been part of Google’s ranking factors since 2015. Additionally, Google doesn’t rank the different versions (desktop, mobile, etc) of a web page separately, but in fact, has a mobile-first index. This means they’ll primarily index the mobile versions of web pages.

If you’d like to find out exactly how mobile-friendly your site is, you could paste its URL into Google’s Mobile-Friendly Test. Once it has analysed your site, this tool will give you a simple answer as to whether or not your site is mobile-friendly. You could also try out Google’s mobile speed test to find out more about the performance level of your site.

If you find that your site doesn’t perform well on mobile devices, try carrying out the following steps:

  • Compress your images, allow browser caching, and switch to another server in order to speed up your mobile website.
  • Make CSS and images lighter, reduce the size of your content, and use a large and attractive font in order to make your articles readable on mobile devices.
  • Don’t use Flash, and remove pop-ups from your site.
  • Add a viewport meta tag so that the browser can adjust the scale of your website according to the device it’s accessed on (see the sketch after this list).
  • Keep your webpage layout minimal by only including a few essential functions on the homepage. Declutter as necessary and aim to make your interface as clean and organised as possible. 
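The viewport tag mentioned above is a single line placed in the page’s <head>:

<meta name="viewport" content="width=device-width, initial-scale=1" />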

SEO Campaign Types

You can shape your SEO campaign to target local, national, and international customers – each of which is considered a different SEO campaign type. The method you need to implement to target each differs massively – with some requiring a heavy focus on geo-tagging, while others are more keyword orientated. 

The title of each campaign type may make its purpose obvious, but the titles alone don’t tell you what each entails. Let’s take a closer look at local, national, international, and, as a special bonus, e-commerce SEO.

Local SEO

The intent of local SEO is to optimise a website so that it appears in local search results. The types of search terms that local SEO aims to answer include things like ‘local hairdressers’, ‘barbers near me’, or ‘hairdressers Manchester’.

Local SEO is more often than not the easiest campaign type, as competition is minimal in comparison to international SEO. 

It’s unclear what defines the local SEO algorithm, but it most likely has a lot to do with proximity, NAP citations, and review signalling. 

Proximity relates to how close your business is to the person searching. You can’t optimise proximity itself; the closest you can get is registering your business on Google My Business so that Google knows exactly where you are.

NAP citations are how many sources detail your name, address, and phone number, while review signalling is how well reviewed you are online.

National SEO

National SEO is one of the most common SEO campaign types, given how many webmasters run a country-specific blog or own an e-commerce store that delivers within a specific country. 

For this reason, national SEO is far more competitive than local. It involves basic SEO tactics, including keyword research and backlink building.

International SEO

International SEO takes national SEO to the next level, as you need to create content that targets several countries at once. Not only does this mean you need to increase your keyword research, backlink building, and the amount of time and effort you put into your campaign, but you also need to introduce new tactics. 

Adopting several subdomains is the first thing you need to do – one for each country you intend to target. An American subdomain, for example, would allow you to serve the U.S. English version of your pages to visitors browsing from America.

Alternatively, you could make different pages for each language you target, using hreflang tags in order to avoid duplicate content issues. 

E-Commerce SEO

E-Commerce SEO is basically just SEO that applies to commercial pages of your website. If you have an online store, you’ll need to make sure that each page is fully optimised for keywords, offers a sound user experience, and has a significant amount of relevant and trustworthy backlinks pointing to it. 

Besides the standard SEO necessities, in order to get your e-commerce page ranking in the SERPs, you need to make sure it features the following:

  • A detailed and unique description – Unique content is, of course, always an essential part of SEO. When it comes to e-commerce, the description of each product needs to be unique to the page and also go into a lot of detail about the product being sold. 
  • Basic functionality – A product page should provide a description of the product, images of it, customer reviews, and the option to buy the product – nothing more, nothing less. 
  • Lots of user reviews – The more five-star ratings a product gets, the higher Google will rank it. 

 

The above applies to all the major e-commerce platforms, including:

  • Shopify
  • WooCommerce
  • BigCommerce
  • Magento

SEO Techniques

There are countless SEO approaches, but not all tricks are recommended for your SEO strategy. Although there are ways of achieving quick-fix clickthrough rates and high rankings overnight, the majority of these methods don’t agree with Google’s best practices. Shady approaches can end up backfiring, and irrevocably damaging your website’s credibility.

Generally, there are white hat SEO techniques and there are black hat approaches. These titles take inspiration from western movies, where the good cowboys would wear white hats and the villains would wear black. White hat refers to any SEO trick that is in line with Google’s best practices and doesn’t break any rules.

Black hat, on the other hand, goes against the accepted standards laid out by Google and takes on a more unethical approach to SEO. Unethical SEO, at a glance, involves paying your way to the top of the rankings by using illegitimate techniques.

Search engines make the effort to curb illegitimate techniques as much as possible to ensure that the top-ranking sites play by the rules. However, black hat techniques are still used.

As you work on your SEO campaign, it’s invaluable for you to know and be able to identify best practices.

White Hat SEO

White hat SEO definitely takes the most effort out of all the common SEO approaches; however, it undoubtedly offers the best payoff. It involves establishing yourself as a high-ranking website organically, with legitimate clickthrough rates and dwell times.

Not only does it offer long-term ranking results, but strictly sticking to white hat approaches is the only way to remain in Google’s good books. If you want to rank on the search engine results pages (SERPs) and stay there, following white hat approaches is essential.

Failure to do so could see your site penalised and, in the worst cases, lose its ranking power altogether. Once you’ve been penalised by Google, it doesn’t matter how strong your content, mobile-friendliness, or backlinks are – getting back to a high-ranking position is next to impossible.

Typically, white hat approaches cost more and require more effort, but offer compound long-term benefits. If you aim to establish yourself as a reputable, reliable company or source of information, you’ll need to abide by white hat techniques.

 

The main components of white hat SEO include:


Create high-quality content

It goes without saying, but as a webmaster, you should be aiming to provide your visitors with top-quality content, always. Although it’s important to construct this content in a comprehensible manner for the search engine crawler, your main goal should be to provide the readers with content that is informative and worth their time.

Everything you put on your website should be original, up-to-date, and as accurate as possible. Publishing content that features keywords that not only match the search intent but also answer the question directly is a prime example of white hat content. Not only can the search engine bots identify its relevancy, but you’ve also solved the problem for the searcher.

Your content should also be of a decent length (Google favours articles with 1,000 words minimum), well punctuated, and split up into paragraphs for readability. A lot of white space around your text is not only good for the reader but also makes the content more digestible for the crawlers. Using bold to highlight important passages can also help both reader and bot.


Producing long content

The longer the content, the bigger the chance your website has of reaching the top page. More words mean more keywords, and generally more useful information that could answer search queries.

Normally, if webmasters notice their optimised content isn’t ranking as well as they’d hoped, one of the first things they’ll do is attempt to lengthen the content. Top-ranking content tends to be at least 1,500 words, but this depends on the length of the competition’s content.

It is always worth checking how many words your competition’s pieces contain and trying to match or exceed the figure.


Do keyword research

Keyword research and correctly using keywords make up a huge chunk of white hat SEO. Keyword research involves determining the most relevant keywords involved in specific search queries. In order for your site to appear in search results, your articles need to feature keywords that match the search intent.

Keywords inform the search engine of the topic of your articles. When crawling your web page, Google will pick up on the keywords used, determine the gist of what your content is about, and index the web page accordingly. If it determines your article to be more valuable than competing articles that use the same keywords, it will bump your page ranking up.

To establish keyword data, we’d recommend using a content analysing tool such as Surfer SEO. Writing tools like this not only pick out keywords from competitor websites but also determine how many times you should use each keyword. Alternatively, you could use the likes of SEMrush to pick out the main keywords that the competition is ranking for.

In order to identify long-tail keywords, you could simply type in the root keyword into Google, which will throw up some handy keyword phrases in the form of search suggestions. There are also tools to help gain hundreds or even thousands of suggestions with minimal effort. SEMrush’s keyword magic tool is just one example. 

For example, if you’re doing an article on ‘running shoes’, Google will be able to provide you with long-tail, less competitive keywords such as ‘running shoes for heavy runners’, ‘running shoes for flat feet’, and so on.

Once you’ve gathered enough keyword data, you should spread it out evenly amongst your content, meta descriptions, anchor text, and titles to fully optimise your page.


Organise content logically

To boost your search engine rankings organically, you need to create a website with a logical layout. Site visitors don’t want to waste time unravelling endless drop-down menus and clicking through several links to find what they’re after.

A simple layout reduces bounce rate and may even encourage users to return to your site. As a general rule, no page should be more than 3 clicks away from the homepage, so there’s no need to create sub-sub-sub topics for each of your primary categories. A flat site architecture is far preferable, both to Google and to readers.

A high bounce rate is a clear indicator that something may be wrong with your site. Google will likely interpret it as a sign that you have low-quality content or that your site layout is illogical.

Additionally, a well-organised website makes the Google bot’s job easier. Data that is presented in a linked and hierarchical way makes sense to the bot as it crawls your webpage, which may lead to higher rankings.


Establishing organic inbound links

Authority is one of the main aspects of a high-ranking website. If Google does not feel your website is authoritative enough in the field of expertise it’s targeting, it will not rank your site as highly as others. The best way to establish authority in Google’s eyes is to grow a collection of quality inbound links.

Inbound links – sometimes referred to as backlinks – are links that lead from another website back to yours. The more authoritative the website these backlinks are arriving from, the better it is for your website’s reputation.

Unfortunately, establishing backlinks takes time, and there’s nothing you can do to speed it up – other than continually delivering high-quality content. You’ve got to give other websites a reason to link back to your site, so dedication to quality is key. The one thing you shouldn’t do is pay for backlinks – this is classed as black hat and could land you in trouble with Google.

Grey Hat SEO

Like the expression ‘a grey area’, grey hat SEO basically relates to any ill-defined tactics. This category lies somewhere between white hat and black hat tactics. The techniques involved have neither been approved nor denied by search engine guidelines, so their legitimacy remains unclear. This ambiguity leaves them open to debate amongst SEO experts.

A danger of using grey hat SEO is that Google or another search engine could come out tomorrow and state that it now considers a grey hat technique to be black hat. Conversely, taking the grey hat risk could put you ahead of the competition if Google redefines a grey hat technique as being white hat.

Grey hat SEO techniques are constantly changing, but common tricks that are in use currently include:

 

Purchasing expired domains

If you come across a site that is currently without an owner, you could buy this domain and gain all the backlinks and authority currently attached to the site. From here, you could build a new website on the foundations of the old one, producing a new website that ranks highly from day one.

 

Guest posting

Guest blogging is the practice of appearing on someone else’s website to write a guest post, normally to appear as an outside expert in the field. Within this guest post, the writer will include one or several links back to their own website. Not only does this increase clickthrough rate, but it can also build link authority for the guest blogger’s site.

Black Hat SEO

Inexperienced webmasters often get lured into using black hat SEO techniques by their quick-fix promise. Unwilling to take the time to work on white hat SEO tactics without a guaranteed return, many fall into the black hat trap. And more often than not, they find themselves in a worse position than when they began.

Although it is a widely disapproved practice, black hat SEO remains a means of getting your page to the high-ranking sections of SERPs. It undeniably takes less time, effort, and money to conduct a successful black hat SEO campaign than it does a white hat one, and it often has a higher guarantee of results – at least for a short while.

Black hat SEO is basically a long list of shortcuts used to bypass and cheat the instructions presented in both Bing’s and Google’s Webmaster Guidelines. If you’re caught taking a black hat approach to your SEO campaign, both Google and Bing will issue you with a penalty.

Penalties for black hat offences differ in severity but can range from losing your rank (and often falling to a much lower position) to being banned from the search engine completely.

The most common black hat techniques that you need to look out for include:

 

Hidden text

One of the easiest and oldest black hat tricks to perform is stuffing your page with hidden or invisible text. This relates to any text that can be read by Google spiders but is not visible to human site visitors. Generally, this hidden text is full of keywords that could not fit into the copy, deceiving the spiders into thinking the page is more relevant than it really is.

It’s worthwhile to note that alternative (alt) text for pictures is an SEO requirement, and the only white hat ‘hidden text’ that you should include in your web pages.
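For illustration only – this is the kind of mark-up to avoid, and the keywords are hypothetical – hidden text is typically implemented with CSS like this:

<div style="display:none">cheap running shoes best running shoes buy running shoes</div>

Search engines render pages and can detect keyword-stuffed text hidden in this way.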

 

Keyword stuffing

If you’re using such tools as Surfer SEO, it should be easy to avoid keyword stuffing. Online writing software normally informs you of how many times you should use a keyword throughout the copy. Without such guidance, it can be difficult to gauge when to stop repeating keywords, which can lead to your site getting penalised.

Keyword stuffing is any instance where you’ve overused a certain keyword. This used to be a more prominent black hat technique, as Google previously didn’t penalise pages that overused a word. Now, however, Google especially has clamped down on keyword stuffing and regards it as spam.

For the site visitor, keyword stuffing makes your content repetitive and even incoherent to read, which is why search engines strive to remove offending pages from search results. Google will penalise sites it deems to be overusing a particular keyword – relative to the length of the article – and will also punish web pages that overuse keywords in particular passages.

For example, using a keyword seven times evenly throughout a 2,000-word article would be acceptable. But using that same keyword seven times within the same paragraph would be unacceptable.

 

Paid links

Paying another webmaster to link to your content or vice versa is not permitted in Google’s guidelines. Paid links are often bought by black hat website owners in the hope of boosting their webpage’s authority. This unethical technique has given rise to link farms – websites created for the sole purpose of backlinking to other websites to boost their rankings.

Affiliating your site with a link farm is a highly risky move to make. As soon as Google uncovers a link farm, not only will it deindex it, but it will penalise all the websites that it supports.

Backlinks are a highly valued currency in SEO, but they have to come about organically in order to benefit your site in the long term.

 

Private blog networks

Private blog networks (PBNs) are similar to link farms, except they are more commonly controlled by the webmaster of the site that the links lead to. Typically, a webmaster will buy lots of defunct or expired domains that still have domain authority. They will then add new content to these sites and include links pointing to the new website they’re trying to boost.

These PBNs won’t link to each other, but they’ll all lead to the webmaster’s new website, creating the semblance of high quality and authority.

 

Copied content

In the past, it was easy to copy and paste content directly from another site and mask it as your own. Over the last decade, however, search engines have really clamped down on this, and now penalise even the smallest instances of duplication.

Still, some black hat rookies attempt to establish themselves as an authoritative source by using someone else’s work. This almost always gets picked up by Google when it crawls the page.

Tools such as Copyscape can quickly detect plagiarised content on any website.

 

Page swapping

Page swapping involves establishing a high-ranking site using white hat techniques, only to later swap the content of the site over with un-optimised content. Once swapped over, the webmaster typically keeps the same URLs, titles, and meta descriptions in order to draw in the same level of traffic.

The replacement content is often irrelevant to the original, which makes it frustrating for page visitors. This is sometimes known as the ‘bait and switch’ technique, as it encourages a high clickthrough rate. Luckily, it’s normally easy for search engines to identify instances of page swapping based on the sudden spike in bounce rate.

 

Comment spam

This involves spamming the comment section of a blog with links back to your website. Again, this was a lot more lucrative in the past, but Google has since clamped down on this practice. Still, you’ll find shady sites piling links to their site in the hope it will increase their clickthrough rate.

Blogging SEO

The blogging search engine optimisation approach allows you to rank for informational search intents in addition to commercial and transactional. If you sell a product on your site, setting up a blog section on your website is a sure-fire way to expand your reach.

This blog should include informative articles (targeting relevant search query keywords) that are directly linked to the products you sell. In these articles, you can include links to your products, with the goal of turning informational searches into commercial ones.

To successfully set up the blog section of your website, in addition to standard SEO practices, follow these steps:

 

Construct pillar topics

Presumably, there is a range of products and services you provide, or there are many facets to the single product you sell. Before jumping in and creating blog posts at random, split different aspects of your products into topics. One way of identifying different topics would be to identify short-tail keywords related to your main product.

For example, if you sell patio doors, the short tail topics would be such things as ‘patio door prices’ or ‘patio door broken’. These short tail keywords serve as your pillar topics, which you could view as the different chapters of your blog.

 

Identify long tail topics

From each of these pillar topics, you’ll most likely find it easy to identify 4-5 relevant long-tail topics. For example, for the pillar topic ‘patio door broken’, you could identify such long-tail keywords as ‘patio door broken lock’, ‘patio door broken seal’, ‘patio door broken glass’, and so on.

We’d recommend creating a pillar topic page that goes over the topic ‘patio door broken’ at a glance. This will contain the short tail keyword, as well as a brief overview of the subtopics you’ll also be covering in this section.

Then, create detailed blog posts about each of the related long-tail keywords. In each of these articles, always link back to your pillar page, either via a tag in the content management system (CMS) or as part of your article’s anchor text. This helps Google connect your articles and understand their subject.

 

Internally link

As much as it’s important to link to your relevant products, you should also internally link between blog posts once you’ve published a few. This optimises your site in many ways, firstly by sending visitors to other areas of your site that they might be interested in.

It also allows you to insert context for a topic without repeating yourself – all you have to do is provide a link to the relevant blog post!

Lastly, internal links spread the link juice from your older articles to your new ones. This way, if you’ve got one really successful, high-ranking blog post, it’ll be easy for you to get your other pages ranking.
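In HTML terms, an internal link is simply an anchor tag with descriptive anchor text pointing at another post on your own domain. A minimal sketch, reusing the hypothetical patio door example from earlier:

<a href="/blog/patio-door-broken-lock/">how to fix a broken patio door lock</a>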

 

Set up a blog posting schedule

As much as you should create structure in how your blogs are laid out, you should also create a structure to the manner in which you post them. Generally, you should be blogging at least once per week.

A blogging schedule should plan in advance the topics you wish to cover, how long it will take you to research and develop them, and a consistent day and time at which to publish your blog posts.

 

Make your content as readable as possible

Although the idea of a blog may encourage you to really show off with your writing and use long and complicated sentences to get your point across, you should definitely avoid doing this. The majority of people do most of their reading on their phones nowadays, and screens remain uncomfortable to read from.

Make sure your writing is concise and as spaced out as possible. Many visitors will only be looking for a quick answer to their query – not a long winding essay. Although length is important, make sure that each sentence and paragraph adds value and informs the reader of something new.

Breaking down bigger sections into easily digestible bullet points can also be a big help to the reader. And as we’ve already stressed when discussing images, using plenty of graphics can really bring the text to life and make reading easier.

 

Split the article into sections

As an extension of article readability, you should also aim to split up your article into sections and subsections, laid out in a hierarchical order. This way, the reader has a better chance of finding the specific information they’re looking for.

Advanced SEO Techniques

Advanced search engine optimisation techniques, as the name implies, are methods that can elevate your web presence to the next level. Basic SEO groundwork needs to be laid first in order to carry out these methods, which also require a degree of SEO expertise.

You should learn the following advanced SEO techniques if your initial optimisation campaign didn’t draw in as many site visits as you hoped:

 

Target journalistic keywords

If you’re not acquiring many backlinks through your standard SEO campaign, then you should expand your reach by including journalist keywords in your copy. Journalists need references for the facts and figures used in their articles. To uphold their journalistic integrity, this needs to come from a reputable source – which could be your content!

Prime journalistic keywords would be anything that’s currently newsworthy – a statistic or a percentage that reflects the current year and is presently relevant to the news. Even better would be a full table or scale displaying the relevant information that journalists can pull from.

Preferably, the information you target these journalistic keywords with should be the result of your own research. This is why the strategy is considered an advanced SEO technique: it takes a lot more time to gather the necessary information.

If you optimise this information correctly and your statistics reach the top of the SERPs, you can expect your backlink count to grow rapidly.

 

Improve old content

Search engine optimisation should never be seen as a one-time effort. In order to rank highly – and stay ranking – you need to periodically update your existing content, particularly pieces that performed well upon initial publication. Using the right tactics, you can return them to a high-ranking status.

Instead of waiting for fresh content to get crawled and indexed, you could build on pages that are already ranking for quicker results.

To determine which pages to re-optimise, check Google Analytics for conversion and organic traffic data. You could also assess click data and impressions in Google Search Console and review current keyword rankings in Ahrefs. Generally, any content that is over 6 months old and not ranking as highly as it could is worth improving.

When updating content, first look for areas that are clearly out of date. The internet moves fast, and facts and figures can quickly become obsolete. Any figure that refers to the previous year should be brought up to date, and any fact that has since been proven false should be corrected.

If you discover that the keywords used no longer appear as prominently in search queries, you should change these over for more relevant words.

During this content audit, you should also update internal links and title tags. If you feel it is necessary, you could also swap out pictures, rewrite full paragraphs of content, and add new ones.

 

Remove thin content

On the topic of repurposing old content, the one type of old content that you shouldn’t improve is thin content. As aforementioned, Google prefers content that has plenty of substance. Preferably, the text on your web pages should exceed 1,000 words, but having a little under this word count is acceptable.

Anything sitting at around 300 words, however, should definitely be deleted. It’s difficult to cover all bases of a topic in so small a space, so thin content isn’t all that helpful to your audience. From Google’s point of view, thin content simply contains too little information, and search engines don’t consider it as valuable a source as long-form content.

In order to improve the ranking of your site overall, we’d recommend removing any web page that’s on the short side.

 

Appear on podcasts

Like guest blogging, appearing on podcasts is a good way to promote your own website. In the descriptions of podcasts, a link to the guest’s website is normally provided. This acts as a backlink, just as it does with guest blogs. We’ve classed this as an advanced SEO tactic as it can be incredibly difficult to get on the mic and talk over an extended period of time.

You’ve also got to come across well in order for listeners to want to check out your website. If you come across as nervous or, in worst cases, unsure of the product you are promoting, listeners may be put off from paying your site a visit.

 

Include your own images

Although including stock images can help illustrate the points you’re making in the text, it can actually be damaging to your ranking. Duplicate image experiments suggest that adding stock images to your page can harm it in the same way that stealing someone else’s content can.

For this reason, we’d recommend hiring a photographer or your own graphic designer. This way, each of your web pages could feature images and graphics that are 100% original. This would also be useful for creating original graphs, particularly if you’re aiming to gain backlinks from journalists.

Having your own images and graphics would also be massively helpful in creating your own brand. Branding may not be a search engine ranking factor, but it definitely makes your site more attractive and recognisable to visitors.

 

Include a comments section

A comment section is an easy way to add more value to your page. It basically allows site visitors to expand on points made in the article or ask about aspects you’ve failed to cover. Not only does this give you ideas for improvements, but it also increases the amount of content the page features.

With a comment section, your site becomes a collaborative resource between you and your visitors. It gives visitors something else to read other than the main body of the text, which increases dwell time.

Although it’s fairly easy to add a comments section to your blog, we consider it an advanced search engine optimisation technique, as it is an ongoing commitment. In order to make sure that site visitors are using it fairly, you should have someone checking on it periodically.

In particular, this moderator should check for spammy comments and remove them. The moderator could also reply to queries made in the comment section. Not only does this extend the page’s content even further, but it also gives the original commenter a reason to return to the site.

A comment section could develop into a small community of users who regularly use your website resources.

 

Rank on Google Discover

Google Discover works like social media, displaying websites and news articles to users in a newsfeed fashion. It appears on the homepage of Google’s Android and iOS apps.

The sites and articles are chosen based on the interests of the user. Like a site thumbnail, Google Discover displays short meta descriptions of each site chosen, as well as the main image of the page.

Although Google says that the articles surfaced in Google Discover are chosen automatically around user interests rather than traditional ranking factors, there are ways to make your site better suited to the medium.

The main thing is to feature at least one high-quality image in each article. The better the image, the more likely Google Discover users are to click on your card. Other ways that can help you to appear on Google Discover include getting Twitter engagements and providing news updates in your content, in addition to standard ranking factors.

SEO Tools & Software

Search engine optimisation is clearly a massive undertaking, with lots of specific facets that can consume a great deal of your time, energy, and resources. From the keyword research process to the content marketing side of things, it would be incredibly difficult to carry out the relevant tasks without using SEO tools and software.

There are both on-page and off-page SEO tools that we’d recommend to both beginners and experienced web admins. Besides making each individual task more understandable and manageable, they can also speed up the SEO process tenfold.

In the following lists, we’ve highlighted the best paid-for and free SEO tools of 2022 that you should consider trialling for your own campaign purposes. For each tool, we’ve included a brief description while also pinpointing why each tool could add immeasurable value to your SEO undertakings.

On-Page SEO Tools

As we’ve already highlighted in this article, getting your on-page SEO up to scratch is crucial. Without a fully optimised page, Google crawlers may forgo indexing your page completely, your rankings may suffer, and all the content you prepared for the page will be rendered useless. 

The pages that currently appear at the top of the SERPs didn’t attain these top-ranking spots blindly; they fully optimised their page, likely with the help of some of the most cutting-edge on-page SEO tools available today. 

The following tools can help you from the get-go right up to when your content is about to go live, from aiding in keyword research to checking to see if you’ve accidentally duplicated an already-published article.

 

SEOCrawl 

SEOCrawl performs a range of routine page audit tasks and delivers the information in a speedy and detailed manner. The tools provided include SEO Monitor, SEO Crawler, SEO Dashboard, SEO Reporting and SEO Rank Tracking. Specifically, SEOCrawl allows you to check your page speed, identify duplicate content, find broken links, and even do keyword research.

Another handy tool offered by SEOCrawl is SEO Cannibalization, which studies your website’s URLs and highlights instances of cannibalisation. It presents this data in the form of a heat map displaying the click distribution of your site. This way, you can determine whether or not pages are eating into each other’s chances of ranking.

Once SEOCrawl has analysed your page, it will present and save data on your SEO Dashboard. From here, you can see how far your page has come in terms of optimisation and identify areas that need improvement, such as opportunities to target stronger keywords.

Areas of improvement are best highlighted by the Site Auditor, which will essentially provide you with a list of things to do in order to improve your website.

Instead of hiring your own in-house SEO consultant, SEOCrawl can perform all the routine tasks needed to keep your website ranking and stable. Without it, you’d have to perform each of these tasks separately, which eats up your time and energy!

Due to its multi-faceted capabilities, SEOCrawl has become the favourite tool of many major websites, including Reprise, Chess.com, and SeedHub Media.

 

Copyscape 

Copyscape is a pay-as-you-go content duplication checker, which pulls up instances of plagiarism. To use Copyscape, all you have to do is either paste the page’s URL into the website’s search bar, or copy and paste the entire text from the page you’re checking.

In just a few seconds, Copyscape will display any instances of plagiarism it finds at the bottom. If duplication is identified, you can click on each individual link and visit the site via Copyscape, where the instances of duplication will be highlighted. From here, you can identify whether the plagiarism is purely accidental or a case of lifting whole paragraphs of someone else’s work.

At the top, it will also give you a percentage of how much copying has been done. Generally, anything over 3% plagiarism is a cause for concern, but you can judge for yourself based on the information provided by Copyscape. 

The price you pay for the tool depends on how big the body of text is. Currently, the site – a U.S.-based company – charges 3 cents per 200 words, plus an extra 1 cent for every additional 100 words.

Copyscape Premium is also available for those who want to take more of a deep dive into their site’s originality. Premium allows you to check 10,000 pages at once with the Batch Search function. 

The site offers two additional subscription-based services, Copysentry Standard and Copysentry Professional. These tools help to stop other sites from stealing your content, by monitoring the web constantly and comparing it to the content found on your site. Copysentry can even help you find instances of plagiarism that have been slightly modified. It also keeps track of all the instances of plagiarism found, as well as your responses to them.

 

Screaming Frog SEO Spider 

Screaming Frog SEO Spider essentially performs the job that search engine spiders do, helping you identify any areas that need to be improved upon in your URLs. Specifically, Screaming Frog specialises in identifying broken links, determining whether correct redirects have been put in place, helping to generate XML sitemaps, analysing files, and identifying instances of duplication. 

Screaming Frog can highlight which search engine bots have crawled over which pages of your website, and detail how they react to the different aspects of your page. If certain pages of your site are not being crawled but you want them to, Screaming Frog can find out what’s wrong with them and detail what you can do to fix them. 

Examples of errors that are commonly highlighted by Screaming Frog include missing meta descriptions, misplaced redirects, and duplicated title tags. 

This tool can also find bad bots that are affecting your site’s user experience by, for example, slowing your page down. 

The basic version of Screaming Frog SEO Spider can be downloaded for free and allows you to analyse up to 500 individual URLs. The free version also helps you make XML sitemaps, audit hreflang attributes, and discover exact duplicate pages. 

A paid version of Screaming Frog is also offered, which provides analysis of unlimited URLs. This subscription-based version comes with a bunch of other features, too, including crawl configuration, a near-duplicate content checker, Google Analytics integration, JavaScript rendering, custom source code search, and AMP crawling and validation.

As of 2022, the subscription-based version of Screaming Frog costs £149 per year. 

 

Google Search Console 

Google Search Console is totally free to anyone who owns a website, so it is definitely worth taking advantage of. It basically gives you an overview of how web users interact with your web page on Google.

On top of showing you where your site appears in the SERPs, Google Search Console also provides you with click rates, conversion rates, and data on how often you appear in search results. 

From this data, you can establish site errors and areas that require attention. Google Search Console can also be used to help you identify security issues such as hacking and malware, server errors, and site loading time issues.

Specifically, Search Console can identify how well your site performs on desktops and on mobile, allowing you to better understand where your searches are coming from. From this, you can gauge what areas you need to improve upon to make your site more mobile-friendly.

With this tool, you could also identify keywords that web searches are using to find your site. This could help you make appropriate adjustments to your keyword usage. 

Google Search Console is a good way to understand how both search engines and users interpret your site, helping you better organise it for increased optimisation. 

Using the features offered by Google Search Console is great for new website owners as, not only is it free, but it also helps you control which pages get indexed and which don’t. The Coverage section shows you exactly which URLs have been indexed, and you can submit your sitemap or request indexing of specific URLs to encourage Google to crawl them sooner.

 

SEMrush 

SEMrush is the perfect on-page SEO tool for marketing purposes. It can analyse your competitors’ content marketing strategies and provide you with insightful tips on how to overtake them in the rankings. SEMrush also identifies your competitors’ keywords, finds you backlinking opportunities, and performs SEO audits of your website.

SEMrush makes it easy for you to understand the differences between you and your competitors by providing Domain Vs. Domain analysis. Through this, you can pinpoint why the competition is ranking higher than you and determine what to do about it. 

Your keyword research needs to be an ongoing undertaking if you want to get to the top of the SERPs and stay there. SEMrush helps you do this with its Keyword Magic Tool. This allows you to regularly check for changes in the ranking power of specific phrases and displays such data as cost-per-click and keyword search volume. 

SEMrush is great for expanding your keyword pool. By inserting a few root words, SEMrush will be able to throw up some useful keyword variations. This allows you to create content that matches how web users search. 

You can also gain ideas for future topics with the tool’s Topic Research feature. With this, you can find out what the highest trending subtopics currently are to your main keyword, so you can create a content plan. Topic Research also shows you how the competition responded to the topic, including what headings and keywords they used. 

There are several pricing tiers available for SEMrush, starting with Pro, which costs £99 per month. Guru costs £190 per month, while Business costs £372. Each tier increases the number of projects and keywords you can track. 

 

Surfer SEO

Surfer SEO is a handy all-in-one content writing tool which allows you to gather keywords, audit competitor sites, and structure content in an SEO-friendly way. By simply typing your main keyword into Surfer, this tool can pull up the main competitor sites, and from there pick out the main secondary keywords that you need to target. 

Once you’ve selected the competitor sites that you want to take on, Surfer SEO will determine how long you should make the article, how many times you should insert keywords, how many paragraphs to include, as well as how many headings to split up the content with. 

There is a range of Surfer pricing tiers, ranging from £40 per month for a Basic account right up to £165 per month for a Business account. The latter allows you to grow 10 separate websites and add 10 individual team members to your account.

Off-Page SEO Tools

Off-page SEO is much harder to implement than on-page SEO, and is also something you have much less control over. However, a holistic approach is needed if you want to maximise the results of your SEO campaign and see the highest rankings possible. 

Like on-page SEO, there is luckily a wealth of off-page SEO tools that can be used to help perfect your page optimisation campaign. But choosing the right tools can determine whether you outrank your competitors or not. 

Given that off-page SEO is harder to do, there’s a bigger window of opportunity here than there is in excelling at on-page SEO. With this in mind, here are our top picks of off-page SEO tools:

 

Ahrefs’ Keyword Explorer

Ahrefs’ Keyword Explorer forms an essential part of most professional content writers’ keyword research. In helping you narrow down keywords to use in your copy, it shows you how difficult it would be to rank for specific phrases on a scale of 1 to 100. If a specific keyword scores close to 100, you should consider the phrase extremely hard to rank for. If it scores 1, on the other hand, it’s an easy win.

A lot of other keyword tools require you to first insert a seed keyword to help with analysis, while Ahrefs’ Keyword Explorer does not. By simply providing this tool with a link to your competitor’s page, Ahrefs’ Keyword Explorer can identify all relevant keywords, from which point you can compile a list of keywords to include in your copy. 

Not only does Ahrefs’ Keyword Explorer show you how many times a keyword has been searched for, but it also reveals how many clicks these keywords pull in. Click rate is really the most important part of SEO research, which makes Ahrefs invaluable. 

Ahrefs is a favoured keyword research tool as it is one of the only tools to include results from search engines other than Google. With Ahrefs’ Keyword Explorer, you can run keyword analysis across the following search engines:

  • Google
  • Amazon
  • Bing
  • Yahoo
  • YouTube 
  • Yandex (Russia) 
  • Daum & Naver (South Korea)
  • Baidu (China)
  • Seznam (Czech Republic)

 

Ahrefs’ Site Explorer also allows you to assess backlink growth over time, identify and fix broken links, see SERP history, uncover content gaps between you and the competition, and find out what your best-performing pages are. 

Given the amount that Ahrefs’ Keyword Explorer can do, the SEO tool is justifiably expensive. For the Lite package, you can expect to pay £82 per month. The Enterprise package, on the other hand, costs £828 per month. 

 

Moz Link Explorer 

Moz Link Explorer allows you to keep tabs on your backlinks. Formerly known as Open Site Explorer, the current version of Moz Link Explorer provides you with all the information you need regarding each backlink you have attained from other sites.

Specifically, Moz Link explorer details the Domain Authority (DA), Page Authority (PA), URL, anchor text, link target, and spam score of each of your backlinks. Moz Link Explorer is constantly being updated, meaning you receive accurate information regarding the dates on which you gained the backlinks. 

It also details information regarding links that you have since lost, such as the specific date on which you lost them. 

Through the use of graphs and charts, Moz Link Explorer also demonstrates the growth of your own site’s DA and PA over time. This gives you an idea of how beneficial the backlinks attained have been for your website. 

Moz Link Explorer comes as part of the bigger Moz Pro package, which also includes some on-page SEO tools, such as Keyword Explorer. The tiers of Moz Pro include: 

  • Lite – £14 per month 
  • Preferred – £20 per month 
  • Elite – £31 per month 

 

SEOquake 

SEOquake is a browser plugin that provides you with data on organic search results. It is particularly good for gathering metrics on page rank, cache date, domain age, and social media shares. Whichever web page you’re on, you can use the SEOquake plugin to perform a page audit and gather real-time results. 

SEOquake is also good for comparing several pages at once and gathering data from each to create an informative report. This report can be printed out, or exported elsewhere. 

Data sources are varied and include Google, Alexa, Bing, and SEMrush. 

This tool can appear in the form of an SEObar (SEOquake’s name for a toolbar), which remains visible and easily controllable from any webpage that you land on. Alternatively, SEOquake is also available as an SEO Dashboard, which puts this handy page analyser just a click away. 

SEOquake can also perform a basic keyword research analysis of the page if necessary. 

One of the best aspects of SEOquake is that it is compatible with Google Chrome, Mozilla Firefox, and Opera. Better still, the plugin is totally free! 
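For a sense of what such a page audit surfaces, the sketch below is a crude approximation of an SEOquake-style snapshot: it fetches a URL and reports the title, meta description, and heading and link counts. It is not SEOquake’s own output, and it relies on the third-party requests and beautifulsoup4 packages:

import requests
from bs4 import BeautifulSoup

def page_snapshot(url: str) -> dict:
    # Pull a handful of the on-page basics an audit toolbar would show.
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    meta = soup.find("meta", attrs={"name": "description"})
    return {
        "title": soup.title.string.strip() if soup.title and soup.title.string else None,
        "meta_description": meta["content"] if meta and meta.has_attr("content") else None,
        "h1_count": len(soup.find_all("h1")),
        "link_count": len(soup.find_all("a", href=True)),
    }

print(page_snapshot("https://example.com"))  # placeholder URL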

 

Web Archive 

The Web Archive stores old snapshots of web pages, which can be put to good use in SEO. Say you come across a broken link on a webpage you’d like to receive a backlink from. You could use the Web Archive to find out what content that dead URL once hosted, and use this as inspiration for new content. 

By creating an updated version of the article that the broken link once led to, you can more easily convince the webmaster to point the link at your new content instead. 

You could also identify other web pages that once linked to the original article and reach out to their webmasters to advertise your replacement content. 

Web Archive is free to use. 
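If you want to automate these lookups, the Internet Archive exposes a public availability API that returns the closest stored snapshot for a URL. A minimal sketch (the dead URL is a placeholder, and the requests package is required):

import requests

def closest_snapshot(dead_url: str):
    # Query the Wayback Machine's availability API for the nearest snapshot.
    resp = requests.get(
        "https://archive.org/wayback/available",
        params={"url": dead_url},
        timeout=10,
    )
    closest = resp.json().get("archived_snapshots", {}).get("closest")
    return closest["url"] if closest else None

print(closest_snapshot("example.com/some-removed-guide"))  # placeholder URL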

 

Linkclump 

Another Chrome extension, Linkclump is very simple in terms of function but can provide a lot of help when gathering several links from a webpage. As the name suggests, Linkclump clumps together all of the links found on a web page so you don’t have to copy and paste each link individually. 

Once gathered, you can choose between copying these links to your clipboard, opening them in new tabs, or saving them as bookmarks. 

This function could be helpful in your SEO campaign if you, for example, identify a lot of broken links on a website that you’d like to see replaced with links to your own content. Linkclump organises all of these links together, making it easier for you to contact the domain owner and highlight the broken links in question to them. 

This handy tool puts convenience first and is completely free to add to your browser! 

 

Check My Links 

If you’ve found a webpage you’d like to gain a backlink from, Check My Links can scan the page to identify all broken links. You could then use the likes of Linkclump to combine all the broken links identified into a neat list. 

You could then contact the webmaster with your list of broken links. In return for this favour, they may feel inclined to replace the broken links with backlinks to your site. 
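The same idea is easy to sketch in code: fetch a page, test each outbound link, and flag those that error. A minimal example in the spirit of what Check My Links automates (the page URL is a placeholder; requires requests and beautifulsoup4):

import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin

def broken_links(page_url: str) -> list:
    soup = BeautifulSoup(requests.get(page_url, timeout=10).text, "html.parser")
    dead = []
    for a in soup.find_all("a", href=True):
        link = urljoin(page_url, a["href"])
        if not link.startswith("http"):
            continue  # skip mailto:, javascript:, page anchors
        try:
            # Some servers reject HEAD; a GET fallback would be more robust.
            status = requests.head(link, allow_redirects=True, timeout=10).status_code
        except requests.RequestException:
            status = None
        if status is None or status >= 400:
            dead.append(link)
    return dead

print(broken_links("https://example.com/resources"))  # placeholder URL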

 

Talkwalker Alerts 

Talkwalker Alerts is a nice alternative to Google Alerts. It works to inform you whenever it finds your chosen search terms or brand name on the web. In terms of benefits for SEO, Talkwalker Alerts can help you keep tabs on your brand and your competition. It can also help you identify new backlinking opportunities and make sure that backlinks to your site are beneficial to your ranking.
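If your alert is configured to deliver results as an RSS feed (one of Talkwalker Alerts’ delivery options), a short script can surface new mentions programmatically. A minimal sketch; the feed URL is a placeholder, not a real Talkwalker endpoint, and it relies on the third-party feedparser package:

import feedparser

# Placeholder feed URL: swap in the RSS address from your own alert.
ALERT_FEED = "https://example.com/talkwalker-alert.rss"

feed = feedparser.parse(ALERT_FEED)
for entry in feed.entries[:10]:
    # Print the newest mentions of your brand or search terms.
    print(entry.get("published", "n/a"), "|", entry.title)
    print("   ", entry.link)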

Measuring and Reporting

Once you’ve implemented both your on-page and off-page SEO strategy, on top of periodically updating your content, you need to measure and report on the success of your campaign. Measuring and reporting your SEO is, of course, essential in determining the success of your SEO efforts, and also crucial for identifying areas that need to be improved upon. 

SEO should not be seen as a one-time thing; it requires ongoing maintenance check-ups in order to guarantee the enduring success of your campaign. Otherwise, your webpage can become obsolete and lose out to the competition. 

The main metrics you have to closely examine include: 

 

Revenue and organic conversions 

The amount of revenue and organic conversions generated by an SEO campaign is likely the metric that bosses are most interested in. Optimising your content and getting to the top of the SERPs is the means of an SEO campaign, not the end; the end goal is always to secure more sales or to get more users signing up to your website. 

Specifically, revenue and organic conversions could relate to: 

  • Sales – Is your website shifting units? It’s great that your website is getting traffic from your SEO, but how many of these visitors are actually purchasing your product? 
  • Account creation and e-mail sign-ups – If the visitors enjoy the content of your site and would like to hear more from you in the future, they’ll join your site by creating an account or by signing up for your newsletter. 
  • Social media shares – It can be considered a conversion when a web user shares your content on social media, as this suggests they liked it enough to share it with their peers. Presumably, they also view your content as reliable and will likely revisit your site in the future. 
  • App downloads – If you’ve created an app and it’s getting a decent number of downloads, then you can consider this partially due to your SEO campaign. 

 

Visibility 

Visibility refers to how easily people can find your website on the search results pages. It’s one of the more direct metrics of SEO, as it can be narrowed down to a single statistic. 

Visibility is a valuable statistic as it indicates your performance in terms of SERP climbing. It also allows you to estimate how much progress you still need to make in terms of SEO. It’s an indicator of how well your chosen keywords are ranking and doubles up as a means of analysing the progress of your competitors. 

There are several ways to measure visibility, including the Position Tracking Tool on Semrush. This allows you to compare the visibility of your own website with your main competitors. 
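There’s no single agreed formula, but one common formulation weights each tracked keyword by the estimated click-through rate of its current position, so that 100% means every keyword ranks first. A rough sketch, with an illustrative CTR curve rather than Semrush’s actual model:

# Illustrative CTR-by-position figures; real curves vary by study and query type.
CTR_BY_POSITION = {1: 0.28, 2: 0.15, 3: 0.11, 4: 0.08, 5: 0.07,
                   6: 0.05, 7: 0.04, 8: 0.03, 9: 0.03, 10: 0.02}

def visibility(rankings: dict) -> float:
    # 100 means every tracked keyword sits at position 1;
    # keywords ranking outside the top 10 contribute nothing.
    total = sum(CTR_BY_POSITION.get(pos, 0.0) for pos in rankings.values())
    return 100 * total / (len(rankings) * CTR_BY_POSITION[1])

# Placeholder keyword -> SERP position data.
print(visibility({"seo agency uk": 3, "seo audit": 1, "ppc agency": 14}))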

 

Keyword rankings 

Establishing relevant keywords at the start of your SEO campaign is vital, but tracking these keywords – and how well your web pages are ranking for them – throughout is of equal importance. There are several SEO tools you can use to establish and track your keywords, including Ahrefs, Google Analytics, Moz, Semrush, and many others. 

On these tools, you’ll be able to establish how well you’re ranking for specific keywords vs how well the competition is. From here, you can judge whether you’ve used the chosen keywords accordingly and whether you should change your keyword strategy. This process will also help you plan keyword strategies for the future. 

Tracking keywords is essentially the process of making sure your chosen keywords are bringing you the rankings and traffic you forecast they would. 

 

Backlinks

We’ve talked a lot about backlinks throughout this article, but they are definitely worth mentioning again here. Backlinks are one of the biggest ranking factors on Google and other search engines. Having quality links pointing back to your site is one of the main ways to prove to Google that your site has authority in its chosen subject, as each backlink acts as a vote of confidence. 

But the emphasis has to be on quality, not quantity, which is why backlinks need to be properly monitored over time. Monitoring means making sure your backlinks come from reputable sites rather than untrustworthy, low-quality sources; the latter can drag your rankings down. 

To monitor the quality of your backlinks, we’d recommend using Ahrefs, Semrush Backlink Checker, and Moz Link Explorer. 
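As a sketch of what routine monitoring looks like in practice, the snippet below flags links from weak or spammy domains in an exported backlink list. The field names mimic a Moz-style export (DA and spam score), the rows are invented, and the thresholds are a judgement call rather than an industry standard:

# Invented example rows standing in for a real backlink export.
backlinks = [
    {"source": "trusted-trade-mag.co.uk", "da": 62, "spam_score": 2},
    {"source": "free-links-4u.biz",       "da": 7,  "spam_score": 61},
]

MIN_DA, MAX_SPAM = 20, 30  # illustrative thresholds, tune to your own risk appetite

for link in backlinks:
    if link["da"] < MIN_DA or link["spam_score"] > MAX_SPAM:
        print("Review or consider disavowing:", link["source"])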

 

Organic search traffic 

Organic search traffic relates to the traffic you receive through unpaid search results. It can be split into two categories: non-brand (generic) keyword searches and brand keyword searches. The former implies that the searcher wasn’t aware of your brand beforehand and discovered you through your target keywords, which makes non-brand search the most important indicator that your SEO campaign is working. 

To measure your organic search traffic, we’d recommend using Semrush Traffic Analytics or Ahrefs’ Website Traffic Checker.
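If you export your query data from Google Search Console, splitting brand from non-brand clicks can be as simple as a regular expression. A minimal sketch, with placeholder rows and our own brand name standing in for yours:

import re

BRAND = re.compile(r"gorilla\s*marketing", re.IGNORECASE)  # your brand terms here

# Placeholder rows standing in for a Search Console query export.
queries = [
    {"query": "gorilla marketing reviews", "clicks": 120},
    {"query": "seo agency manchester",     "clicks": 340},
]

brand = sum(q["clicks"] for q in queries if BRAND.search(q["query"]))
non_brand = sum(q["clicks"] for q in queries if not BRAND.search(q["query"]))
print(f"brand: {brand} clicks, non-brand: {non_brand} clicks")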

 

Competitor Analysis

The problem with search engine optimisation is that there is a rigid set of rules laid out by Google that all websites must follow. Given that Google and other search engines will penalise anyone who steps outside of their guidelines, there is little room to experiment or to try SEO tactics that your competitors are not using.

Your competitors will likely be carrying out the same on-page SEO strategies as you: performing keyword research, organising their websites into silos, and creating sitemaps. For this reason, it’s incredibly difficult not only to get ahead of the competition but also to stay ahead. Even if you make it to the top of the SERPs, a competitor could overtake you at any time!

For this reason, competitor analysis needs to be performed periodically. This practice involves you checking out the competition’s on-page SEO to see if they’re ranking for any new keywords. You should also look into their backlink profile to see if they’ve gained any new ones. This way, you gain a greater sense of your strengths and weaknesses, while also identifying areas of your site that need to be improved.

Whenever you begin a competitor analysis, whether it is your first time doing it or not, you need to review who you consider the competition to be. Although the list may have stayed the same, there’s every chance a new website has emerged and managed to rank in a short space of time.

To find out who your main competitors are, you could use a paid-for service such as Ahrefs or Semrush. Alternatively, you could use free tools such as Ubersuggest, although results may be limited.

These tools will display a list of sites that are ranking for similar keywords, from which you can pick off sites that are clearly in competition with you, i.e. offering the same service. Then you’ll be able to view which keywords they’re ranking for, how they’re using these keywords, and what kind of backlinks they’ve attained.

Once you’ve researched the top competitors, you could create new content – or reorganise older content – to target these new keywords. Also, you should judge the authority of the backlinks they’ve gathered and look to match this.
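At its simplest, this boils down to a set difference: the keywords a competitor ranks for that you don’t. A minimal sketch, with placeholder keyword lists standing in for exports from a tool such as Ahrefs or Semrush:

# Placeholder keyword sets standing in for real ranking exports.
our_keywords = {"seo agency", "seo audit", "link building services"}
their_keywords = {"seo agency", "technical seo audit", "ecommerce seo", "seo audit"}

# Keywords the competitor ranks for that we don't target yet.
gap = their_keywords - our_keywords
print("Keywords to consider targeting:", sorted(gap))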

Conclusion

SEO stands alone as the most important vehicle for gaining relevant traffic, maintaining an audience, and generating revenue. As a business owner, having a well-run SEO campaign is crucial to success. 

In the past, conducting keyword research, pasting a link to your website anywhere you could, and paying for backlinks was the extent of SEO – but this is no longer the case. In order to achieve and maintain rankings across Google and other search engines, you need to follow a strict ethical path of SEO.

The best approach to SEO would be to follow the white hat approach: write high-quality, original content, conduct ongoing keyword research, build links organically, and make your site as clean and readable as possible to all who visit it. 

Throughout your SEO campaign, use a range of SEO tools, such as Semrush, Google Search Console, and Moz Link Explorer to organise and measure your efforts.

DON'T GO JUST YET!

Our free search marketing audits are the most comprehensive and insightful in the industry – and there is no obligation to work with us afterwards! Fill in the form below to get yours!

FREE SEARCH MARKETING AUDIT