SEO Glossary

Table of Contents

0-9

3 Pack

The ‘3-pack’ is the set of three local Google search results most relevant to a search that contains a location or implies proximity. The three businesses’ Google My Business profiles are listed below a map showing their locations.

3-pack SEO is focused on local traffic and is helped by having a complete, up-to-date Google My Business profile. Searchers will often trigger this type of result by using keyword phrases such as ‘near me’.

An SEO agency will conduct keyword research to find the terms that best suit a business. For example, an Italian restaurant in Highgate, Camden, London, will see high demand for “Italian restaurant in London” or even “Italian restaurant in Camden” search terms. However, the business will be more likely to feature in the 3-pack by targeting the specific local area with the phrase “Italian restaurant in Highgate”.


301 redirect 

301 redirects are a permanent way to redirect traffic from one URL to another. The name is derived from the action’s HTTP status code.

The most common use of a 301 redirect is when companies set up new websites under a new domain name but don’t want to lose the high Google ranking or inbound links the old URL had. The 301 redirect transfers this value to the new URLs and ensures visitors can still find the site.


302 Redirect 

302 redirects serve a similar purpose to 301 redirects but are not a permanent option.

302 redirects can be used to transfer people over to temporary pages when updating or building a new site.

Temporary pages can be used to sell products, provide information, or offer contact and location details while your site is down.

A 302 redirect should only be used as a temporary measure; if a page is moving permanently or for an extended period, a 301 redirect should be used instead.


404 error 

A 404 error code will appear if you click on a link to a URL that doesn’t exist. This occurs if a website doesn’t exist anymore or the page you are looking for has moved to a different URL.

A 404 error is relatively common because sites with outbound links might not be notified if a URL is changed.

Using a 301 redirect will minimise the risk of losing visitors that come from inbound links without having to notify the websites of the URL change.

A

AEO

Answer Engine Optimisation accommodates voice-based search queries typical of smart home devices like Google Home and the Amazon Echo (Alexa).

Voice-based search results differ from typical search engine results as users are only able to hear one answer at a time. Traditional search engines offer more choice, as users can view a list on screen and select their preferred option.

While AEO isn’t a direct threat to traditional SEO, it is something that should be considered. As smart devices develop, so will voice technology’s need for accurate search results.

AI 

AI stands for Artificial Intelligence and relates to the development of technology that can evaluate the information it receives and make rational decisions.

Where traditional technology simply facilitates the decisions and choices we make, AI can make those decisions itself. In search, this provides a better user experience and more accurate results.

AI will also learn and develop over time, using experience to improve search capabilities and accuracy.

Agile Content Development 

ACD relates to the continual development of content to improve quality, provide value, and react to user behaviour.

Monitoring results after adding content is essential, but updating and tweaking it can often be overlooked.

Outdated content will begin to drop down SERP rankings. Agile Content Development helps to ensure content remains at a high standard and stays relevant as things change, which in turn supports SERP rankings.

ACD is achieved by carrying out four phases:

  • Gather data to understand search behaviours and the target audience.

  • Develop a strategy that maps out the content needed to improve the current site.

  • Optimise the content to provide up-to-date information, target relevant search terms, include relevant and current links, and improve title tags and metadata.

  • Measure and report the results after implementing the strategy.

Ahrefs 

This is a platform used for SEO analysis by businesses and marketing agencies. It is made up of multiple tools to help people monitor and improve factors that impact SEO.

Ahrefs provides the following comprehensive analysis tools:

  • Audit – Auditing a website is an essential part of discovering strengths and weaknesses that can be targeted with a comprehensive SEO strategy.

  • Keyword research – Ahrefs provides a platform to conduct keyword research for websites to target the terms that best apply to their services.

  • Content analysis – This allows users to search for high-quality, relevant content for inspiration and understand what is working for a keyword or niche.

  • Backlink analysis – Ahrefs provides competitor backlink analysis, making it easier for sites to develop a competitive strategy.

  • Tracking – The ability to track keyword rankings makes it easier to see what is working and what needs to be tweaked.

  • Site alerts – Receive alerts every time your search engine rankings are updated or you receive new backlinks.

Alexa Rank

Alexa Rank is a system that ranks websites by popularity. The analytics tool is owned by Amazon, and websites are judged by page views and unique visitors over a three-month period.

Only websites that install the programme are included in the calculation, so it does not give an exact representation of overall traffic.

Algorithm 

An algorithm is a complex procedure used to solve problems. The problem is solved using specified actions and a step-by-step process to provide an accurate result.

Used throughout IT, the algorithm most associated with SEO work will be the Google algorithm.

Google’s algorithm considers many factors to determine where a site will rank for specific search terms.

The Google algorithm is kept secret to avoid spammy sites manipulating the system. Google’s guidelines let SEO agencies understand which behaviours are not permitted, making it easier to strengthen sites within the rules.

Factors the Google algorithm takes into account when providing search results include high-quality content, a strong backlink profile, site speed, relevance, and authority.

Alt Tag

An alt tag provides a way for websites to ensure an image is visible to search engines. Images are traditionally difficult for web crawlers to identify. Alt tags provide the descriptions needed for them to be recognised and provide value.

Also known as alt attributes and alt text, alt tag descriptions can help with keyword optimisation and are written in HTML. They improve image search visibility and ensure the site is relevant for specific search terms.
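
As a minimal sketch with a hypothetical image file and description, an alt tag is written as the alt attribute of the HTML img element:

  <!-- The alt attribute describes the image for search engines and screen readers -->
  <img src="/images/red-leather-handbag.jpg"
       alt="Red leather handbag with a gold buckle"
       width="600" height="400">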

Anchor Text

The text used on websites for clickable links is known as anchor text. Anchor text provides a valuable way to improve the users’ journey and makes it easier for search engines to navigate and understand the site.

There are different ways website owners can use anchor text, including the following (a short HTML sketch of each appears after the list):

  • Naked link – A naked link uses the URL without any anchor text to describe it.

  • Partial Match – Partial match anchor text is made up of keyword variations.

  • Exact Match – Exact match anchor text will use an exact keyword to match the page it is being linked to.

  • Branded – Branded anchor text uses the brand name of the page it is being linked to.

  • Generic – Generic anchor text isn’t related to the brand or keywords but gives the user a direction, such as “follow this link” or “click here”.

  • Image – Google will use the alt attribute of the linked image as its anchor text.
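
The sketch below illustrates each type of anchor text in HTML; the URLs and the brand name ‘Example Trattoria’ are hypothetical and only there to show the pattern.

  <!-- Naked link: the bare URL is used as the anchor text -->
  <a href="https://www.example.com/">https://www.example.com/</a>

  <!-- Exact match: the anchor text is the target keyword -->
  <a href="https://www.example.com/italian-restaurant-highgate/">Italian restaurant in Highgate</a>

  <!-- Partial match: a variation of the keyword -->
  <a href="https://www.example.com/italian-restaurant-highgate/">book a table at a Highgate Italian restaurant</a>

  <!-- Branded: the brand name is used as the anchor text -->
  <a href="https://www.example.com/">Example Trattoria</a>

  <!-- Generic: no keyword or brand, just a direction -->
  <a href="https://www.example.com/">click here</a>

  <!-- Image link: Google uses the image's alt attribute as the anchor text -->
  <a href="https://www.example.com/"><img src="/images/logo.png" alt="Example Trattoria logo"></a>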

Answer The Public 

Answer The Public is a tool used for keyword research that considers the questions users might search for. Businesses can use Answer The Public to get a better idea of user intent to reach and cater to potential new customers.

Understanding the questions customers are asking is the perfect way to plan high-quality content that answers them.

Attribute rel=”nofollow” 

When linking to a page you do not want your site to be associated with, you have the option of adding a ‘nofollow’ attribute. This tells search engines not to pass authority through the link.

Google does not act on this request as a directive but takes it as advice.
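
A minimal sketch of the attribute in HTML, using a hypothetical URL and link text:

  <!-- rel="nofollow" asks search engines not to pass authority through this link -->
  <a href="https://www.example.com/untrusted-page/" rel="nofollow">a page we don't vouch for</a>

Google also recognises the related rel="sponsored" and rel="ugc" values for paid links and user-generated content respectively.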

B

BERT

BERT was a major Google algorithm update that was launched in October 2019. BERT, which stands for Bidirectional Encoder Representations from Transformers, improved the search engine’s ability to understand search queries.

The technology was designed to understand search intent by picking up on distinctions in natural language. The system was also able to learn and improve over time.

The aim of this was to provide users with relevant, accurate, and authoritative search results.


Backlinks 

Backlinks are links from other websites to a page on your site. Google recognises backlinks as a show of confidence or a recommendation from another site, and a strong backlink profile will help a site rank highly for relevant searches.

Backlinks have different values, and high-quality, authoritative sites linking to a relevant site will hold significantly more value than links from a new site with a poor PageRank.

Multiple backlinks from high-ranking sites in the same field will typically result in higher organic rankings.

Reciprocal linking can be beneficial to both sites, as long as they are both held in high regard and relevant to each other.

Poor backlinks can devalue a page and should be avoided. Reciprocal links with sites that have no relevance to the other site’s field can work against both of them.

Irregular link patterns and links that have been bought from link farms can also result in the site being penalised.


Banner 

Banner advertising places adverts for one site’s products or services as banners on other websites.

This form of advertising can be successful because it can be used by brands and businesses whose sites you have visited to target you. They will promote the products you have been looking at or added to your basket in an attempt to get you to shop with them.


Black Hat SEO

SEO agencies will use various tactics and strategies to try and improve your SERP rankings. Black hat SEO techniques are those that fall outside the guidelines Google sets, in an attempt to bypass them and enjoy easy results.

The quality of the user’s experience often takes a back seat to achieving results when black hat techniques are used.

While this can be a great way to enjoy quick and easy success, it does leave the site open to penalties.

Google regularly updates algorithms to detect these kinds of tactics, and trying to bypass the guidelines can result in Google penalising the website. Penalised websites will face a drop in SERP rankings and organic traffic as a result.


Blog 

Blogs (weblogs) are typically informal websites that comment on the blogger’s particular interests.

Bloggers are content creators who manage their own sites and regularly update them with informational posts. Bloggers and content creators hold a great deal of influence and can be great for businesses to work with.

A blogger can directly promote a product, business, or brand to their following, which can be a great form of advertising. Links from blogs can also have SEO benefits that are worth exploring to improve a backlink profile.


Blogger Outreach

Blogger outreach is the practice of reaching out to bloggers to improve brand awareness, encouraging them to write about a business, brand, or product, or to link to a website.

Bloggers can benefit from paid promotions or linking opportunities, or free products or services that they can then review.

A blogger with a large following and an authoritative site in a specific field can be a valuable advertising tool. Links provided can also hold a lot of value and improve SERP results.


Bounce Rate

The bounce rate of a site is the metric that measures how many people visit your site and leave before visiting other pages or engaging with content.

A high bounce rate shows that visitors are not satisfied with the site. Google takes this seriously as it wants users to receive the best search results.

Poor content, slow site and page speed, excessive pop-ups, and not being relevant to the search query are all reasons for a high bounce rate. Improving these can result in improved SERP rankings.


Branded Keyword 

Branded keywords include the brand name in the search query. SEO campaigns should balance their focus between branded and non-branded keywords.


Breadcrumb Trail 

Using navigation tools like breadcrumbs improves the user journey and is looked on favourably by Google. Breadcrumbs are a text path that shows the user where they are on a site.

They give the user a structured and navigable way to move around the site. Breadcrumbs typically appear near the top of a page and can often be added with downloadable plug-ins.

Any service or tool in place that improves the user journey will typically benefit the site. They make it easier for visitors to move back and forth rather than leave the site.
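
As an illustrative sketch with hypothetical page names, a breadcrumb trail is often marked up as a simple HTML list near the top of the page:

  <!-- A breadcrumb trail showing where the current page sits in the site hierarchy -->
  <nav aria-label="Breadcrumb">
    <ol>
      <li><a href="/">Home</a></li>
      <li><a href="/restaurants/">Restaurants</a></li>
      <li aria-current="page">Italian restaurants in Highgate</li>
    </ol>
  </nav>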

C

CMS

A Content Management System allows multiple users to manage, edit, update, and control content on a website. CMS platforms are relatively straightforward to set up and learn, making it easier for anyone to manage their own website.

WordPress is one of the most popular CMS platforms and allows users to easily set up a site using pre-defined templates. Different plug-ins can be installed to personalise the site and add functionality.

CSS

Cascading Style Sheets are used alongside HTML as a style sheet language to describe how the HTML elements of a web page will appear to users.

CSS concerns how web pages will appear, including the colours and fonts, as well as how different devices view the page.

CSS is regarded as one of the most widely used web languages, along with JavaScript and HTML.
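
As a minimal sketch (the colours, font, and class name are hypothetical), CSS rules sit in a style sheet or style block and describe how the HTML elements below them are displayed:

  <!-- The <style> block (or an external stylesheet) controls how the HTML below is displayed -->
  <style>
    h1        { font-family: Georgia, serif; color: #1a1a2e; }
    .cta-link { background-color: #0a7d5c; color: #ffffff; padding: 12px 24px; }
  </style>

  <h1>Italian Restaurant in Highgate</h1>
  <a class="cta-link" href="/book/">Book a table</a>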

Caffeine

The Google Caffeine algorithm update was rolled out in 2010 and was designed to provide a quick, efficient web indexing service.

The new way of collecting information and crawling sites resulted in significantly quicker indexing. The new system was developed to handle the increased amount of traffic and websites that the older system struggled with.

While modernising the indexing system, Google also considered the growth of other media types and how users accessed the internet.

Call To Action (CTA)

Calls to action are prompts for users to carry out an action. A call to action can be a great way to encourage interaction and conversion on a web page.

A typical call to action can be found on most contact pages that provide a “Call now” button or something similar. A call to action should be unambiguous and clearly state the intention.

Using a call to action can be an effective way to increase an audience or make a sale.
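
As a minimal sketch with placeholder paths and a placeholder phone number, a call to action is often just a clearly labelled button or link in HTML:

  <!-- A clear, unambiguous call to action on a contact page -->
  <a href="tel:+441234567890" class="cta-button">Call now</a>
  <a href="/contact/" class="cta-button">Request a free quote</a>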

Canonical URL 

Canonical URLs are used to advise Google which URL is preferred if there are several similar pages. This makes it easier for crawlers to gather information and signposts the URL you would like to show up in search results.

As well as displaying a preference, it helps to remove issues with duplicate content and consolidates URL information.
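
A minimal sketch, assuming a hypothetical preferred URL: the tag sits in the head of each similar or duplicate page and points at the version you want indexed.

  <!-- Placed in the <head> of similar or duplicate pages, pointing at the preferred URL -->
  <link rel="canonical" href="https://www.example.com/menu/">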

Clickbait 

A common tactic to encourage traffic to a site, clickbait is typically frowned upon. This is because it fails to deliver the value it promises with a sensational headline or attention-grabbing image.

While clickbait can be a great way to engage with users and encourage them to visit a site, it is widely regarded as deceptive. Though it is not punishable by law, it will typically result in unsatisfied visitors.

Poor quality content and a high bounce rate will mean that SERP results will suffer. 

Cloaking 

Cloaking is a technique that disguises the true content of a site so that it ranks for other keywords and search terms. Users will click on a site only to discover they can’t find the information that drew them to it.

Pages can use coding to misrepresent the information that is crawled, tricking search engines.

This is a black hat SEO technique that will result in the page being de-indexed and receiving significant penalties if discovered.

Commercial investigation queries 

These types of queries concern the behaviour of the user and their search intent. In this case, the searcher is comparing products or services before making a decision, so the query is research-based.

These queries provide search engines with valuable information about keywords and search behaviour. They are also beneficial for keyword research.

Competition 

Competition in SEO can mean two different things. The first is direct competitors for the business and website. Direct competitors will sell the same or similar products or provide similar services.

The second is SEO competitors that are competing for the same keywords and SERP rankings.

The impact of both will differ depending on whether you are focusing on local SEO or general keywords. Direct competitors online can be based anywhere in the world. Local search SEO would only put you in competition with similar businesses nearby.

There will typically be some crossover with similar companies chasing similar keywords.

Competitor Research 

A good SEO strategy will work to improve a business’s website and SERP rankings. One of the best ways to do this is to analyse competitors in the same field that are ranking well.

Competitor research will help to provide you with valuable information that gives a far greater insight into what on-site SEO and content works.

You will be able to discover the tactics that work for your competitors and emulate them. You will also be able to pinpoint weaknesses that will make it easy for you to surpass them.

Keyword research on a successful competitor site will help you discover the keywords you should be targeting.

Understanding competitors will give you a significantly greater understanding of the field you are in. It will also give you a greater chance of disrupting the market and ranking for specific keywords.

Content 

Content is a broad term that covers the text, video, audio, and images that are used to populate the website.

Content is an extremely valuable part of SEO as it gives the user the information they are looking for. Google aims to provide the best search results to users. They try to provide relevant websites that answer the search query to the best of their ability.

High-quality content is a key part of generating organic traffic to a website. When ranking websites for specific keywords, Google insists on high-quality, unique content that provides value to the searcher.

Website content is also where you will be able to focus on keyword optimisation. In terms of SEO, the written content is still easier for search engines to crawl. This makes it a strong focus for SEO agencies.

High-quality image and video content provide value for the user. Balancing SEO-friendly content with user-friendly content is essential. You should want your website to rank highly and increase the flow of organic traffic. You should also want to make the onsite user experience and journey good enough to drive conversions and return visits.

Content Delivery Network 

Content delivery networks are the servers that distribute the content you create to the users you are targeting.

CDNs are based on location and were seen as a way of coping with the expansion of the internet in the 1990s.

Content delivery networks serve static resources and HTML from servers close to the user’s location, making it possible for people to receive information quickly.

Contextuality  

The content created for a website has to take into account context and the intent of the targeted audience.

Google updates have improved how it analyses context, but maximising contextuality is still important. Contextual content will focus on a topic or product but also discuss benefits, similar products or topics, and their relation to the original one.

It provides the audience with a broader description and helps Google understand the relevance concerning search intent and similar products and topics.

Conversion Rate 

Conversion rates are a way of measuring the percentage of people that have carried out a specific action after visiting your website. This could be anything from signing up for an email list to purchasing a product.

Conversion rates are an essential metric that will help you understand whether your site is successful.

When looking at website analytics, the conversion rate will let you see whether there are specific failings. Poor conversion rates can be a result of attracting the wrong kind of audience to your site. It can also demonstrate that a website offers a poor customer journey that is complicated for users to follow.

Organic traffic is essential for improving conversion rates. Ranking for relevant keywords will bring an audience looking for specific information or products. Keyword research is essential for targeting a relevant audience and improving your conversion rate.

Cookies 

Cookies collect information about your browsing habits and help to provide more relevant ads. Cookies are seen as a way to improve the user journey by improving relevance, and most people will recognise them from agreeing to terms when visiting a new website.

HTTP cookies are used to personalise your online experience by tracking the things you look for and the pages you click on. They provide tailored advertisements for things you should be more interested in.

Magic cookies are an older form of cookie used to transfer information, typically for logging in to database systems.

Core Algorithm 

Core algorithms are seen as fully functioning algorithms that serve the purpose they were designed for. This means they will remain largely unchanged.

Google’s core algorithm plays a significant part in ranking websites. Core updates tend to focus on a specific function rather than making wholesale changes.

Tweaking and adapting the core algorithm to cope with the evolution of the internet and user habits tends to be the main reason for updates.

Crawlability

Search engine crawlers are used to navigate websites and index them accurately. The ability to crawl a site is influenced by site speed, image and video optimisation, the use of redirects, internal linking, and site maps.

The crawlability of a site is taken into account and reflects how easy it will be to navigate for users. Google recognises user experience and journey as being key factors when ranking sites. Providing an easily crawlable site shows that users will enjoy a better experience, resulting in better SERP rankings.

Crawl Depth 

Crawl depth defines how many levels deep search engines must go to index a website’s pages. It takes into account the main pages, the sub-pages that branch off them, and how deep a crawler has to go to reach important pages.

Homepages will be at a depth level of 0, with pages linked off them having a depth of 1. Websites benefit from having their most valuable pages no more than three clicks from the homepage.

Crawlers

Spiders, bots, Googlebot, and web crawlers are all terms used to describe crawlers. Crawlers are used to move through the internet, analyse websites, and index them.

Crawlers analyse each site’s navigation, content, and performance. This information is used to provide accurate and up-to-date SERP rankings.

Crawling

Crawling is carried out by search engine crawlers to index web pages and websites. Crawlers collect valuable information from websites to provide accurate and up-to-date rankings.

Crawling consists of the web crawlers moving through a website to explore the structure and links. Site performance and other information are then used to provide an assessment of its quality.

Cross-Linking

Cross-linking is linking from one page to another relevant and authoritative source to increase the value and relevance of both.

This is an important SEO technique that can show a page has authority by linking to other sources of information that are of high quality and provide value to the article.

The quality of the site and content you link to is an important part of how valuable the link will prove to be. The anchor text used is another important factor that will make a difference in the value of cross-linking.

A site that links to authoritative sites to add value should see improvements in SERP rankings. It can also be a good way of building relationships with other relevant websites with high PageRank. This can result in reciprocal links, helping to build a strong backlink profile.

Customer journey 

The customer journey covers everything from their experience using a website to the process of making a purchase or carrying out an action.

The customer journey can be as simple as clicking on a search result for a product and buying an item within minutes. Customer journeys can also take considerably longer and include research and comparisons.

The quality of information, the choice of products or services, price, and site performance will all play a part in ensuring the customer journey is a good one. The customer should be able to navigate a site easily and find the information or product they are looking for. Advertising and the onsite content should make a positive impression on the customer and maximise the likelihood of a return or action.

Customer Lifetime Value 

Customer lifetime value, also known as CLV, is the metric used to measure a customer’s long-term worth to a business. After a customer makes a purchase or uses a service, it is easier and less costly to try and keep them as a customer than to search for new customers.

A customer’s average value measured across the average customer lifespan determines their worth to a company.

Incentives offered to first-time customers, such as a discount on a second purchase, can help to build a strong relationship.

CLV can help businesses understand how much to focus on new customer acquisition and advertising.

D

Data 

Data is the broad term used for describing information. In an SEO setting, data will typically be empirical evidence, facts, and figures.

Data is gathered to analyse the performance of websites, reference other sites, and ultimately make the decisions that impact the success and popularity of a site.

Data gathering, processing, storing, and reporting are all essential factors in any SEO strategy, from planning to implementation, right through to reporting.

Dead-End Page 

A dead-end page can impair the flow of the user’s journey. It will also be recognised by web crawlers as offering no call to action, internal links, or external links.

While providing high-quality content is essential, ensuring the user journey is good and the page has an end goal is vital to the success of a site.

Deep Linking 

Internal and external linking is an important part of on-site SEO. Deep linking is when you link to a specific inner page that is directly related to the content it is linked from, rather than to the homepage.

There are benefits to linking to relevant sites, and you will be rewarded for taking the time to link to the most relevant page on the site.

A deep link is also a great opportunity to link to news sites or blogs that will be more general overall but have a targeted piece about a specific topic that holds value.

De-indexed 

De-indexing is when a website is removed from Google’s index and won’t be found in search results. This has the potential to be devastating for any business that relies on organic traffic.

A site will be de-indexed if Google’s crawlers determine that a website has breached its guidelines.

As Google strives to provide high-quality, authoritative, and relevant links for search queries, it seeks to clamp down on spammy sites. Unnatural backlink profiles and spamming comments are recognised as suspicious behaviours.

Prevention is the best method in this case, and following Google’s quality guidelines will ensure your site is not de-indexed.

If a site is penalised in this manner, removing spammy links and ensuring the site adheres to the guidelines will be necessary. After you remove or disavow links, you will then have to submit a request to Google for reconsideration. The site will then be checked by Google before being re-indexed.

Digital PR

Raising brand awareness will always be important for a business. This works to improve brand visibility and trust, two essential factors for your online presence.

Digital Public Relations relates to creating content that can be taken to online publishers to share. Content has to be relevant to sites, high-quality, and provide value to its users.

The reward for providing high-quality content to other publishers is the credit they give by citing you and linking back to your site.

This increases brand awareness among a new audience, recommends you as a credible source, and advises Google that you are an authoritative source.

Disavow Links 

Disavowing links is an option available to websites that want to remove bad links.

Spammy links can damage a site’s reputation with Google, and your first step should be to manually request removal of the link directly from the website it is coming from. This is not always possible, and disavowing a link notifies Google that the link should not be counted.

Disavow Tool 

This tool allows websites to remove the value of incoming links if they are seen as low-quality. This is an essential task as sites can be penalised for irregular link patterns or links from spammy sources.

Links should go to and come from relevant, authoritative sites to hold value with Google. Poor backlink profiles can result in websites suffering in SERPs or, in some cases, being de-indexed.

Domain Authority 

A website’s domain authority, or DA, is a score that was developed by Moz, an SEO software and research company, to estimate how authoritative a site is within its subject or industry.

The DA score of a website will be anywhere from 1 to 100, with 100 being the highest score available.

This metric predicts how a website might rank in its field based on how authoritative it is.

High-quality content that is relevant to search terms and a strong backlink profile help improve this score.

A lot of the metrics that improve a site’s domain authority will also go to improving SERP results, making it a valuable measurement for website owners and SEO agencies.

Domain Name 

Domain names are the website’s address. This is what users can enter into the browser for direct results should they know of the business. Businesses will use the domain name in advertising to encourage visits from potential new customers.

Domain names are an important part of your business and hold SEO value, making it essential that a strong domain name is selected.

SEO-friendly domain names should consider the following tips:

  • Use a top-level domain such as .com, .edu, or .org. Country or location-specific domains can also provide value for local businesses.

  • Keep the name easy to remember and type, and avoid hyphens and numbers.

  • Keywords can be beneficial, but relevance to the business’s service matters more than keyword-rich text. Overuse of keywords or search terms can be seen as spammy.

Doorway Page 

A doorway page is used to introduce other pages and contains optimised content that should rank for specific keywords.

Doorway pages aren’t as common as they once were as they are largely associated with spammy websites that sacrifice quality and value. If used correctly, a doorway page should provide value to the customer with unique content that encourages them to carry out an action.

Search results can be cluttered with this type of page, which claims to offer what the searcher is looking for without actually providing the service or answer.

Duplicate Content

Duplicate content is when the same content is used elsewhere online. This can be on the same site or from a different site.

Duplicate content can be penalised as Google places a lot of value on original, unique content that offers the user the answer to their query.

In some cases, duplicate content is required by a business and may feature across several pages. SEO agencies will be able to canonicalise this content to direct Google to the most valuable page containing this content. The other content would then be discounted, avoiding potential penalties.

Dwell Time 

The dwell time on a website lets the business know how long someone spends on a page after being directed from a search engine.

This metric starts with the user clicking on the site and ends when they leave. This is a great way to measure how engaging a website is. If the bounce rate is high and user dwell time is low, it indicates something on the site is not working as it should be.

The goal of a high-quality website is to lower the bounce rate and encourage people to stay and carry out the intended goal.

The dwell time typically helps to reflect this. However, there may be some cases where a customer makes a quick purchase resulting in a low dwell time.

Dynamic URL 

Dynamic URLs can be used by sites that store their content in databases and pull it on demand. Certain characters are common in dynamic URLs, like ‘%’, ‘&’, and ‘$’, making them easier to identify.

Different URLs can serve the same content, which is why many web admins prefer static URLs.

E

Editorial Link 

An editorial link is a high-value link from an authoritative source.

Editorial links will typically link to a site that wants to reference a well-written piece of content.

High-value links like this will be significantly more valuable to building a strong backlink profile than paid links and will assist in improving SERP rankings.

Engagement

Engagement is when users interact with the content you provide. This is one of the main results SEO and advertising campaigns aim for.

In order to enjoy a good conversion rate, customers need to click on and interact with the content provided. Whether a campaign is aimed at sales, enquiries, email sign-ups, or other targets, they all require user engagement.

User engagement can be measured to understand user behaviours and the success of the content provided. Learning what users prefer will help to improve the success of future campaigns. 

Evergreen Content

Content that is not time specific and will be relevant for the foreseeable future is considered evergreen content.

In-depth guides that deal with absolutes and range from four to six thousand words will help to establish the site as an authoritative source on the subject.

Authority and relevance are significant parts of what Google looks to reward. Providing high-quality evergreen content, as well as regularly updated blog posts, is seen as a combination that will provide success.

Original posts can also be updated should important new information come to the fore, meaning your staple pieces will still hold relevance even if major industry changes take place.

Exact Match Keyword

Exact match keyword strategies allow the site to focus on a specific term and will typically be used in a PPC campaign.

Paid advertising strategies can be used to build brand awareness by targeting a range of keywords. Alternatively, if you want a more specific reach, exact match keyword campaigns are perfect.

Exact match keywords ensure the advert is only shown to those who search for that exact term. Similar queries can result in wasted clicks and poor conversion rates if the keyword is too broad.

External Link

External links take users to another website. They are also known as outgoing or outbound links.

While it may seem counterintuitive to direct people away from the site, Google recognises when relevant and authoritative sites are referenced and linked to each other, and it helps to build trust. This will help to improve SERP rankings.

An external deep link is when you link to a specific page on a site rather than the home page. A good deep link will help to improve relevancy.

F

FTP

The File Transfer Protocol is a system that transfers files between a computer and a server. Websites without a content management system will require FTP for transferring web page files from your computer to the server.


Favicon 

Favicons are more related to brand identity than SEO but should be part of a successful web strategy.

They are the small images that are seen on page tabs and drop-down menus. At just 16 by 16 pixels, they are small and are typically used for basic company logos.
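
As a minimal sketch with hypothetical filenames, a favicon is referenced from the head of a page like this:

  <!-- The classic 16x16 favicon, plus a larger PNG for modern displays -->
  <link rel="icon" href="/favicon.ico">
  <link rel="icon" type="image/png" sizes="32x32" href="/favicon-32x32.png">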

Brand identity and recognition are an important part of being successful online. Consistent branding across a brand’s website, social media, and advertising will help customers recognise them more easily.


Featured Snippets

Featured snippets are found in a box above the top result and summarise an answer to the search query. This can sometimes be referred to as position zero and is a great boost for a site.

As well as containing a summarised answer to a query, it will also include a link to the site it is from. This shows that Google believes the answer is the best for that specific query, and the site it comes from has authority.

This underlines the importance of high-quality, relevant information to specific search queries. Keyword research and analysing competitor sites will help businesses find the most relevant search queries for their field.


Follow Link/Do-follow Link 

Links that pass authority between sites are do-follow or follow links. However, do-follow links that have been disavowed will no longer pass authority to the site they are linking to.

It is important to have a strong backlink profile to improve SERP rankings. Relevant links from pages with a high DA are regarded as authoritative and hold a lot of value.

Not all links will help a website, though. Links from sites that don’t have any relevancy or paid links from link farms can damage websites. In some cases, if a website has irregular link patterns, the site can be de-indexed until the links are removed or disavowed.


Friendly URL 

Friendly URLs are easy for search engines to read. They will be simple and structured to show a clear path of how the page fits into the hierarchy of the site.

A friendly URL will typically be rewarded with a better SERP ranking if all other metrics are equal.

G

Geo-dependent request 

A geo-dependent request is a search query that is based on a location. This could be something like ‘plumbers near me’ or ‘plumbers in Manchester’.

Local SEO is an important part of improving organic traffic from geo-dependent requests. Alternatively, bidding for ad space in local searches can also help websites reach their target audience.

GET Parameter

Query strings, URL parameters, or GET Parameters can adjust how pages are viewed or gather data.

URL parameters can be used to order or filter content. Some parameters can be set to work with tools like Google Analytics to analyse data.

While it can be a great way to gather data, too many parameter URLs can be counterproductive to SEO strategies. Experienced SEO agencies will not overuse this option and risk SERP rankings.

Google

Google is a multinational tech company from America that is best known for its search engine.

According to Statista, Google enjoyed a 92.47% market share of global search engine traffic in June 2021, making it far and away the most used search engine.

After launching in 1998, Google grew rapidly and regularly updated its algorithm to ensure it provides the most accurate, relevant, and authoritative search results.

Google now has an extensive range of products and services that dominate many markets. These include Android, the mobile operating system that many devices run on, the Gmail email service, Google Analytics, which helps websites monitor data, and much more.

Google is an innovative company that constantly seeks to explore new markets and consolidate its position in markets it already dominates. It does this by continually updating and improving services to follow new trends in technology and customer behaviour.

Google AdWords

Google AdWords is a paid advertising service offered by Google. This allows websites to appear above the organic results for a specified search term or keyword.

Advertisers submit their ads and agree to pay a set amount per click. This amount will vary depending on how competitive a keyword is.

AdWords ads are clearly marked as adverts. Google AdWords differs from organic search results as a business will be visible to potential customers without having to optimise its site with high-quality content.

Google’s organic results are still essential to many businesses as they are seen as being more trusted results that have earned their position rather than the ads that paid to be there.

Google Alerts

Google Alerts were launched in 2003 and provide a service that sends notifications if there is activity on search terms and keywords.

This service makes it easier for businesses to target and monitor products and services to see if there are any outreach opportunities. It also makes it easier to monitor competitors.

Google Algorithm 

The Google algorithm is a complex program that is used to rank search results and index data. It is made up of a combination of algorithms that consider a wide and varied selection of factors. The Google algorithm intends to provide the most authoritative and relevant search results to users.

Google Analytics 

Google Analytics was launched in 2005 as a way to track website performance. Analytics is an essential part of SEO as it allows website owners and SEO agencies to track important metrics that make it easier to make targeted campaigns.

Google Analytics has the most users of any analytics service available online, underlining its importance in monitoring performance.

Google Bomb

Google Bombing is the practice of building large numbers of links, often with a specific anchor text, so that a page ranks for search terms that aren’t relevant to the services or products it sells or provides.

This is a black hat technique that aims to negatively affect competitors and even make a political statement in certain cases. For example, arguably the most famous Google Bomb caused pages about President George W Bush to appear when searching for the keyword ‘miserable failure’.

Google Bowling 

Google Bowling is another black hat SEO technique that sees agencies build a series of unnatural links to a competitor. The aim of this is to see the other site penalised for the spammy links.

Google Dance

The Google Dance is a term used to describe SERP rank fluctuations after a Google algorithm update. The changes to the algorithm could result in sites moving up and down over the course of a couple of weeks before settling back down.

Google Keyword Planner

The Google Keyword Planner is a free tool that can be used to generate keywords and understand how much Google Ads would cost for them.

Whether you plan to optimise your site to improve organic traffic or want to place advertisements, understanding the best keywords and the volume of traffic they generate is essential.

This can be a great way to discover groups of keywords that may have been overlooked.

The Google Keyword Planner isn’t completely accurate and provides general estimations. However, it does suggest unique alternative keywords that can set you apart from competitors that don’t use the tool.

Google Maps

Google Maps is a full-service mapping system that is free to use and provides a street view and directions service. Users will be able to select a mode of transport and be provided with the best directions from one location to another. It also provides an estimated time for the journey.

Google Maps is also integrated with Google My Business, meaning businesses that have taken advantage of the free service will appear on the maps. This makes it easier for customers to find and travel to businesses.

Google My Business

Google My Business is a valuable free tool that businesses can use in a similar way to local directories. Google My Business will benefit businesses in many ways, including local or geo-based searches.

A fully completed profile offers numerous benefits and provides customers with a great resource to find directions, opening times, contact details, and the website.

Google My Business profiles linked to a website and fully verified will help a business be more visible to its local community and improve organic traffic.

Google News

Google News is tailored by its algorithm to provide a personalised feed based on the user’s behaviour and search patterns. The app allows users to select topics they want to follow and provides quick news snippets throughout the day.

Fact-checked news stories are available, as are local news and weather services.

Interacting with the services provided will help tailor it further to your behaviours and patterns. 

Google Search Console

Google Search Console is a service that web admins can use to measure site performance and traffic, resolve issues, and check indexing.

Hacking, malware, and other security issues can be resolved using the search console.

While most users only check it to see user interactions, it can prove to be a valuable SEO tool when used to its full capabilities.

Google Tag Manager 

Another free application, the Google Tag Manager, is used to manage and deploy marketing tags on websites or apps without the need to modify code. GTM makes it easy to handle multiple tags and stores all the code in one handy location.

Grey Hat SEO 

If you have heard the term “a grey area”, you will understand that it is difficult to define. While white hat SEO is seen as the correct way to optimise websites, and black hat SEO is seen as using techniques that go against the guidelines, grey hat SEO falls somewhere in between.

Grey hat SEO uses techniques that are on the border of the guidelines. This can be a risky tactic as Google algorithm updates could change to mean it is no longer permitted, resulting in penalties.

Grey hat SEO can provide good search engine results and improve organic traffic. However, if it is then deemed to violate Google’s terms of service, the loss of traffic from penalties could cost a business thousands of pounds.

Guest Posting 

The reciprocal practice of guest posting typically benefits both parties.

The guest poster will benefit from the exposure of appearing on another site, as well as being able to post a link back to their site. Guest posting on an authoritative site and benefitting from backlinks will help with SERP rankings.

The site the post is added to also benefits from a high-quality piece of content that is relevant to their field. An outbound link to a relevant site is also beneficial when the site is of high quality.

Both parties should benefit from guest posting as it will improve relevance, authority, and trust in both sites.

Establishing good relationships with other relevant sites can be a great way of improving the success rate of your blogger outreach.

H

HTML

Hypertext Mark-up Language is the code used to create applications and web pages. Search engine crawlers read a site’s HTML before indexing it.

HTML is vital to SEO and is core to web development. Most technical SEO is carried out in the HTML source code. Technical SEO optimises and cleans the HTML so it is easy for web crawlers to inspect and read the site, resulting in improved SERP rankings.
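
A minimal sketch of the HTML a crawler reads, using a hypothetical business and page:

  <!DOCTYPE html>
  <html lang="en">
    <head>
      <title>Italian Restaurant in Highgate | Example Trattoria</title>
    </head>
    <body>
      <h1>Example Trattoria</h1>
      <p>Fresh, handmade pasta in the heart of Highgate.</p>
      <a href="/menu/">See our menu</a>
    </body>
  </html>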


HTTP (Hypertext Transfer Protocol) 

Hypertext Transfer Protocol transfers data between devices that are networked. Web pages are loaded by using hypertext links, with HTTP being a foundation for the internet as we use it.


Head Section 

When a page loads, there is a section at the top of the HTML document that won’t display in web browsers. This is the head section.

The head section contains CSS links and metadata. It sits between the opening HTML tag and the body, allowing it to be crawled without being displayed.

Head section optimisation is regarded as technical SEO and allows page descriptions and keywords to be presented to Google.
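
A minimal sketch of a head section containing the metadata and CSS link described above; the title, description, and file path are hypothetical:

  <head>
    <meta charset="utf-8">
    <title>Italian Restaurant in Highgate | Example Trattoria</title>
    <meta name="description" content="Family-run Italian restaurant in Highgate serving handmade pasta and wood-fired pizza.">
    <link rel="stylesheet" href="/css/styles.css">
  </head>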


Homepage

A website’s homepage is the default page that appears when the domain name is entered.

A homepage will typically provide visitors with the contents of the site, so it is easy to navigate.

Business information should explain to the visitor the purpose of the website, how to contact the company, and set the tone for what visitors can expect from the rest of the site.


Hyperlink 

Hyperlinks are links that take you from the text you are reading to another website or page on the site.

A hyperlink can use an image, icon, or anchor text and will make it easy for visitors to move to relevant pages or websites.

I

IP Address

A website’s location online is signified by its IP address. Some IP addresses will have assigned domain names, but the address itself is made up of numbers.

Several websites can share an IP address if they are on the same server. Dedicated IP addresses are shown to offer better performance, and because site speed is a crucial ranking signal, this can improve a ranking.

Image Carousels 

Image carousels provide an option for sites to show multiple images without taking up too much physical space. They can automatically switch through pictures or can be swiped through by the user.

There are questions about how effective they are, with poorer conversion rates in comparison to other techniques.

Inbound Link

Building a solid backlink profile is essential to most SEO strategies. A backlink or inbound link is a link coming in from another site.

Inbound links from trusted, authoritative sites are beneficial to the websites they are directing traffic to in many ways. Exposure to a possible new market is invaluable to all websites.

Google also recognises when authoritative sites link to others as a recommendation of the content or product being linked to.

The more good backlinks a site can accrue, the more trusted it will be in Google’s eyes, so it will rank higher.

Relevance is another important part of a strong backlink profile, with relevant sources linking to a site improving its reputation.

On the other hand, poor backlinks can damage the ranking of a website. Irregular linking patterns, irrelevant sites, and paid links from link farms can all contravene the Google guidelines. This will result in sites being punished, dropping down the rankings, and seeing a reduction in organic traffic and revenue.

Index Coverage Report 

An index coverage report lets you know a URL’s current status with Google.

This is a great way to keep up to date with site performance and quickly respond to any page error messages.

Full coverage is better for smaller sites, as larger sites may find less relevant pages being indexed, damaging the site’s overall ranking.

Indexed Pages 

Indexed pages have been crawled by search engines and appear in their database. New sites may have to wait for the search engine to work its way round to them, or the webmaster can request Google indexes them manually.

Sites with indexed pages will have better domain authority. Those with a good domain authority are more likely to rank for keywords and search terms.

If a site is not indexed after a long time, there may be a problem with the sitemap or site structure that is causing issues with crawling.

Indexing

The process of search engine bots crawling your website and adding it to the search engine is known as indexing. Google’s crawlers move through the site to understand the content and structure and rank it based on a number of factors.

Continually adding high-quality, relevant content to the site and improving the backlink profile will result in a higher ranking on the SERPs.

SERP rankings determine where the site is positioned when someone searches a specific keyword or search term. An SEO strategy should improve the site, so crawlers report back that the site is of a high standard. 

Infographic 

Providing users with engaging and captivating content is a great way to ensure they stay on your site. Infographics are an excellent way to do this. Infographics are a visual representation of data.

Infographics make it easy and enjoyable to digest information that might look boring or stale in a block of text.

Using relevant images and short pieces of information, infographics can also be used to promote brand awareness.

Branded infographics containing shareable information or details about your business can appeal to many browsers. In an age when social media is a strong form of marketing, creating shareable content is vital.

The more an infographic is shared, the more people will see your branding and the information you put out there. It also helps with visibility as social profiles that enjoy interactions across a broader audience will typically find their posts reaching more people in future.

Information Architecture

Improving the crawlability and navigation of a site is essential to both SEO and the user experience. Information architecture organises the site efficiently, so it is suitably structured.

When Google indexes a site, its crawlers have to travel around the site by following links to gather information. If Google’s bots cannot access certain pages as a result of broken links, this will be relayed back to Google and the site may rank poorly. In the worst-case scenario, the site, or at least certain pages, might not be indexed at all.

The navigability of the site is also essential to how visitors interact with it. If a site is difficult to move around because of poor internal linking or a complicated layout, visitors are far more likely to return to the search results and look elsewhere. A high bounce rate will negatively affect SERP rankings as well. 

Informational Queries 

Informational queries are searches made to gather information on a subject. An informational query is made by the user by searching for a phrase or keyword.

Search engines can categorise different search types so they can provide the most suitable results.

Different interpretations of queries will drive different results, and search engines are programmed to find the most relevant keywords or terms used.

Intent/User Intent

User intent is important for search engines and SEO strategies as it defines what the user wants to get from a search result. Understanding intent allows search engines to provide the most relevant search results.

From an SEO point of view, intent allows certain keywords to be targeted and onsite content adapted to suit the users’ needs.

Monitoring conversion rates and bounce rates for specific pages and search terms can help SEO agencies better understand user intent and behaviour. They can then optimise the pages for the keywords best suited to achieving a target.

Internal Link

Internal links are links that direct users to other parts of the same website. Internal links make site navigation easier for users, help to keep them on the site and improve their overall experience.

Proper linking strategies help with site structure and its hierarchy.

Internal linking is also an important factor for search engines and indexing. When web crawlers visit a site, they crawl through the content to give the site a rank based on a range of factors. Navigability is important when ranking a site and internal links help the crawlers move around the site and understand its structure.
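
As a minimal sketch using hypothetical page paths, internal links typically use relative URLs pointing at other pages on the same site:

  <!-- Internal links using relative URLs keep users and crawlers moving within the same site -->
  <nav>
    <a href="/menu/">Menu</a>
    <a href="/book/">Book a table</a>
    <a href="/contact/">Contact</a>
  </nav>

  <!-- An internal link within body copy, pointing at a related page on the same site -->
  <p>Read more about our <a href="/menu/wood-fired-pizza/">wood-fired pizza</a>.</p>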

Internet Service Provider (ISP) 

An internet service provider is a company you use for your internet access.

Consumers and businesses will have an ISP, and different packages will be available to suit the needs of the user. Most ISPs offer a range of connections with high-speed broadband achieved through fibre optics, satellite, and copper wire.

The largest internet service providers in the UK as of 2022 include BT, Sky Broadband, Virgin Media, TalkTalk, and Vodafone.

An ISP can also provide domain names and registration and is responsible for maintaining the infrastructure that provides its service.

Consumers and businesses will typically agree on a contract length for ISP services for a fixed price. There is a code of practice that covers the service and internet speed that should be provided. This protects businesses and consumers that enter into a contract.

J

JavaScript

Like CSS and HTML, JavaScript is a scripting language that is used in web development and SEO.

JavaScript will typically be used when user interaction needs to create changes on a website, and it can improve the user’s experience.

JavaScript can make it difficult for web crawlers to crawl the site and has been known to affect site speed. Because of this, it is typically used sparingly, which also makes the changes it creates through user interaction stand out more.

K

KPI 

A KPI, or key performance indicator, is used to measure the performance of actions that have been taken or the productivity of employees.

KPIs are used on websites as a target to monitor performance. KPIs should be manageable targets that show a website is achieving specific goals.

Failing to reach a KPI shows there is an issue that needs to be resolved before the desired results can be achieved.

Monitoring progress is an essential part of an SEO strategy as it allows the agency to see what aspects of a site are performing well and which parts need further work.

KPIs for a workforce might include an expected body of work to be completed by a certain date. Alternatively, in a customer-service environment, a KPI might be to do with customer feedback.

 

Keyword Analysis 

Keyword analysis is essential to successful SEO strategies. Industry-relevant keywords should be found and checked for relevance, search volume, and quality.

Getting a mixture of long-tail and short-tail keywords will help you reach a wider audience or people with different search intent.

There is typically a lot of competition for high-volume keywords. Because of this, high-quality content, relevant and authoritative backlinks, and on-site optimisation are key to ranking well for them.

Websites should not ignore the less competitive keywords with a lower search volume. These can often provide great opportunities for a site to rank and achieve organic traffic. The benefit of this will be improved conversion rates and the ability to build authority over time as more users interact with the site.

 

Keyword Cannibalisation 

Keyword cannibalisation is a dramatic term used to describe pages on the same site that compete for the same keywords. Rather than both pages ranking highly, it can lead to diminished authority and reduced click-through rates for each.

Sitemaps will allow a business to focus each page on specific keywords. This minimises the risk of pages targeting the same keywords and cannibalising one another.

An example of this would be a site advertising a plumbing service in London. If the main page targeted the keyword ‘plumbers in London’ and a second page targeting the same phrase was then added, with links out to the different areas of London, the two pages would compete and the click-through rate would be split between them.

A website with more than one page competing for the same term would be better off focusing on the page in the higher position. This results in a higher click-through rate and frees the other page to be optimised for a different keyword.

 

Keyword Density 

Keyword density describes the percentage of a page's total word count that is made up of a keyword.

Using a keyword the right number of times is an important part of SEO. Keyword stuffing, where the keyword is used an excessive number of times, used to be a successful tactic in SEO campaigns.

Google's algorithm has changed over the years and now recognises this as spammy behaviour, so finding the right balance is key. Relevance and natural language are important for high-quality content, so avoiding the overuse of keywords will be important to SERP results.
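
As a rough illustration, keyword density is simply the number of times a keyword appears divided by the total word count, expressed as a percentage. The short Python sketch below shows one way to calculate it for a single-word keyword; the function name and sample sentence are made up for illustration.

  # Minimal sketch: keyword density as a percentage of the total word count.
  # Only handles single-word keywords; a real tool would also count phrases.
  def keyword_density(text, keyword):
      words = text.lower().split()
      if not words:
          return 0.0
      return 100 * words.count(keyword.lower()) / len(words)

  sample = "Our Manchester plumbers handle every plumbing job a Manchester home could need"
  print(round(keyword_density(sample, "Manchester"), 1))  # 2 of 12 words -> 16.7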

 

Keyword Difficulty 

Keyword difficulty defines how competitive specific keywords are. This is an important part of keyword research as it will help to shape a campaign.

When carrying out keyword research, you will see the other sites that rank for the term. This will show how difficult it will be to rank for it and can also help to provide examples of what is successful.

In general, the more a keyword is searched for, the more competitive it will be. SEO agencies will have to weigh up the time and effort needed to rank for the keyword against the rewards.

Less competitive keywords won’t have as many high-authority websites targeting them, so they can be easier to rank for. However, they often won’t have as many people searching for them. It’s a case of weighing up the pros and cons of more or less competitive keywords based on the strength of your site and your budget for the SEO campaign.

A combination of competitive keywords and less competitive terms should help with a website’s visibility and click-through rate.

 

Keyword Frequency 

Keyword density and keyword frequency are similar terms, with the latter showing how often the keyword is mentioned in your content.

If keyword frequency is too high, the content can appear forced and spammy. If it is too low, the content can lose relevance, so striking a balance is an important part of producing content that ranks.

Using synonyms and keyword variations will help with relevance and help content to rank without keyword stuffing.

 

Keyword Proximity 

Keyword proximity describes how close together the individual words that make up a keyword phrase appear in the content.

For example, if the keyword in question was “Manchester plumbers”, a sentence like “if you are looking for Manchester plumbers” has better proximity than “if you live in Manchester and are looking for plumbers”.

This is a relatively straightforward SEO strategy that can be managed while creating content.
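
The idea can be illustrated with a small Python sketch that counts how many words separate the two parts of a keyword phrase; the function and example sentences below are hypothetical and not part of any particular SEO tool.

  # Minimal sketch: number of words separating two keyword terms (0 = adjacent).
  def keyword_proximity(text, term_a, term_b):
      words = [w.strip(".,!?").lower() for w in text.split()]
      positions_a = [i for i, w in enumerate(words) if w == term_a.lower()]
      positions_b = [i for i, w in enumerate(words) if w == term_b.lower()]
      if not positions_a or not positions_b:
          return None  # one of the terms is missing from the text
      return min(abs(a - b) - 1 for a in positions_a for b in positions_b)

  print(keyword_proximity("if you are looking for Manchester plumbers", "Manchester", "plumbers"))              # 0
  print(keyword_proximity("if you live in Manchester and are looking for plumbers", "Manchester", "plumbers"))  # 4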

 

Keyword Research 

Keyword research is core to a successful SEO campaign and is used to identify relevant keywords a site should target for a specific field.

There are a number of keyword research tools that provide the most relevant keywords for businesses and websites. You will then be able to check the search volume (how many people are searching for this term) and competition (how many sites are competing for the term).

Analysing competitor sites to see the keywords they rank for and the content they use to achieve this is a great way of finding what works. You will also be able to discover keyword gaps (where competitors are ranking for keywords that you aren’t), allowing you to target these areas.

Some keyword research tools will provide other suggested keywords and search terms to explore. This can often yield terms that competitors have overlooked and provide a great opportunity for your site to rank highly for less competitive terms.

When optimising a page for keywords, the location and reach of a business or brand are important. This can be because there are subtle differences in language.

For example, users in the UK and Australia might search for “search engine optimisation agencies”, whereas, in the U.S., they would search for “search engine optimization agencies”.

Google has stated that this shouldn’t make a big difference to search rankings. However, it may appear unprofessional to users who might view these differences as spelling mistakes.

 

Keyword Research Tools 

Keyword research tools are the software or websites that are used to determine the most relevant and popular keywords for your website.

Tools will identify exact match keywords and long tail keywords. The reports they produce will determine how competitive the keyword is, the monthly search volume for the term, the cost per click, and other useful metrics.

Some tools identify relevant pages and keyword density. This makes it easier to optimise content and get an idea of how often keywords should be used.

Some of the top keyword research tools include Semrush, Ahrefs, Ubersuggest, and Google Keyword Planner.

 

Keyword Stemming

Keyword stemming is a search engine’s ability to recognise different forms of the same root word, and the intent behind them, in user queries. As search engine algorithms constantly improve, so does their ability to grasp the meaning behind a search and the language used.

For example, if a keyword or search term uses ‘selling’, Google also recognises sale, sold, sell, and other related terms. This makes it much easier to write natural content that flows and provides value to the reader. Content written specifically to crowbar keywords in will appear stilted and provide less value to the user.

Google’s algorithm updates have focused on understanding language and improving the user experience. Because related terms are recognised, websites can write more natural content without losing keyword relevance.

 

Keyword Stuffing 

Keyword stuffing is the overuse of keywords to the point that the content is difficult to read and does not appear natural.

Keyword stuffing used to be an accepted SEO technique before Google and other search engines improved their algorithms to provide search results with well-written content.

Keyword stuffing is now considered to be spammy, and a good balance of keywords across a well-written, informative, and authoritative piece is needed.

For example, a keyword-stuffed piece for Manchester plumbers might read as follows: “If you need a Manchester plumber to resolve your Manchester plumbing needs, our Manchester plumbers will help.”

While this gets the point across, it is not pleasant to read. Search engines prefer content that would read as follows: “If you require a Manchester plumber, our services are perfect for you.”

While they both say the same thing, the second example is more natural and less spammy.

 

Knowledge Graph

The Google Knowledge Graph is a valuable and informative feature that many of us will have seen without knowing it. The Knowledge Graph uses an infobox to present users with an answer to their query based on the information Google has gathered on the subject.

The info box appears when users search a query, and it provides a short, concise answer.

Google scans authoritative content to find the most accurate and relevant answer to a query and presents the information on the right-hand side of the page. You’ll likely come across them when searching for movies, celebrities, and companies, for example. 

L

Landing Page

The landing page of a website is the page that a visitor first arrives on. This will be determined by the search result or link the user follows to your site.

A landing page can also be used to describe a page that has been designed specifically to welcome users. They can be optimised to work alongside current advertising campaigns and are designed to engage and direct visitors.

A landing page should provide value to visitors and encourage them to carry out an action. This could be to move through to other pages, fill out a contact form, or enquire about a service.

A landing page will typically allow the website to make a good first impression and should be optimised to appeal to the user’s intent based on the specific keywords they used.

Link Building

Link building is the process of getting backlinks to your website – in other words, having other websites link to yours. 

Getting good quality, authoritative links to your site helps to build trust in it. They also act as a recommendation from other websites that the content on your site offers value to readers.

Links to your site might also point directly to products, showcasing your site as a trusted seller.

When Google or other search engines rank your site, they will look at a number of factors, and backlinks will be an important one.

Ensuring the links pointed at your site are good is essential. Links should come from an authoritative, relevant source. Irregular link patterns, paid links, and irrelevant links can all damage the SERP results and trust a website has.

Link building will be a big part of most SEO strategies, and there are many ways to get links from high-quality sites, including:

  • Fill your site with high-quality content that will naturally attract organic links because of its relevance.

  • Outreach campaigns will put you in touch with the websites and bloggers whose websites hold authority and are relevant to your site.

  • Industry partnerships can be built that benefit everyone involved. Speaking directly with organisations, governing bodies, and businesses in your niche can provide great results.

  • Guest posts on relevant sites will help to benefit both parties. The site will benefit from a well-written piece of content that is relevant to their niche, and your site will benefit from a link from the post.

  • Building your own network of high-quality websites and making use of online directories will help you achieve relevant links.

  • Paid links can be beneficial as long as they are of a certain standard and relevant to the source material.

Link Decay 

Link decay describes what happens when links on your site no longer link to the intended source. This can happen over time if a website changes its address, moves pages, or is taken down.

Link decay is also known as link death, link rot, and reference rot and can be detrimental to your site. When your site is crawled, relevant links to authoritative sources can benefit it. If the link going from your site doesn’t go anywhere, this can go against you.

Tools like Screaming Frog can be used to quickly check if links on your site are broken so you can remove or change them. 
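
As a rough, standard-library-only sketch (not a substitute for a dedicated crawler such as Screaming Frog), the Python below checks a hand-written list of URLs and reports any that fail to load; the URLs shown are placeholders.

  # Minimal sketch: flag outbound links that no longer resolve.
  from urllib import request, error

  def broken_links(urls):
      broken = []
      for url in urls:
          try:
              with request.urlopen(url, timeout=10):
                  pass                                  # loaded successfully
          except error.HTTPError as exc:
              broken.append((url, exc.code))            # e.g. 404 for a dead page
          except error.URLError as exc:
              broken.append((url, str(exc.reason)))     # DNS failure, timeout, etc.
      return broken

  print(broken_links(["https://example.com/", "https://example.com/missing-page"]))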

Link Diversity 

Linking to other valuable sources from your website will improve your site’s reputation and authority. Ensuring you have a diverse outbound link profile is important and shows that you want to offer your visitors as much value as possible.

Domain diversity can also be important, with a mixture of .com, .co.uk, .org, and .edu links all offering value.

Anchor text diversity should also be considered, and links should appear naturally within the content to improve the user journey and make it easier for search engine crawlers to move through the site.

Link Donors

Link donors are sites that accept donations to provide links. This is a great way to ensure a high-quality link pointing back to your site, but it can be expensive.

Before paying for a link, you should check the authority of the site providing it to ensure the link will be worthwhile.

Link Echo 

Link echoes are also known as link ghosts and refer to the continued effect a link might have on a page, even after it has been removed.

Links might be removed if a page is deleted, the site has been taken down, or the content has been edited.

There is no clear-cut reason why a site might still benefit from a link that has been removed, as the Google algorithm is secret. 

Eventually, rankings can drop again, but it can take quite some time for the ‘link echo’ effect to wear off. Enjoying a strong backlink profile will mean that you won’t notice your rankings suffering too much if a few links do get removed. 

Link Equity

Link equity, or link juice, as it is also known, is the authority a strong link will pass to the site it is linking to. Search engines use ranking signals to determine how they rank each site, and the quality of links a site receives is one of them.

This value will be determined by the authority of the linking page, the relevance to the page it is being linked to, and its HTTP status.

External links pass link equity between the sites involved, and well-placed internal links also hold a value that will be rewarded.

Link Exchange

Link exchange is an outdated practice that website owners used to get reciprocal links. The idea was to email other businesses asking for a link from their site in exchange for a link from yours.

While reciprocal links can provide great value to both businesses if the sites have good authority and relevance, many of the links didn’t provide great value.

Eventually, Google made it clear that this tactic wasn’t particularly beneficial, and the emails died down.

Link Farm 

Link farms are not very common anymore, but they used to be a popular way for websites to buy multiple links.

This was a successful way to manipulate the systems search engines had in place until algorithm updates started to weed them out.

Networks of sites were built by theme and were designed to offer ‘relevant’ links to paying customers.

After a time, the value of links that these sites provided began to disappear and eventually, they began to damage the sites they linked to.

Link Graphs

Link graphs provide visual representations of a website’s or URL’s backlink profile.

This can be a useful way to monitor and audit a site or individual pages and show the value of specific links. 

Link Hoarding 

Link hoarding is the practice of building a strong inbound link profile without providing outbound links.

A balance of authoritative and relevant inbound and outbound links is better for a site. This is seen as a much more natural linking profile and shows that you are offering your visitors as much value as possible by referring them to authoritative sources.

Sites guilty of link hoarding can appear spammy to Google and be penalised. Penalised sites lose value and drop down SERP rankings, making it essential to include relevant outbound links.

Link Shingling

Link shingling is when different anchor text is used for the same page across a number of pages. This can be advantageous, because anchor text helps Google understand the target page, and using different relevant keywords will provide greater insight into what the page is about. 

Link Spam 

Link spam involves placing out-of-context links in as many places as possible to try to increase the number of external links pointing to a page.

Blogs, forums, comment boards, and websites that allow un-moderated content are often targeted for this kind of technique.

This used to hold more value, but the development of search engine algorithms has reduced the benefits significantly, making it a time-consuming process with little reward.

Link Velocity

Link velocity is how quickly links to your site are added. It is seen as a better practice to have a steady increase over time as it will seem more natural.

A website accruing a large number of links in a short period might point to an irregular link-building pattern, but it is not impossible for this to happen naturally. It could be that you have recently posted a valuable piece of information or news that is being linked to.

Google will check the authority of links, and only poor-quality links will be affected. A large number of high-quality links appearing will be seen as positive.

Continuing with an SEO strategy that seeks to add authoritative links over time is a great way to improve site authority, even if it is not a quick win. 

Local Search/Local SEO

The top rankings for many search terms and keywords will be completely unattainable for small businesses, but local search terms help to level the playing field.

Local SEO and local search are when the user includes a location in their query and when websites are optimised for it. Sites can create web pages designed for local areas to have greater visibility in local searches.

If a local plumbing service tried to rank for keywords like ‘plumber’ or ‘plumbing’, they would be up against every plumber in the UK. This would also include large businesses that have the budget and resources to easily beat smaller operations.

If the same plumber optimised their site for their location, made use of local directories, and completed and verified their Google My Business profile, they would stand a much better chance for local searches.

The search term ‘plumbers in Manchester’ would have significantly less competition than the keyword ‘plumbers’. Narrowing it down further to ‘plumbers in Salford’ would reduce competition even further.

Local SEO allows businesses like this to focus on search terms that are most relevant to them and enjoy local search success. Local SEO should focus on the areas that are serviced by the business, connect to GPS software, and also include optimisation for terms like “near me”.

Local search results show three relevant options below a map, allowing users to see where they are located and find out contact details, opening hours, and links to websites.

Local SEO also has better conversion rates because the user receives more accurate results that are relevant to what they need. 

Long-form Content

Long-form content is a piece of content that will be at least 1,500 words in length. Different schools of thought debate the exact threshold, but this figure is a reasonable minimum.

Long-form content allows the writer to complete a well-researched, optimised, and informative piece that offers value to the reader.

As well as providing information about a specific topic, you should also consider user behaviour and queries to ensure all their needs are met.

Keyword research will help you to structure a piece and include lots of variations without it seeming forced.

A well-written piece will gain authority from outbound and inbound links. If a piece is good enough, organic links can be a positive result of adding long-form content to your site.

Long-tail Keyword 

Long-tail keywords are made up of around three words or more. They narrow down a search and make it easier for Google to understand your intent.

For example, someone who searches for “plumber” has to filter through every result to find one nearby. Using a long-tail keyword like “plumber in Manchester”, or even “plumber in Salford, Manchester”, narrows the results down to relevant options.

Long-tail keyword variations should be a part of on-site optimisation as they will include individual keywords within them, helping the site on two fronts.

Long-tail keywords allow search engines to better understand user intent and the relevance of specific sites for search terms.

M

Manual Action 

Search engine crawlers are used to determine a page’s suitability and index it. In some cases, human reviewers will check a website to ensure it follows the guidelines. If the site does not adhere to the guidelines in place, a manual action can be taken to penalise or de-index the site.

Manually actioned sites will typically receive a notification detailing the reason the action was taken. It will then be up to the website owner to resolve any issues before requesting reconsideration.

 

Mega Menu

Mega menus are typically dropdown menus that can be expanded to show the user the available pages. Mega menus can be found on retailer websites that carry a vast range of products with categories and subcategories easily accessible from the mega menu.

Mega menus will have a large number of internal links. These should not be seen as the only internal linking needed on site as other useful links will help crawlers and navigation.

The user experience will benefit from the structure provided by a mega menu that allows easy access to all the pages on the site. Web crawlers also benefit from this as it clearly shows the website’s structure and page hierarchy.

The page’s HTML can be altered so the menu’s markup sits near the bottom of the source code. This improves the context of the webpage, as the H1 tag and first paragraphs then appear nearer the top.

 

Meta Description 

Meta descriptions are the short descriptions that appear below a page’s title in Google’s search results. They should provide the searcher with a brief summary that entices them to click on the link.

A meta description offers an excellent opportunity to include keywords and improve page optimisation.

Meta descriptions should only be 155 to 160 characters in length, so they need to be short and snappy and include the optimised information that will appeal to readers.
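
A simple automated check can catch descriptions likely to be truncated. The Python sketch below assumes the 160-character ceiling mentioned above; the 70-character lower bound is an arbitrary figure used here purely for illustration, as is the sample description.

  # Minimal sketch: warn about meta descriptions outside a sensible length range.
  def check_meta_description(description, max_length=160, min_length=70):
      length = len(description)
      if length > max_length:
          return f"Too long ({length} chars) - likely to be cut off in the SERP"
      if length < min_length:
          return f"Quite short ({length} chars) - consider adding detail"
      return f"OK ({length} chars)"

  print(check_meta_description("Friendly Manchester plumbers offering emergency call-outs, "
                               "boiler servicing and bathroom installations across Greater Manchester."))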

 

Meta Tags

Web crawlers and search engines get information about a webpage from its meta tags. This information helps the search engine understand the page and, from there, work out rankings and how to display it in search results.

 

Metadata

Metadata provides search engines with information that describes the website’s content and purpose. It is an umbrella term covering elements such as the site’s title tags and meta descriptions.

Metadata helps web crawlers easily navigate the site, resulting in better search result rankings.

 

Metrics 

Metrics are the data that is measured to monitor a site’s development and performance. There are many different metrics that need to be reported in order to achieve success.

Goals and targets that have been set will be measured by metrics, including the organic traffic that visits your site, the click-through rate from search engines, on-site bounce rate, how your site ranks for specific keywords, and the site’s domain authority.

 

Microblogging

Short-form content for platforms designed for more concise information is known as microblogging. Social media platforms like Facebook, LinkedIn, Twitter, and Instagram provide businesses and websites with another way to reach potential new customers and clients but don’t require long-form content.

Many sites that benefit from microblogging also make use of images and videos to get the attention of their readers.

While it is not as effective for SEO purposes, it can still generate organic traffic by reaching an audience, providing them with high-quality, valuable content, and linking to a landing page.

As the way users interact with and digest content changes, so do the techniques employed by websites to reach them. Microblogging is a great way to reach a target audience that typically watches short videos or reads brief snippets to get the information they need quickly.

 

Micromarking 

Micromarking is a way for search engines to quickly locate and review your website for indexing. This is done by websites providing microdata using the schema.org vocabulary or the JSON-LD format.

Because this is a quicker way for search engines to review sites, it is looked on favourably, and this can be reflected in SERP results.

 

Mobile Speed Update

Mobile speed updates are when search engines update their algorithm to ensure sites optimised for mobile use are rewarded. Sites that offer a slow mobile performance are typically punished by lower rankings after these updates.

Search engines have adapted to offer their users the best results by factoring in user behaviours. The rise in internet access via mobile devices is taken into account, and search engines strive to provide users with relevant search results that offer a good user experience.

Slow websites will typically have a much higher bounce rate as users don’t want to wait for pages to load. Improving site and page speed will be another important part of a successful SEO strategy.

Website owners are able to test mobile site speed to ensure they can deliver a competitive service. Analysing competitor site speeds should help to set a benchmark.

Accelerated mobile pages offer a better user experience. They also appeal to search engines like Google when providing users with results on mobile devices.

N

NAP

The name, address, and phone number of a business is known as the NAP and will help with local searches. Local search visibility will rely on providing relevant information to local directories, a Google My Business profile, and the website.

Achieving authority for local searches relies on optimisation and ensuring the NAP appears in relevant places. Site footers are a great place to add a NAP and other contact details as they will then appear on each page. 

 

Natural Links 

Natural links are considered to be some of the best links a site can receive. This is because you don’t have to seek them out, and they are also a sign that your content is good enough to warrant being referenced by another site.

Building a strong backlink profile is an important SEO strategy that will improve SERP results. Actively seeking authoritative links can be time-consuming but will be worth it.

Dedicating time to providing high-quality content on your site is one of the best ways to get natural links. The more quality pieces on your site, the more likely someone will link to it to offer their users value.

 

Navigational Queries 

Navigational queries are specific searches to locate a page on the desired website. The main difference between navigational and informational queries is that the user will have a website in mind but want a specific page from it.

This type of query can be difficult to rank for, as the user typically searches for a brand name and will be directed straight to that brand’s site.

 

Nesting Level 

A webpage’s nesting level refers to where it sits in the site hierarchy. A website’s main page would be considered level 1, and other pages would branch off from it. The next pages down would have nesting levels of 2, and the pages that follow on from these would be level 3.

Nesting levels are important when you are trying to rank a page for a keyword. Pages shouldn’t be any more than two clicks from the website’s main page, so altering the structure may be necessary if you want a page to rank.

The further away from the main page each page is, the less important and more difficult to reach it will be deemed by the search engine. This is reflected in SERP results.
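
Nesting levels can be worked out from a site’s internal-link map with a breadth-first search, as in the Python sketch below; the page paths and link map are made up purely for illustration.

  # Minimal sketch: nesting level (click depth) of each page, starting from the main page.
  from collections import deque

  links = {
      "/": ["/services/", "/about/"],
      "/services/": ["/services/boilers/"],
      "/about/": [],
      "/services/boilers/": [],
  }

  def nesting_levels(link_map, home="/"):
      levels = {home: 1}                        # the main page is level 1
      queue = deque([home])
      while queue:
          page = queue.popleft()
          for target in link_map.get(page, []):
              if target not in levels:
                  levels[target] = levels[page] + 1
                  queue.append(target)
      return levels

  print(nesting_levels(links))  # {'/': 1, '/services/': 2, '/about/': 2, '/services/boilers/': 3}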

 

Network Science

Network science aims to understand how different networks work and the relationship between all the contributing factors. This academic field can be applied to the internet, other computer networks, and social media.

Network science can help us understand website rankings in comparison to others, helping with SEO strategy planning.

 

No follow

Nofollow links are links to another site that you don’t want search engines to recognise when indexing. Your visitors can still use the link, but web crawlers will not follow it.

This stops the other site from benefiting from any authority that would usually be gained from an inbound link.

This should be used if you have concerns that you might be penalised for posting a link.

 

No index 

A ‘no index’ tag is an HTML tag that asks search engines to ignore and not index specific pages that don’t offer any SEO benefits but are still needed. This can include login pages or subscription confirmation pages.

You will be able to insert a tag that prohibits all search engines from indexing the page or one that just requests Google to ignore it.
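
The tag itself is a standard robots meta tag of the form <meta name="robots" content="noindex">. As a quick, assumption-laden sketch, the Python below uses a regular expression to spot it in raw HTML; a production tool would use a proper HTML parser rather than this simplified pattern.

  # Minimal sketch: detect a "noindex" robots meta tag in raw HTML.
  # Assumes the name attribute appears before the content attribute.
  import re

  def has_noindex(html):
      pattern = r'<meta[^>]+name=["\']robots["\'][^>]+content=["\'][^"\']*noindex'
      return re.search(pattern, html, flags=re.IGNORECASE) is not None

  print(has_noindex('<meta name="robots" content="noindex, nofollow">'))  # True
  print(has_noindex('<meta name="robots" content="index, follow">'))      # False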

O

Off-Page SEO

Off-page SEO is work carried out outside of the website that is intended to improve its rankings. Improving a backlink profile is considered one of the most important off-page SEO aspects.

A strong backlink profile will improve your site’s authority and can also help to drive traffic to the site.

Adding business details and website links to directories and completing a Google My Business profile will also improve visibility and local SEO. 

On-Page SEO

Any SEO work carried out on-site is considered to be on-page SEO. This mainly includes the content that visitors see and interact with – keywords, titles, meta descriptions, and more.

On-page SEO should improve the quality of site content, the user experience and ease of navigation, and the site’s domain score.

By optimising all the things that Google and other search engines are looking for and adhering to guidelines, your domain authority should improve. Domain authority goes from 1 to 100, and the better the domain authority your website has, the more likely it will rank well for optimised keywords and search terms. 

Ontology

Ontology is a philosophical way of thinking and aims to better understand a concept by splitting it into categories and exploring the relationships that connect them.

Ontology in SEO terms refers to the way content is created and explored to display a greater understanding of a subject or field.

While content has always had a significant focus on keywords, they should now be seen as an important factor that helps dictate the topic of discussion.

Ontological phrases can then be used to explore and display authority and knowledge on specific subjects.

Through algorithm advancements, search engines have developed to understand language and intent. Ontological thinking is now considered to be an important part of showing an understanding of a subject, and content that is adapted to show this will be rewarded.

Open Graph 

In order for social media pages to display the correct information, websites use a meta tag called Open Graph. This allows integration with social media platforms and can improve the performance of shared content.

Open Graph is not recognised as having specific SEO benefits, but it is still worth spending time on.

Social media is a valuable marketing tool that improves visibility and lets you communicate directly with your audience.

Optimising the content that you or other people share on social media improves the chances of it being interacted with and links being followed back to your site.

Social media is seen as an important tool to help with brand awareness and visibility. Every effort should be made to ensure branding is uniform across all platforms and content that is shared looks good.

Organic Results

Results that aren’t paid for are known as organic results. A good SEO strategy will improve the quality of your site and SERP rankings. Improved SERP rankings will go on to boost site visibility and your website’s organic traffic.

Paid results can be a great way to promote a business and enjoy fast results. Increasing organic traffic should be seen as a long-term plan that will improve over time.

Keyword research and creating optimised content is a big part of SEO and should help to improve SERP rankings, but a full SEO strategy should cover a number of other factors too.

A strong backlink profile will improve site authority, a metric that search engines value. All search engines aim to provide their users with the most authoritative and relevant results. Optimising a site for keywords with quality content will improve the chances of ranking and content accruing backlinks from other authoritative pages over time.

Over-Optimised

The over-optimisation of a page or website will usually be down to keyword stuffing, but other SEO techniques can be overused too.

Google and other search engines want to reward websites that are natural in the way their links and content appear.

SEO work should be seen as long-term, with improvements taking place whenever new pages or content is added. This will allow different metrics to be monitored to see what work should be carried out. This should help to minimise the risk of over-optimisation.

Another way that sites can appear over-optimised is by having internal and external links all pointing to a handful of top-level pages.

Like the use of keywords, linking should appear natural and balanced to show the whole site has authority. Placing too heavy a focus on one or two pages can make the website look thin on content and value.

Attempting to optimise the site for irrelevant keywords to enjoy quick wins will also result in punishment from search engines. Relevancy in the content you provide, the keywords you target, and the links you receive and send out are essential.

P

PDF 

PDF means Portable Document Format and was created in 1993 by Adobe. PDFs are used to send documents in a specific format that won’t be changed, regardless of the device used to view them.

PDFs are an excellent option for brochures as they encourage interaction, and different sections can be zoomed in on.

In terms of SEO, there are plenty of ways to optimise your PDFs to rank highly. Start by choosing an SEO-friendly title for your document, include relevant, high-quality links, and ensure they are optimised for mobile.

Page Authority

The SEO software company Moz developed a way of scoring a page to estimate how well it should rank. This score ranges from 1 to 100 and is known as page authority.

As SEO work is carried out and a site is optimised, the score will improve. This is a great way of monitoring the work you are carrying out.

You are likely to see a page authority score improve quickly to begin with, but slow down the higher it gets. This reflects how difficult it is to improve SERP rankings the higher you get.

As you try to improve SERP rankings near the top, it can get much more competitive. Because of this, improving your page authority score will require more complex techniques and will take longer.

Authoritative backlinks are a great way to improve page authority, and working on a backlink campaign while implementing other SEO techniques will help to improve your page authority score over time.

Panda 

In February of 2011, Google rolled out the Panda algorithm update to provide users with search results that promised high-quality content.

Google intends to provide relevant and authoritative results for search queries, and targeting sites with poor quality or thin content helped to achieve this.

Authoritative backlinks and sources were sought as a way to guarantee the results they were providing.

Websites were encouraged to remove duplicate content and start producing unique content that could offer value to users. Adding to existing content to improve and update it was also rewarded by this update.

Parsing 

The process of automated online information extraction is known as parsing. Web crawlers are parsers that analyse, gather, and extract valuable information that can then be stored in a database. This information can then be displayed while searching.

Parsing sees content being retrieved from sites, and it is then transformed before results are generated. Search engine bots do this to gather the information needed to index sites.

Penalty 

If a website breaches guidelines and regulations, it may face a penalty that will affect how it ranks.

Google penalties are not detailed, making it difficult to nail down exactly why they occur, but through trial and error, most good SEO agencies are able to avoid tactics that are risky.

A penalty might cause a website ranking to drop, delist sites for specific keywords, or even de-index a site.

If a penalty is applied to a site, it is then up to the webmaster to identify the cause and resolve the issue. If a site is de-indexed, you will have to request reconsideration from Google after resolving the problem.

Penguin 

The Google Penguin algorithm update released in April 2012 was created to target sites that benefited from spammy link patterns.

To ensure users were provided with high-quality results, sites that used link farms or benefited from reciprocal links with no relevance were penalised.

The update saw websites with links from authoritative sources benefit and the quality of search results improve.

Sites that had poor quality links could ask for these to be removed or disavow them so they no longer worked against them.

People Also Ask Boxes 

People Also Ask boxes are a Google feature that provides questions and answers related to your query.

This improves the user experience by predicting the kind of queries that would typically follow an original search.

Google includes relevant snippets and information from a website to provide an answer and gives websites a chance to rank in top positions by providing authoritative answers to specific queries.

Personalised Results 

Google algorithm updates have improved to include user intent and behaviour. Personalised search results are based on a user’s search habits to give the user the best search query results for them.

This can make SEO more complicated for websites, but ensuring the site is optimised will still be important to visibility.

Google aims to provide accurate results based on a search query. By studying user behaviour on a macro and micro level, it is then able to rank websites based on a general consensus of what the best result will be, as well as on a personal level.

Pop-up 

A pop-up is an unprompted window that appears on web pages. Pop-ups can be a great way to encourage people to sign up for an email list. They will often include an incentive, such as a discount on first orders, for doing this.

Overuse of pop-ups can affect a user’s experience and make it difficult and frustrating to navigate through a website. If you plan to use pop-ups, they should be used sparingly and offer value to the user.

Search engines can punish sites that use too many pop-ups, particularly ones that hinder navigation.

PPC

PPC or Pay Per Click marketing allows businesses and websites to appear for specific search terms and keywords. The cost is dictated by the popularity of a keyword and how competitive it is. The site owner will then pay an amount every time the ad is clicked on.

This is a targeted form of marketing that allows visibility for highly competitive search terms that might otherwise be unattainable.

The most popular keywords for a subject see the most traffic. Because of this, they will be targeted by most SEO campaigns for that field, resulting in a lot of competition for the top positions.

PPC marketing on the most competitive keywords can be expensive. Because of this, it is essential that onsite SEO is optimised to improve conversion rates.

Used alongside an SEO strategy that targets organic traffic, pay-per-click can be an effective search engine marketing tactic.

You will be able to set an overall budget that won’t be exceeded. You can then set other parameters to dictate when the advert will appear. This can help to ensure the campaign is highly targeted or broader.

The main platforms that offer PPC marketing include Google AdWords, Yahoo! Search Marketing, and Microsoft AdCenter.

Private Blog Network 

Private blog networks are designed to build links for a specific website or can be used to sell links in a niche market. A collection of blogs can be built up to target certain keywords or fields to appear relevant. Expired domains that still hold some value can be used and repopulated to provide a level of authority.

This is not a great way to build a strong backlink profile as they can often use the same IP address, use duplicate content, or appear too similar.

Irregular link patterns from questionable sources can see a website penalised for its backlink profile. Building links organically or through creating relationships with other authoritative sites or governing bodies will provide the most value.

Q

Qualified Lead 

A qualified lead is someone who has shown an interest in a service or product and expects to be contacted.

Qualified leads are the opposite of cold leads that have had no contact or shown any interest in the services provided.

A sales team will be able to contact a qualified lead to provide information and answer any questions with a higher likelihood that the lead will make an order.

 

Quality Guidelines 

Search engine quality guidelines are in place to provide guidance on which tactics can be used and which are discouraged.

Failure to adhere to quality guidelines can result in punishments that affect SERP rankings.

Black hat SEO tactics will typically try to manipulate guidelines or ignore them to enjoy better results. This is a risky tactic as it can result in penalties.

 

Quality Update

Google quality updates take place to ensure the SERP results provided to users are as good as possible.

Websites that don’t provide high-quality, up-to-date content can drop down the rankings.

Updating and improving the quality of content your site provides will result in SERP ranking improvements.

 

Query

A query is what users enter into the search box on a search engine. This can be framed as a full question or by using specific keywords or search terms.

The relevant websites that are provided by the search engine will be optimised for this query. Search engines aim to provide the most accurate and authoritative results for each query. 

R

ROI 

ROI stands for return on investment and can relate to the cost of an SEO campaign, PPC marketing, or a new website.

Businesses will have expectations with regard to their return on investment, and being able to monitor SEO results will show the improvements made.

ROMI (Return on Marketing Investment)

Where ROI is quite a general term that can be applied to any type of investment, ROMI is specific to the return on marketing investment.

Marketing campaigns must set metrics to be measured to determine the success of the campaign. The total revenue should be measured against the marketing costs. The aim is for the revenue driven from the campaign to exceed the marketing cost.

RSS Feed 

An RSS feed provides live updates from specific sources. It can be used to let users know when new information has been posted. RSS feeds are typically text-only updates but can include videos and images.

Widgets can be added to websites and blogs to feature an RSS feed.

Ranking

A website’s ranking is where it appears on a search engine for search terms and keywords. SEO is designed to improve the website in the eyes of the search engine, so it ranks higher for relevant keywords.

The click-through rate will drop significantly as you move down from the first position.

Ranking Signal

Ranking signals and ranking factors are what contribute to how search engines rank websites. The Google algorithm considers many factors and keeps them secret, making it difficult for spammy sites to manipulate them.

Reciprocal Linking 

If a website links to another, and that same website links back to the original site, that is known as reciprocal linking. This can be beneficial to both pages as long as there is relevancy and authority in the link.

Google algorithm updates have improved to identify websites manipulating this system, and links should always appear natural.

Relevant Queries

Keyword research is essential to optimising a web page for relevant queries. Queries are likely to be transactional, navigational, or informational. Understanding user behaviour and intent will improve the relevancy of a site for all search engine queries. 

Response Code

Status codes or response codes let users know if an HTTP request is successful. There are many different codes, including the 301 code that indicates a redirect because a page has moved. 
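
As a hedged illustration, the Python standard library can be used to check the code a URL returns; example.com is a placeholder. Note that urlopen follows redirects by default and raises HTTPError for 4xx and 5xx responses.

  # Minimal sketch: fetch a URL and report its HTTP status code.
  from urllib import request, error

  def status_code(url):
      try:
          with request.urlopen(url, timeout=10) as resp:
              return resp.status          # e.g. 200, after any redirects are followed
      except error.HTTPError as exc:
          return exc.code                 # e.g. 404 for a missing page

  print(status_code("https://example.com/"))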

Responsive Design 

Responsive design relates to how a website’s design allows users to interact with it, regardless of the device they connect through.

Responsive design is essential to the user experience. Poor design will lead to a greater bounce rate. This is an important metric that a search engine will consider when ranking a site.

Rich Snippets

Google constantly tries to improve the content it displays in search results, and rich snippets are used to provide in-depth information.

Websites must include structured data so the search engine can easily read, understand, and use information in rich snippets.

Rich snippets will improve click-through rates for search terms they are relevant to.

Robots.txt

The robots.txt file implements the Robots Exclusion Protocol and is used to communicate with search engine bots. It is important when asking for certain pages to be ignored and when advising how a site should be crawled.
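
Python’s standard library ships a parser for this file (urllib.robotparser). The sketch below feeds it a made-up set of rules and asks whether two placeholder URLs may be crawled.

  # Minimal sketch: parse robots.txt rules and test whether a path can be crawled.
  from urllib.robotparser import RobotFileParser

  rules = [
      "User-agent: *",
      "Disallow: /admin/",
      "Allow: /",
  ]

  parser = RobotFileParser()
  parser.parse(rules)

  print(parser.can_fetch("*", "https://example.com/blog/post"))    # True
  print(parser.can_fetch("*", "https://example.com/admin/login"))  # False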

S

Sales Funnel 

The sales funnel describes the customer’s journey when purchasing a product or service. The sales funnel helps website owners understand customer behaviour. This information is essential to marketing and site optimisation.

 

Sandbox 

Google sandbox is believed to be a filter that prevents new sites from ranking highly in search engine results pages too quickly. This helps to minimise mistakes and ensure the site is relevant and authoritative on specific topics.

Satellite Websites 

Satellite domains and websites are separate websites that are designed and optimised for the sole purpose of linking back to the main website.

This is recognised as being a black hat tactic and can be risky. Focusing on improving the main site and creating high-quality content should result in natural backlinks and higher SERP rankings.

 

Schema Mark-up

Schema mark-up is also known as rich snippets and refers to the data on a web page that describes the page.

The schema mark-up helps each search engine read the web page and is included in the HTML.

Well-optimised schema mark-ups improve the site’s click-through rate, which is a metric that is considered for SERP positions.

 

Search Engine Results Page (SERP)

The search engine results page is also known as the SERP and is the page that is displayed when users enter a search query.

The top position on the SERP is the most clicked and is regarded by the search engine as being the most authoritative and relevant source of the query.

SEO is the technique used to improve positions on a search engine result page.

The results that appear will differ depending on the query and may feature Google AdWords ads, news, featured snippets, images, a local pack, map, videos, shopping results, and related questions.

 

Search History

Your search history shows the websites and web pages you have visited. This is a way of tracking your online journey and allows you to check back for specific pages or search results.

Google uses this information to better understand user habits and behaviours. This helps to provide the best search results.

Search histories can be cleared and deleted. However, Google will retain the information for research purposes. 

 

Search Robot

Search engine bots are used to crawl websites to gather information and build databases. The information a search robot gathers will help to determine SERP rankings.

It is essential that websites are designed so search bots can easily navigate the website.

Search robots can also be used by spammers to search for personal data and email addresses to exploit. 

 

Search Volume

The search volume relates to the number of searches conducted for a keyword. Keyword research will show which keywords have a high search volume and how competitive they are.

While less competitive keywords have a lower search volume, it can still be beneficial to target the most relevant ones to ensure there is a good balance and your site ranks highly for some keywords. 

 

Seasonal Trends 

Seasonal trends affect when keywords are popular and when they should be targeted.

For example, ranking first for Christmas trees in July will return significantly less traffic than ranking first in November and December.

There are tools available to track seasonal trends, including Google Trends.

 

Seed Keywords

Seed keywords are one-word keywords that can be searched on their own or used as a starting point with longer keywords and search terms.

An example of a seed keyword could be ‘shirts’. This could then lead on to “men’s shirts”, “men’s casual shirts”, “men’s blue casual shirts”, and so on. This allows users to conduct a broad search or focus on something more specific. 

 

Semantic Core 

A semantic core is the collection of phrases and keywords that describe the services or products a website provides.

The keywords within a semantic core should feature across the website to increase authority on the subject and rank for these terms.

 

Site Map

A website’s site map details the pages and structure of a site. This makes it easier for search engine crawlers to navigate and index the site.

HTML site maps are designed to improve the user experience and navigability of a site.

XML site maps provide web crawlers with the information they need to index the site.
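
A very small XML site map can be generated with Python’s standard library, as in the sketch below; the page URLs are placeholders, and real site maps follow the sitemaps.org protocol.

  # Minimal sketch: build a tiny XML site map listing three placeholder pages.
  import xml.etree.ElementTree as ET

  pages = ["https://example.com/", "https://example.com/services/", "https://example.com/contact/"]

  urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
  for page in pages:
      url = ET.SubElement(urlset, "url")
      ET.SubElement(url, "loc").text = page

  print(ET.tostring(urlset, encoding="unicode"))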

 

Site Structure

In order for users and search bots to easily navigate a website, site structure is essential.

The site structure will typically look like a pyramid with important pages at the top leading down to smaller pages.

The ease with which search bots can find their way about a site is a determining factor in how a page will rank. A well-structured site will also reduce bounce rate and improve the user journey.

 

Social Media 

Social media is a platform for individual users to create and share content. The most popular social media sites include Facebook, Twitter, Instagram, YouTube, LinkedIn, and TikTok.

Social media is a valuable marketing tool for businesses and brands as it provides a way to interact with customers directly. Social media is a great way to drive traffic directly to your site.

 

Social Signals

Social signals are the interactions that social media profiles have with their followers. As search engines want to provide active, authoritative results for searches, social media can be a great way to show your presence.

Social signals are not as significant a factor as backlinks, but they are becoming more relevant. 

 

Source Code 

The source code of a page is the code that makes what you see on a web page possible. If you right-click your mouse and select view page source, you will see all the page’s code.

 

Status Codes 

Status codes, also called response codes, are the three-digit HTTP codes that let you know whether a request to a web server was successful and, if not, why.

 

Structured Data 

Structured data is also known as schema and, when added to the HTML of a website, helps to offer context to the web page’s content. This is essential to how easy a website is to crawl.

The crawlability of a website is a significant factor when a search engine decides the page rank.

Structured data or schema mark-up was standardised across search engines in 2011 and tells search engine crawlers how to access and navigate the site.
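
As an illustrative sketch, structured data is often written as JSON-LD. The Python below builds a small schema.org LocalBusiness block (the business details are placeholders) of the kind that would be embedded in a page’s HTML inside a script tag of type application/ld+json.

  # Minimal sketch: a schema.org LocalBusiness block serialised as JSON-LD.
  import json

  local_business = {
      "@context": "https://schema.org",
      "@type": "LocalBusiness",
      "name": "Example Plumbers",
      "telephone": "+44 161 000 0000",
      "address": {
          "@type": "PostalAddress",
          "addressLocality": "Manchester",
          "addressCountry": "GB",
      },
  }

  print(json.dumps(local_business, indent=2))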

T

TF-IDF

Term frequency-inverse document frequency is the way that search engines measure how important keywords, phrases, and terms are to a website, blog, or web page.

This is the way search engines can provide accurate results and exclude sites that rely on keyword stuffing.

High-quality content that is knowledgeable and relevant is rewarded by this system, ensuring search engine users get the best results for their searches.
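
One common formulation multiplies the term frequency within a document by the log of how rare the term is across a collection of documents. The Python sketch below uses a tiny made-up corpus purely to show the arithmetic; real search engines use far more sophisticated variants.

  # Minimal sketch: one common TF-IDF formulation over a tiny in-memory corpus.
  import math

  def tf_idf(term, document, corpus):
      tf = document.count(term) / len(document)                  # term frequency
      docs_with_term = sum(1 for doc in corpus if term in doc)
      idf = math.log(len(corpus) / docs_with_term) if docs_with_term else 0.0
      return tf * idf

  corpus = [
      "manchester plumbers offer emergency plumbing repairs".split(),
      "our plumbers cover salford and greater manchester".split(),
      "book a boiler service with local heating engineers".split(),
  ]
  print(round(tf_idf("plumbers", corpus[0], corpus), 3))  # roughly 0.068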

 

Target Keywords

Target keywords will be determined by keyword research to find the most relevant keywords and search terms that a website should rank for.

Keywords will be decided by relevance, search volume, and how competitive they are.

Content will be optimised for these keywords and related terms and phrases, and will include links to relevant and authoritative sources.

 

Title Tags

Title tags are the HTML element that specifies web page titles. These can be viewed on the SERP page, so they should provide a concise summary of the page content.

Title tags will also show when shared on social media and hold SEO value, so they should be unique and natural.

While there is no character limit, only the first 50 to 60 characters will be visible. 

 

Traffic

The traffic of a website is the number of people that visit your site. The metric used to measure traffic is referred to as either sessions or visits.

Other important metrics to consider with web traffic are the bounce rate and conversion rate.

Getting the right traffic to your site is essential to both the conversion and bounce rates, as people will leave the site quickly if it isn’t relevant to their search. High bounce rates will affect SERP results as search engines want to ensure websites hold value for visitors.

Unique visitors can also be measured, making it easy to differentiate between people returning and brand new users.

 

Transactional Query

Transactional queries are those made with the intent to buy a product or service. The top result for transactional queries typically sees the highest conversion rates, so they can be extremely competitive.

Keywords that indicate this intent include ‘order’ and ‘buy’. Optimising pages for these terms helps websites and individual pages rank for certain products.

 

Twitter Card 

Twitter cards make tweeting links from your site a better experience for the user. The code used provides a way for users to share your onsite content in a way that is visually appealing to users.

Twitter cards allow tweets to go beyond the usual character limit and let users watch videos, view images, download apps, and visit your landing page.

Another benefit of the Twitter card is users can interact with the media without leaving Twitter.

U

URL

The uniform resource locator, or URL as it is more commonly known, is the web address your browser uses to find a specific web page.

URLs can be optimised and help to provide information to search engines and users.

A URL differs from a domain name as it can take you to any web page on a site, whereas the domain name takes you to the main landing page. The domain name will feature at the start of the URL before additional information takes you to a specific location.

 

Unnatural Link 

Unnatural links are used to manipulate link juice to improve page ranking. They can be from or to pages that aren’t relevant and will be punished by Google. Website rankings can drop significantly if they are flagged.

 

User Experience

The user experience encompasses every aspect of their contact and dealings with you. This even stretches beyond the purchase of a product or service.

How easy a website is to use, the aftercare and help provided, and even visits to the brand or business’s social profiles are all part of the user experience.

 

User Journey

The user journey describes the steps taken by users to find and visit your site. From there, the journey will sometimes continue to the point of purchasing a product or service.

Ensuring the user journey is easy and informative is a key part of SEO.

A well-planned site structure and menus can help customers find their way around a site to buy what they came for or browse other products that might interest them.

A poor user journey will see lower conversion rates and an increase in bounce rates. This will negatively impact the SERP rankings.

V

Vertical Search 

Vertical search engines are also known as topical search engines and have a specific focus on their results. While Google has a broad scope that produces results for all topics, a vertical search might focus on restaurants or trades. This allows a more precise search.

 

Voice search

Voice search is common on home and digital assistants like Alexa and Siri.

Voice recognition technology is used to listen to the query, and a search result is then provided.

Voice search results need to be accurate because users cannot see a list of options to choose from. Search engines are constantly improving their understanding of user intent.

W

WHOIS 

WHOIS is a service that provides the contact details of the people who own specific domain names and IP addresses.

This can help to make contacting domain holders much easier, especially if you are trying to update a broken link or remove one.

GDPR altered the way the service operates, as consent must now be given before a registrant’s private contact information can be distributed.
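As a minimal sketch, a WHOIS record can be pulled from Python by calling the standard command-line `whois` utility, assuming it is installed on the system. Since GDPR, many fields in the returned record may be redacted.

```python
# A minimal sketch of retrieving a WHOIS record, assuming the standard
# command-line `whois` utility is installed. Since GDPR, many contact
# fields may be redacted for privacy.

import subprocess

def whois_lookup(domain: str) -> str:
    result = subprocess.run(
        ["whois", domain], capture_output=True, text=True, check=True
    )
    return result.stdout

print(whois_lookup("example.com")[:500])  # print the start of the record
```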

 

Webmaster Guidelines 

Webmaster guidelines are the rules set out by Google to ensure web admins and website owners understand what they can and can’t do.

SEO techniques and tactics that adhere to the webmaster guidelines are known as white hat SEO. If the tactics used to improve a website’s SEO do not follow the guidelines, this is known as black hat SEO.

 

Website Navigation

Website navigation defines how easy it is for the user to move through a website. A website’s navigation can be improved by ensuring a good site structure has been implemented.

A poorly structured site can cause users to leave, increasing the site’s bounce rate. This will have a negative impact on SERP rankings.

Improved website navigation is also important for any search engine bot or web crawler that tries to index your site. 

 

White Hat SEO

White hat SEO is the practice of search engine optimisation that follows the guidelines set by Google and other search engines.

White hat SEO reduces the risk of penalties and negative consequences of Google algorithm updates that can impact those using black hat techniques.

Black hat SEO covers techniques that fall outside the guidelines and attempt to manipulate rankings.

 

WordPress

As one of the most popular content management systems, WordPress provides a platform and the framework needed for users to build their own websites.

WordPress makes it easy to buy a domain name and web hosting, and to add plugins that personalise sites. SEO plugins are great for tracking site health and highlighting what can improve site visibility and performance.

Y

Yandex

Yandex is a Russian search engine that offers many services similar to Google’s. The volatile political climate in Russia has affected the company, but Yandex remains the most visited site in Russia.

Yoast SEO 

The Yoast SEO plugin for WordPress websites makes it easier for website owners and web admins to optimise their sites.

Like WordPress, Yoast SEO is user-friendly and provides a rating for each page from 1 to 100 based on the use of keywords in the content, title tags, headers, metadata, and alt-text.
