Wolfgang Digital SEO Glossary: Busting the Jargon


Over the years, SEO has grown to become a key digital marketing channel for many brands and companies. The SEO dictionary has swelled too, with jargon and SEO-specific terminology emerging on an almost daily basis. As a digital marketing agency that prides itself on communication, we've been unbamboozling the world of online advertising for our clients and community through blog posts, speaking events and KPI studies. This SEO Glossary, SEO Jargon Buster, SEO Terms Overview, Meaning of SEO Terms, SEO Term Dictionary (see also: keyword stuffing) is the more direct way to bust the jargon.

# | A | B | C | D | E | F | G | H | I | J | K | L | M | N | O | P | Q | R | S | T | U | V | W | X | Y | Z

Numbers Are Generally: Response Codes

HTTP response codes are status codes indicating whether a specific request has been successfully completed. There are five classes: informational responses, successful responses, redirects, client errors and server errors.

200 - This is a success message: it indicates that the request sent by the client was received and processed successfully. The most common use is when a web page is successfully served up.

301 - Website redirection can be temporary or permanent, and 301 is the permanent redirection message. It tells search engines that the page the user is trying to access has moved permanently to a new URL, and page ranking for the old address is passed on to the new one. Of the redirect codes, only the 301 passes ranking in this way.

302 - 302 Found is a temporary redirection message. Similar to the 301, this status code indicates that a web page has been temporarily redirected to another page. Unlike a permanent redirection message, search engines will not pass any ranking to the new page because it is assumed that the redirect is only temporary.

307 - The 307 status code is a temporary redirect message used in the HTTP 1.1 protocol. HTTP 1.1 has a required Host header, and the 307 message is used to temporarily redirect URLs without passing any link value.

404 - This is a “Not Found” status code and is common when a web resource is missing. It is recommended that website owners monitor 404 errors so that missing or deleted pages can be redirected (301) to appropriate content.

410 - A 410 status code tells the client that the web page was removed. When a resource has been explicitly removed, a 410 is used to alert search engines and serves as an instruction to remove it from their index. This is a powerful status code, so use it only when you absolutely want particular URLs removed from search results.

418 - This status code was defined in 1998 as an April Fools joke and is not expected to be used by actual HTTP servers.

500 - A 500 error message means that even though the request is valid, the server was blocked from carrying out the request because it encountered an unexpected issue. This status code leaves search engines confused because it does not provide specific information on what caused the error. If your website is returning 500 errors, you must investigate the cause and fix it.

503 - An overloaded server or an outage can result in a 503 error. It tells the client that the server is currently unavailable. This code should be used when a server is undergoing maintenance, so search engines know that the resource will be available if they try later.
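The redirect codes described above (301 and 302) can be configured at the server level. A minimal Apache sketch using mod_alias's Redirect directive; the paths and domain are hypothetical:

```apache
# Permanent move: search engines pass ranking signals to the new URL.
Redirect 301 /old-page/ https://www.example.com/new-page/

# Temporary move: search engines keep the original URL indexed.
Redirect 302 /summer-sale/ https://www.example.com/sale/
```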

A is for AMP

Absolute URL - This is when a complete URL of a web resource is used rather than just a part (a relative path). Absolute URLs are preferred for SEO because they do not create canonical problems.

  • Absolute URL example: https://www.wolfgangdigital.com/about-us/careers-with-wolfgang-digital/
  • Relative URL example: /about-us/careers-with-wolfgang-digital/
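The two forms can be converted with the standard URL API, available in browsers and Node; a quick sketch:

```javascript
// Resolving a relative path against a base produces the absolute URL.
const base = 'https://www.wolfgangdigital.com/';
const absolute = new URL('/about-us/careers-with-wolfgang-digital/', base);
console.log(absolute.href);
// -> https://www.wolfgangdigital.com/about-us/careers-with-wolfgang-digital/
```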

Accelerated Mobile Pages (AMP) - AMP is an Open Source framework created by Google and Twitter to help speed up the loading times on mobile devices. It is essentially a stripped-down form of HTML used in creating really fast mobile pages.

AJAX - A web development technique that allows users to request data without loading a new page. Under the hood, it uses an XMLHttpRequest object to communicate with servers. Messages are sent and received in different formats like XML, JSON, HTML, and plain text.
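A minimal sketch of the pattern, using fetch (the modern successor to XMLHttpRequest). The endpoint URL is hypothetical, and the fetchImpl parameter exists only so the helper can be exercised without a network:

```javascript
// Request JSON data from the server without reloading the page.
function loadJSON(url, fetchImpl) {
  const f = fetchImpl || fetch;
  return f(url).then((response) => {
    if (!response.ok) throw new Error('HTTP ' + response.status);
    // The body is parsed asynchronously -- no full page load needed.
    return response.json();
  });
}
```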

Alt Attributes - An alt attribute makes it possible to input an alternative description for every image on a website. If an error occurs and the image does not appear, the alt attribute provides a description instead. Search engines rely on alt attributes to identify what an image contains and this can affect the content indexed.
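In HTML it looks like this; the file path and description are hypothetical:

```html
<!-- The alt text describes the image for search engines and for
     users whose browsers cannot display it. -->
<img src="/images/team-photo.jpg" alt="The Wolfgang Digital team at the Dublin office">
```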

Anchor Text - Anchor text is the clickable text of a hyperlink. In some online content, writers may want to send the reader to another page or external resource. The easiest way to do this is to attach a URL (hyperlink) to a piece of text. That text is the ‘anchor text’ that holds the link.
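For example (the URL is illustrative):

```html
<!-- The visible words between the tags are the anchor text. -->
<a href="https://www.wolfgangdigital.com/blog/">read our SEO blog</a>
```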

Affiliate Site - Websites that promote the products and services offered by third-party websites are affiliate sites. Affiliate marketing is a popular marketing programme that pays independent marketers (affiliates) for the users they refer. Affiliate websites embed tracking codes and earn revenue when their users visit or make a purchase through their referral links.

Analytics - Analytics software provides insights for website owners. It gathers information about the website's traffic, the geo-location of visitors, bounce rate, keywords, organic search, and lots of other valuable information.

Angular JS - AngularJS is a JavaScript front-end web application framework maintained chiefly by Google and a community of developers. AngularJS allows developers to extend regular HTML for application development. The resulting environment is remarkably expressive, readable, and faster to develop in.

API - Application Programming Interface (API) is a set of processes, protocols, and tools used in creating software applications. Many software services or information-based products use APIs to access remote data. API is used to integrate third-party software into many custom applications opening access to features or data available on an operating system, application, or another service.

Authority - Authority means trust. Being an authority online means that a website or page has a high trust value. Each time people search for something and end up on a page, the website is credited with their trust. Authority can also be earned when other external web content links back to your URL or page.


B is for Bounce Rate

Bing Webmaster Tools - Bing Webmaster Tools is a free website management service offered by Microsoft as part of its search engine, Bing. The web resource allows webmasters to add their websites to the Bing index crawler. It also helps them identify any problems with their web pages, and provides suggestions on how to fix these issues so that they can perform better in search engines.

Backlinks - This simply refers to all inbound links from other websites to your site. In the world of SEO, the quality of a backlink depends on the reputation and page rank of the linking site, so many experts recommend earning link juice from high-quality sites and advise webmasters to shun spammy backlinks.

Broken Links - Any URL that isn’t working as intended is a broken link. This can happen when a website goes offline, content is moved to a different location, or for many other reasons. The occasional broken link will not affect SEO, but several broken links can create a negative signal to most search engine algorithms.

Black Hat - SEO techniques that don’t fall within a search engine’s approved guidelines are called black hat SEO techniques. These methods are usually deceptive in nature, but can also be used to exploit a search engine’s algorithm in order to gain higher rankings in a short time frame.

Bot – A ‘Bot’ is an active script or program that can perform certain automated tasks. Robots, spiders or crawlers are the most common bots used today. Search engines use bots to search for and add sites to their search indexes.

Bounce Rate - Your website's bounce rate is the percentage of visitors who leave without visiting any other pages. If you have Google Analytics set up for your website, you can view your bounce rate in detail and use the data to make effective targeting decisions for a lower bounce rate.

Breadcrumbs - Hansel and Gretel used breadcrumbs to find their way back home, and the same idea is used in website design today. A breadcrumb is a user-friendly feature that helps website users understand where they are on a site. It helps them navigate easily so they can find their way back to the homepage or back to where they started.


C is for Crawlers

Cannibalisation - Keyword cannibalisation occurs when several pages on your website compete for the same keyword or phrase in search results. Having too many pages competing with each other for the same keyword is not good for a site's SEO, and resolving it should be one of your key website optimisation tasks.

Cloaking - The practice of showing search engines different content from what appears on the website’s page. It can be used deceptively, such as displaying a “quality” page to Google but serving only affiliate advertisements to users who visit the page. Cloaking is a ‘Black Hat SEO’ technique.

Competition - Businesses usually know who their competitors are, but these companies may not be the same ones out-ranking your website in the search results. Your competition is the set of websites that appear when users search your chosen keywords. For example, all ‘hairdressers in Dublin’ listed in the search results are competitors for the same online space.

Content Relevance - We all know ‘content is king’, but if the content is not relevant to the keyword we want to rank for, it taints users’ experience and the page’s ability to rank well. The relevance of the content on a page to its topic is of great importance for search engine optimisation. The idea is to make sure that the content of a page matches the search query, so that users see only content that is useful, making their experience richer.

Content Relevance is one of the main ranking factors that Google uses for website evaluation, rankings and indexing.

Crawlers - A search engine’s crawlers are responsible for discovering new web pages and adding them to the search engine's index. Ensuring that your website is visible to crawlers is important for any website that wants to be found and indexed by the search engine.

Crawl Budget - This is the number of pages Google crawls on your website every day. Google states that crawl demand and crawl rate are the two factors that make up Googlebot’s crawl budget for every website. The size of a website, its health (errors encountered) and the number of links to it are elements used in calculating the crawl rate.

Cross-Domain Canonical - The canonical tag can help you resolve issues with duplicate content. It lets Google know that the pages with tags are copies of the original page. Cross-domain canonical tagging can be done across multiple domains to let search engines know where the original content came from, and to give credit to the content owner.

Canonical (Canonical issues, canonical tag, canonical URL) – Canonical issues refer to duplicate content problems. They arise when incorrect 301 redirects allow your website to be accessed by search engines from multiple URLs. Search engines could potentially index your site under several different URLs, making it look like you have duplicated content. This can be resolved with a noindex meta tag and proper 301 server redirects.

A canonical tag can be added to duplicate content pages to ensure that credit is given to the original page.
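A sketch of the tag, placed in the <head> of the duplicate page; the URL is hypothetical:

```html
<!-- Tells search engines which URL is the original, canonical version. -->
<link rel="canonical" href="https://www.example.com/original-article/">
```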

CMS (Content Management System) - A CMS is a software application that lets website owners modify the content on their websites from a user-friendly dashboard. The user does not necessarily need any scripting language knowledge, as Content Management Systems are designed to be easy to use.

The most popular CMS examples include WordPress, Drupal, Joomla and Magento.

Content - Content is the information published on web pages for user engagement. It can be an image, video, document, or text. It can be original or curated (information gathered from different websites). When content is copied without authorisation, it is regarded as plagiarised content. Search engines frown on plagiarism and could penalise websites that carry this type of content.

Comment Spam - You will get spam comments frequently if you own a blog and have an open commenting system. Spam comments are posts or stories that have absolutely nothing to do with the original content. Spam is posted to encourage users to click and be directed to a completely different site or link.

CSS (Cascading Style Sheet) - CSS is a style sheet language used to describe the presentation of the distinct sections of your website. It focuses on defining the look, structure and flow of the site. With CSS, developers can specify the colour of headers, the size of menu bars, font types and sizes, etc.


D is for Domain Authority

DOM (Document Object Model) - DOM is an application programming interface (API) for HTML and XML. It defines the structure of documents and how the content is accessed via API calls. DOM connects web pages to scripts or programming languages.

Domain Authority - This is a metric created to predict the ranking ability of a website or domain in the search engines. A high Domain authority score indicates that your website or domain has the potential to rank very well in search engine results.

Doorway Page - Doorway pages are created to rank high in SERPs for specific search terms in order to improve a website’s traffic. Doorways or doorway pages are considered a black hat SEO technique, and a website can be penalised for using one.

Duplicate Content - Content that is the same as, or similar to, what is found on other websites or on internal pages of your own site is considered duplicate content. Plagiarised content is not SEO-friendly and will cause websites more grief than good.

Directory - A directory is a categorical compilation of websites. A high-quality directory is manually curated by experts and makes it easy for listed websites to rank.

Divs - The ‘div’ tag is very often used within a CSS code to structure a web page. The ‘div’ element is just a container unit that encapsulates other page elements and breaks an HTML document into different sections.

Disavow File - The disavow file lets webmasters ask search engines to ignore unwanted or harmful inbound links from disreputable sources. It tells search engines that the webmaster does not approve of the source, and helps combat negative SEO or penalties that may arise from the unfavourable inbound links.

Deep Linking - These are backlinks to internal pages. For example, https://www.wolfgangdigital.com/blog/P24/ is a deep link because it does not point to the home page; rather, it is a direct link to an internal page on our blog. Deep links can be great signals to help promote internal pages in the search results.

Developer - A web developer is a programmer who specialises in developing web or mobile applications with languages like Python, JavaScript and PHP, alongside markup like HTML. They are sometimes confused with web designers. A web developer writes code and builds applications like websites and mobile apps. A good dev who understands SEO and other digital marketing concepts is worth his/her weight in gold!

Do-follow Link - A ‘Dofollow’ link is a standard HTML link that does not have a rel=”nofollow” attribute. They are very valuable in SEO and allow search engines to follow them to your web page.
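Side by side, the two forms look like this (URLs are illustrative):

```html
<!-- A standard (dofollow) link: search engines may follow it and pass value. -->
<a href="https://www.example.com/">example</a>

<!-- A nofollow link: the rel attribute asks search engines not to pass value. -->
<a href="https://www.example.com/" rel="nofollow">example</a>
```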

Domain - This is the customisable or branded element of a website’s name. For example, in the URL: http://www.yourdomain.com/folder/file.html, “yourdomain” is the domain name. It is how users find a website and can be regarded as your ‘real estate’ property address online.

Dynamic URL - A dynamic URL is the reverse of a static URL: it is the result of a database query and is dynamic in nature. This type of URL can pass all kinds of commands, functionality and data to a remote server. Dynamic URLs usually contain characters and strings such as ?, %, $, &, =, +, cgi-bin and .cgi.


E is for E-commerce

E-commerce - These types of websites are devoted primarily to online sales. Short for “electronic commerce”, an e-commerce site is a website where products and services are sold to web users.


F is for Featured Snippets

Faceted Navigation - Also known as faceted search or browsing. This is a method for accessing data organised according to a faceted classification system, enabling users to explore data collections by applying multiple filters.

Faceted navigation is an important search innovation and offers a great search experience to users who start off with a classic keyword and receive a list of relevant results.

Flash - Very popular in the 90s, Flash is an interactive media technology that makes websites more interesting. Nowadays, though, using Flash on your website is not an SEO-friendly practice, as it can kill your search rankings: Flash content cannot be indexed, so you will need to employ certain optimisation techniques if you use it.

Frames - Like Flash, frames are an SEO burden. When you use frames on a web page, you serve the same URL but display different content from an external source. This can confuse search engines and result in the page not being indexed.

Featured Snippets (Position Zero) - Also called position zero, featured snippets are organic search results that appear at the top of the search listings before the first organic search result. It is usually an answer to a search query.

Google’s featured snippets are a powerful opportunity for websites with valuable content to claim the ‘No. 0’ organic rank for a keyword.

The Fold (Above the fold) - The fold is the point on your site where a page is cut off by the display monitor or browser window. Users cannot view the area below it until they scroll down, and search engines give more value to content that sits above the fold.


G is for Google Search Console

Google My Business - This is a free and easy-to-use service that helps businesses manage their online presence. It comprises tools like Search and Maps. When you verify and edit your business details, customers can easily find you.

Google Plus - A social network by Google with features from existing services like Picasa. Google Plus is similar to other popular social networking sites and can help improve your SEO ranking; there is a strong correlation between having a company profile on Google Plus and rankings in Google SERPs.

Google SERPs - Search Engine Result Pages (SERPs) are the lists of web pages served to users when they input a search query in Google. When a website is listed at the top of Google SERPs, it can receive organic traffic and increase its ROI tremendously.

Google Bomb - These are techniques that are used to change Google search results for a satirical or humorous purpose. For example, typing in the keyword “miserable failure” and coming up with the results of a politician’s name. This can be achieved by creating many links to the website from pages with the search term.

Google Bowling - When a website ranks low because it was purposely fed bad links, it is called Google Bowling. It is an unsavoury way to manipulate the external ranking signals of a competitor's website. However, with solid backlinks, a website can overcome this attack.

Google Dance – There was a time when the SERPs were periodically disrupted by the activity of Google’s ranking algorithms, causing a huge shift in the index. During these periods, rankings were rebuilt and the SERPs fluctuated for several days.

Google Bot - Another name for a Google bot is the Google spider. It is the robot that scans the internet, indexing pages for inclusion in Google’s search listings. You can see how your website looks to Googlebot using a Search Engine Spider Simulator tool.

Google Knowledge Graph - The Knowledge Graph is a database used by Google to improve its search engine results with semantic-search data gathered from a wide variety of resources. Google’s Knowledge Graph was included in the search engine in 2012 and connects facts, people, businesses and places. The Knowledge Graph provides search results that are more notable and visually appealing.

Google Analytics - Google Analytics is a free web service offered by Google to monitor, track and manage website traffic. Analytics is the most widely used website productivity tool and provides actionable insights for decision-making. You can measure your marketing campaign results, track visitors based on geo-location and generate reports to help you discern your traffic sources.

Google Search Console - The Search Console was previously known as Google Webmaster Tools and is a free webmaster service by Google. It allows website owners to check errors and indexing status, and to optimise their site for better visibility.

Google Algorithm Updates - Google updates its search algorithm frequently and this means that webmasters have to stay updated too. While most of these changes are trivial, Google frequently rolls out critical algorithmic updates (Panda and Penguin) that affect SEOs in significant ways.


H is for Headings

Hreflang - This is used for language and regional URLs. Websites serve users from all over the world with content created for certain regions. By using a “hreflang” attribute, the correct content is served to the appropriate region.
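A sketch of the markup in a page's <head>; the URLs and locales are hypothetical:

```html
<!-- Each link names an alternate language/region version of this page. -->
<link rel="alternate" hreflang="en-ie" href="https://www.example.com/ie/">
<link rel="alternate" hreflang="en-us" href="https://www.example.com/us/">
<link rel="alternate" hreflang="x-default" href="https://www.example.com/">
```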

Headings - A heading is an HTML element intended to briefly describe the content on a webpage. Headings range from H1 to H6 and are an SEO best practice. The heading tags (H1 to H6) are used to separate the content on a page. Keywords are also placed in heading tags to give them more weight than other words.
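A simple hierarchy sketch (the heading texts are invented for illustration):

```html
<!-- One H1 per page; H2-H6 break the content into nested sections. -->
<h1>SEO Glossary</h1>
<h2>Response Codes</h2>
<h3>301 Moved Permanently</h3>
```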

HTML - HyperText Markup Language (HTML) is a language that search engines parse easily. It is the most commonly used language in web and mobile development and is usually the first language learned by newbie developers. It is the fundamental markup language used to define the structure of a web page.

HTML Site Map - An HTML sitemap should list all the important pages on the entire website. It allows website visitors to easily navigate a website and discover pages that are not linked from the main navigation. Using it, search engines can crawl a website and index all of its pages, including any URLs that are hidden from users.

HTTPS - A Hyper Text Transfer Protocol Secure (HTTPS) is the encrypted version of a regular HTTP address. This is the protocol that is used to deliver data to your browser and is part of your website URL. The ‘S’ at the end of HTTPS means that the website has been secured with SSL (Secure Sockets Layer) and indicates that communication between your browser and website is encrypted.

HTTP Header - HTTP headers allow a client and server to pass additional information with a request or response. The Referer header, for example, identifies the address of the web page (i.e. the URI or IRI) the request came from. An invalid HTTP header will produce errors on a web page.
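An illustrative exchange (the host and path are hypothetical): the client sends request headers, and the server answers with response headers.

```http
GET /about-us/ HTTP/1.1
Host: www.example.com
Referer: https://www.example.com/

HTTP/1.1 200 OK
Content-Type: text/html; charset=UTF-8
```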

Hub - Hubs are authority websites: a hub is a trusted page with high-quality content that links out to related pages.

Hop – When a user visits page A and is redirected to page B before landing on page C, each redirect in the ‘redirect chain’ is called a ‘hop’.

Clickbank uses hops as a term to show the number of affiliate link clicks to an affiliate offer. This means that when visitors click on a product, a hop is calculated. It does not indicate a sale, rather just shows a click rate for each affiliate product offer.


I is for Index

Internal Link - These are links from one web page within a website to another page on the same website. Search engines give the same strong value to internal links as they do to inbound links. That is why having internal links is seen as an SEO best practice. Internal links are also important for a site’s navigation.

InBound link - Inbound links are links from external sources. These are the same as backlinks and are one of the major factors used in ranking a website. Search engines want to know that your content is so valuable that other sites mention you and provide your links to their users. Inbound links are especially useful when received from high-ranking, authority websites.

Insecure Content - Sometimes website users will see this message when they use a secure connection to access an HTTPS web page and are served a page with unencrypted elements. Some browsers will block the content and ask the user to add an exception before accessing the insecure content.

iFrames - An IFrame is an inline web page that is embedded into another web page. IFrames are usually used to insert content from another web resource into a web page. Advertisement elements are often located in iFrames.

Impression - Website impressions are the number of times users view a web page. Each time the web page is served an impression is counted. This data is used by website owners to analyze their website traffic and the pages viewed.

Index - An index means a structured compilation of web pages. Search Engines use sitemaps to create indexes which help users in finding relevant websites and content. It is a database of web pages that make search easy and faster.

Indexed Pages - Pages that have been crawled and processed by search engine spiders are called indexed pages. They are stored by search engines and served when users search for related keywords.

International Targeting - Websites can target users based on their location and language. International targeting is a part of international SEO and is practised to keep a brand's international presence healthy. Sub-domains, TLDs, sub-directories and URL parameters are the options used in international targeting. They ensure that your website is reaching an international audience.


J is for Juice

Juice - Google juice or link juice is an SEO term that is used to describe the distribution of backlinks to a website and the value or reputation of these backlinks. Both the quality and the number of these links help determine the effectiveness of the link juice.

JavaScript - JavaScript is a popular programming language that allows programmers to significantly improve the functionality of web applications. In SEO, however, JavaScript can be a problem: when scripts prevent spiders from crawling the content, the website will not be indexed.

JSON-LD - JSON-LD is a lightweight and dynamic Linked Data format. It is easy for humans to read and write. It is based on the already successful JSON format and provides a way to help JSON data interoperate at Web-scale. JSON-LD is an ideal data format for programming environments and is the Schema markup format of choice as Google and other major search engines cite it as their preferred and best-supported structured data markup method.
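A minimal sketch of the markup embedded in a page; the organisation details are illustrative:

```html
<!-- JSON-LD structured data describing the organisation behind the site. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Wolfgang Digital",
  "url": "https://www.wolfgangdigital.com/"
}
</script>
```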


K is for Keyword

Keyword - A word or phrase used to find information through the search engines. Keywords can be as simple as “dresses” or as specific as “white bridal dresses.”

Keyword Density - The density of a keyword is how often it appears on a web page relative to the total number of words. Keyword density may no longer be as important as it was a few years back, but it is still good SEO practice to ensure that your keywords appear naturally on your web pages.
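The calculation can be sketched in a few lines; this minimal version assumes plain-text input and simple whitespace word-splitting:

```javascript
// Occurrences of a keyword as a percentage of total words on the page.
function keywordDensity(text, keyword) {
  const words = text.toLowerCase().split(/\s+/).filter(Boolean);
  const hits = words.filter((w) => w === keyword.toLowerCase()).length;
  return (hits / words.length) * 100;
}

// 'SEO' appears 2 times in 6 words, so the density is 33.33...%.
console.log(keywordDensity('SEO tips and more SEO advice', 'SEO'));
```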

Keyword Cannibalisation - Keyword cannibalisation is the excessive use of the same keywords across several pages on a single website. It can stop search engines from choosing the page that is most relevant for a particular keyword. Additionally, your pages would compete for the limited SERP space.

Keyword Stuffing - The practice of excessively using a particular keyword on a page over and over again. Keyword stuffing no longer works in SEO and may negatively impact a page’s ability to rank well in the SERPs.

Keyword Research - Researching keywords is a major SEO activity. Before you start any SEO campaign, you need to identify the keywords that will be relevant for your campaign. The right keywords will target the best audience for your product and enable a better conversion rate.


L is for Link

Landing Page - The page a user is directed to after clicking on a link. Landing pages are usually the homepage of a website, but could also be an internal page.

LSI (Latent Semantic Indexing) - LSI simply implies that search engines will index commonly associated groups of words on a page. Not every query is a single specific word; many consist of three words or more. Search engines will analyse the content on the page and serve results based on latent semantic indexing.

Link - A path that connects one web page to another. A link is clickable and redirects to another web page; it may be attached to text, a video or an image.

Link Bait – Link bait is compelling content on your website that other websites link to. This can be an article, infographic, e-book, video or image.

Link Building – The process of creating backlinks to your website. There are several techniques you can use to build links, but be careful not to use ‘black hat’ techniques.

Link Detox - Link detox is the process of sanitising a website’s bad links. While the tool you use may not be able to remove every single bad link, you can recover faster from a Google penalty and disavow bad links with link detox.

Link Exchange - This is one of the principal methods of link building. Link exchanges include exchanging backlinks with other websites, preferably those in the same niche as your website.

Link Farm - Link farms are websites that have no purpose other than to link out to other websites. A link farm carries many outbound links, all created with the single purpose of raising other sites’ rankings. Link farms usually don’t rank well in search listings, and the backlinks they provide are of little value.

The best SEO practice is to avoid creating backlinks on link farms, as the website might get a manual penalty from Google, or any other Search Engine.

Link Juice - Link juice is the value that passes from a high-quality backlink website to another site. It is used to emphasise the power of a backlink. The idea is that a link from a popular website will pass some ‘juice’ to a less popular website.

Link Popularity - A website’s link popularity is the number of inbound links it possesses. Google strongly values high-quality backlinks over the number of backlinks a website has—quality over quantity.

Link Spam - This is when a website receives a lot of backlinks in a short time, especially from low-quality sites. Most of the time, link spammers use automated software to acquire these backlinks quickly.

Link Text - Same as an anchor text and also called a link label or link title, the link text is the visible and clickable text in a hyperlink. The words contained in the link text can contribute to the ranking that a page receives by Search Engines.

Local Pack - Local businesses can rank in search results too, thanks to Google’s local pack feature. It is a way of listing local businesses in search results, where a few chosen businesses are shown at the top of the results for a keyword.

Long Tail - This is different from a regular keyword because it is more specific and reflects a user’s intent. Long tail keywords are phrases with low search volume but higher value. For example, “party” is a regular keyword with millions of monthly searches, but it is low value because the traffic will not come from your targeted audience. Traffic for “party planners in Dublin” will have a lower search volume, but a higher conversion rate, because it comes from your target audience.

Back to top

 

M is for Meta

Manual Action - Manual actions are taken by Google to demote or remove web pages or websites as a whole. They are not associated with Google algorithm updates and are simply a manual action to punish websites for spammy behaviour.

Mega Menu - A mega menu is usually a drop-down interface that contains information and makes navigating a site much easier for users. When users hover over a mega drop-down menu, it usually shows many more options than a regular menu. Categories are grouped together in one mega-panel, and in some instances you’ll find images, content and videos.

Metadata - Metadata is information that describes other information; the prefix “meta” denotes an underlying definition or description. So, metadata summarises a piece of content or page, making it easy for users and search engines to understand what the page is about.

Meta Tags - Meta tags are HTML tags that describe the contents of your page to search engines. These tags include the meta title (page title), meta description (page description) and meta keywords.
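As an illustration, a page’s meta tags live in the HTML head; all the values below are hypothetical placeholders:

```html
<head>
  <!-- Meta title: shown as the clickable headline in search results -->
  <title>Red Running Shoes | Example Store</title>
  <!-- Meta description: often shown as the snippet under the title -->
  <meta name="description" content="Browse our range of red running shoes with free delivery across Ireland.">
  <!-- Meta keywords: largely ignored by modern search engines -->
  <meta name="keywords" content="running shoes, red shoes">
</head>
```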

Meta Title - This is the title of your web page in the search engines. It is visible at the top of the page in web browsers and the search results. See our blog post on how to optimise a meta title.

Meta Description - This is one of the most important meta tags because it lets you describe the page for search engines. The description is displayed in search results so that users can understand your page content at a glance. You need to write a meaningful meta description if you want to grab the attention of your target audience.

Meta Keywords - This is an HTML tag that was used to designate the keywords and phrases a page is targeting. Today, search engines don’t take meta keywords into serious consideration. The best SEO practice is not to have any meta keywords on your website; some search engines, such as Bing, treat meta keywords as a sign of manipulation, and the website might even receive a penalty.

Meta Refresh - Meta refresh is a way of telling web browsers to automatically refresh a web page or frame after a particular interval using HTML meta elements. It is done using the http-equiv and content parameters stating the time interval in seconds.
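For illustration, a meta refresh pointing to a hypothetical URL after five seconds looks like this:

```html
<!-- Reload or redirect after 5 seconds; a value of 0 redirects immediately -->
<meta http-equiv="refresh" content="5; url=https://example.com/new-page/">
```

Note that for permanent moves, a server-side 301 redirect is generally preferred over a meta refresh.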

Mirror Site - A website that is used to host the content of another website is a mirror site. It can also be used in the context of downloads where several “mirror sites” would all provide various locations to download a file.

Back to top

 

N is for Nofollow

Natural Links - Natural links are backlinks received organically, without any outreach or link building strategy. They are generally created by content authors who decide to link to your website as a useful source for readers who want to learn more.

Natural Search Results - Also known as organic listings, natural search results are the results returned for a keyword search that are not served due to any sponsorship or paid ads, and are not manipulated or dominated by any advertiser.

Noindex - Noindex is an HTML directive that tells search engine bots not to include a page in their index. It is used to keep duplicate, thin or private pages out of search results, and to shape how a site is crawled and indexed.

Nofollow - This attribute tells search engines not to pass link authority to the linked website. Links with the nofollow attribute are generally not counted as backlinks by Google, and most other search engines treat the attribute in a similar way.
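The two directives look like this in practice (the URLs are placeholders):

```html
<!-- Page-level: ask search engines not to index this page -->
<meta name="robots" content="noindex">

<!-- Link-level: ask search engines not to pass authority through this link -->
<a href="https://example.com/" rel="nofollow">Example</a>
```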

Non-reciprocal Link - When one website links to another but the second site does not offer a reciprocal link to the first, the link is considered nonreciprocal.

Non-ASCII Characters - These are characters that do not appear in the American Standard Code for Information Interchange. Most modern character-encoding systems are based on ASCII, although they support many additional characters.


Not Provided - Back in the day, when people clicked through to your site from a Google search, their search term would appear in your analytics account. Google no longer passes the vast majority of this data to your analytics, which leads to (not provided) appearing in your data.

Back to top

 

O is for Organic Traffic

Off-Page - Off-page SEO covers actions taken outside your own website to improve its position in the search results. It involves amassing relevant backlinks, references and mentions from third parties.

On-Page – On-page SEO means that the agency, or the website owner, analyses and optimises the elements of the website itself, including any technical issues that might be stopping the website from progressing from an SEO perspective. Achieving technical excellence requires a series of tests and analyses.

OpenGraph - This technology was first introduced by Facebook in 2010 and is used to integrate Facebook and third-party websites. By adding OpenGraph meta tags to your website, you control how your pages appear when shared, making it easier to drive engagement through social media channels.
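A minimal set of OpenGraph tags, with placeholder values, looks like this:

```html
<meta property="og:title" content="Sample Article Title">
<meta property="og:type" content="article">
<meta property="og:url" content="https://example.com/sample-article/">
<meta property="og:image" content="https://example.com/images/sample.jpg">
```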

Organic Traffic - Organic traffic is the reverse of paid traffic and simply means all visits (or sessions) not generated by paid ads. Visitors who are considered organic find your website through the Search Engine Result Pages (SERPs) of engines like Google or Bing.

Organic Link – An organic link is a backlink to your website from another website, created naturally. Most organic links are the result of having great content or a useful resource on your website.

Outlink - These are links that start from your website and point to other websites. Too many outbound links, particularly to low-quality sites, could negatively affect your rankings, so keep the number reasonable.

Outreach - The goal of SEO outreach is to build backlinks. An analysis is done and a link building strategy is designed to contact relevant websites and pages in the hope of generating valuable links.

Back to top

 

P is for Page Rank

Panda - This is one of Google’s algorithms, which attempts to classify websites according to their content quality. Panda uses several “signals” in making this classification. Google rolled out Panda in 2011, and a lot of low-quality websites were penalised.

Page Rank - Originally, PageRank was an algorithm that scored pages based on the number and quality of inbound links. Because it was widely exploited by webmasters, Google now combines it with many other ranking methods to order search results by relevance.

Page Title - Also known as the title tag, the page title is displayed in the browser and lets users know what your page is about. Clear, unique and descriptive page titles containing your target keywords are great for SEO and page ranking.

Pagination - Pagination is a method of dividing a list of content pages into several distinct pages. If you have a website that has a lot of pages, you may want to add some sort of pagination to make navigation easier.

Parameters – Often we see dynamic parameters being used for tracking. URL parameters should be avoided where possible, as they can be problematic for search engine crawlers, for example http://example.com?product=1234.

Penalisation - Google penalises websites that don’t follow its webmaster guidelines by reducing their rankings in the SERPs or, in extreme cases, excluding the website from Google’s results entirely.

Penguin - Penguin was an algorithm update released in 2012. Since then, Google updated this algorithm 7 times before finally baking it into the core, real-time algorithm in 2016. Penguin penalises websites that engage in link schemes. Link farms, link exchange websites and other sites practising spammy link initiatives are all affected by Penguin once detected.

People Also Ask boxes - This is a Google feature that inserts a box of additional queries into the main Google search results. It is a brand-new way for searchers to explore queries related to theirs.

Position Zero (Featured Snippets) - Also called ‘Featured Snippets’ or the ‘Answer Box’. Position zero is the position taken up by the answers that appear at the top of search results, usually pulled from a relevant website’s content. The snippet normally includes a summary of the web page and is an advantageous position for brands.

PPC - Pay Per Click simply means that advertisers pay the website owner or publisher each time a user clicks on an ad. It enables effective targeting and ensures that advertisers only pay for the clicks they receive.

Proxy - Proxies are intermediary servers that route your traffic through a different IP address, which can be used to change your apparent location. In SEO, they allow you to run keyword ranking reports, view search results in other countries, extract web data and view local display ads.

Back to top

 

R is for Rankings

Rankings - This is a website’s position in search results and is influenced by the relevance of the content to the search term, as well as the quality of backlinks pointing to the page. Search engines use certain factors in ranking websites and these factors vary from one search engine to the next.

Ranking Factor – There are over 200 ranking factors that a search engine considers when ranking a website in the SERPs. They include backlinks, keywords, content quality, domain age, relevance, social signals and much more.

Reciprocal Link - Also called “link trading,” a reciprocal link is when two websites agree to link to each other. Reciprocal links can occur naturally or be used as a black-hat method in a link exchange strategy, which can result in penalties.

Redirect - Redirects are used to change the location of web resources. When a page or piece of content changes location, a redirect is normally used to take a user to the new location. See 301 and 302 redirects.
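On an Apache server, for example, redirects can be declared in an .htaccess file; the paths below are hypothetical:

```apache
# Permanent (301) redirect: ranking signals pass to the new URL
Redirect 301 /old-page/ https://example.com/new-page/

# Temporary (302) redirect: ranking signals stay with the original URL
Redirect 302 /sale/ https://example.com/summer-sale/
```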

Redirect Chain - A redirect chain is when there are multiple redirects between the first URL and the destination URL. Ideally a URL is redirected with a single 301, but redirects can add up over time and are sometimes caused by website errors or successive migrations.

Redirect Loop - This is a loop that occurs when URL A points to URL B and B back to A. Redirection loops will keep browsers in an infinite loop state and the web page will never load completely.

Referrer String - A referrer string is the piece of data that a browser sends as it moves from one page to another. It identifies the previously visited page and helps site owners understand how users arrived at their website.

Regional Long Tail Keyword (RLT) - An RLT is a long tail keyword that includes a location, such as a city, region or other geo-location.

Relative URL - A relative URL is the opposite of an absolute URL. Rather than a full address, it is a path resolved against the URL of the current document. For example, the relative path ../folder/file.html is resolved against the referring URL on example.com.
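A sketch of how relative paths resolve, assuming the linking page is the hypothetical https://example.com/shop/shoes/:

```html
<!-- Resolves to https://example.com/shop/bags/ -->
<a href="../bags/">Bags</a>

<!-- A root-relative path resolves to https://example.com/contact/ -->
<a href="/contact/">Contact</a>
```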

Robots.txt - The robots.txt file tells search engines which pages and sections of your website not to crawl. While reputable search engines respect these rules, robots.txt is a request rather than an access control, so don’t rely on it to protect sensitive information.
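A minimal robots.txt, with hypothetical paths, might look like this:

```text
User-agent: *
Disallow: /admin/
Disallow: /staging/

Sitemap: https://example.com/sitemap.xml
```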

Robots Meta Tag - The robots meta tag allows you to use a granular approach to control how individual pages are indexed on your website and served to users in search results. You can use this tag to control what search engine spiders do on your site.

Back to top

 

S is for SEO

Scrape - A technique used by software to extract data from web documents like HTML and XML files. Scraping can help a lot in competitive analysis, as well as in compiling structured data such as titles, keywords and content categories.

Search Engine (SE) - This is the base of all SEO activities and is the most common tool used to search for information online. A search engine is powered by a set of algorithms that proactively find resources on the World Wide Web.

SERPs - Search engine results pages (SERPs) are pages that show search results after a user enters a query in the search engines. When people say that their website appeared on a particular page in the SERPs, they are talking about the results for a specific keyword.

SERP Features - Gone are the days when a search engine result page was a simple list of websites. Now, SERP features include rich snippets, sponsored ads, position zero, local packs and more. These features add an in-depth or visual layer to an existing result page.

Search Engine Spam - Some pages are created to be deceptive and cause search engines to serve incorrect or non-relevant content after a keyword search. Search engine spamming sometimes gets confused with actual SEO.

SEM - Search engine marketing is the use of paid advertising to gain visibility in search engines. It is the combination of PPC ads and SEO techniques.

SEO - Search engine optimisation (SEO) techniques are used in influencing the position of a website or web page in the search listings. SEO processes help improve a website’s rank and visibility in search engines. These techniques include keywords, quality content, title tags, backlinks, site speed optimisation and much, much more!

Social Bookmark - Social bookmarking sites collect links for public access. A centralised service allows users to add, annotate, edit and share bookmarks of web resources. A popular example was Delicious, a social bookmarking and tagging website.

Social Media - Social media communities are online websites that allow users to create, share and exchange information. Social networking is a user-generated content haven and a great tool for SEO. Nowadays, many brands depend on social media to engage their users, drive visibility and loyalty.

Self-referencing Canonical - Using these tags is an easy way to fix duplicate content and can be a quick SEO boost for your website. Another great argument for using a self-referencing canonical on every page is that when scrapers grab your content wholesale, the copied canonical tag still points back to your original page, helping Google identify the original piece if it is reused multiple times.
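For example, on a hypothetical page at https://example.com/red-shoes/, the self-referencing canonical in the head would be:

```html
<link rel="canonical" href="https://example.com/red-shoes/">
```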

Spam - Simply defined as unsolicited advertisements and messages. Spam historically started via email marketing and has since spread to other areas across the Internet. The content contained in spam messages is normally irrelevant and delivered to a large number of users for advertising, spreading malware, phishing, etc.

Spam Ad Page - A spam ad page is a page specifically made for advertising. These types of pages use machine-spun text for content and offer zero value to readers.

Spider - Spiders are computer programs created by search engines to crawl websites. The purpose is to index the content and web pages of the website.


Spam Score - A 17-point scale coined by Moz to attribute a value to the overall ‘spaminess’ of any given website based on a range of on-page and off-page metrics. Low means good, high means bad!

Spider Trap - Also known as a crawler trap, spider traps are web pages, or part thereof, which cause a search bot or web crawler to make an infinite loop. This might sometimes be unexpected due to dynamic URLs of pages that keep looping. Spider traps can impede a crawler’s ability to efficiently crawl and index your web assets and vastly reduce your website's ranking potential.

Static URL - This is the opposite of a dynamic URL: the URL of a web page remains the same even when the content changes. Search engines favour static URLs, which is why web administrators use a method called “URL rewriting” to make dynamic URLs cleaner and more static.

Staging Domain - It is best web development practice to create a staging area for every website so that changes can be tested before deployment. A staging domain can be a subdomain of the main domain and should be restricted to prevent search engines from crawling the staging site. This is necessary because it essentially duplicates the content of the main site.

Schema.org - Schema.org is a shared vocabulary for describing content to search engines. It is made up of tags (or microdata) that can be used to add structured data to your HTML. This data helps improve the way your page is understood and presented in the search engines.

Structured Data - With structured data, websites can give Google more information about their web pages, such as author names, contact information, phone numbers and more. Structured data categorises your content, making it easier for search engines to index and surface.
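A small JSON-LD sketch using the Schema.org LocalBusiness type; all values are placeholders:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Bakery",
  "telephone": "+353-1-000-0000",
  "address": {
    "@type": "PostalAddress",
    "addressLocality": "Dublin",
    "addressCountry": "IE"
  }
}
</script>
```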

Back to top

 

T is for Time on Page

Text Link - This is a plain HTML link that does not include any programming code or graphic Element. It is the hyperlink in an anchor text.

Time On Page – “Time on page” is how long a user spends on a web page before moving to another page or leaving the website. It is usually measured by analytics tools and can indicate the quality or relevancy of the page’s content.

Title - All content or pages have titles. The title is what appears in search engine results and is the first thing users will see when they enter a search query. It is what is included in the title tag of a web page.

Title Tag - This is an HTML element that signifies a web page name. Title tags are shown in the SERPs as clickable text links for a given result and are necessary for SEO, usability, and social sharing.

TLD - Top-level domains (TLDs) are high-level domain extensions in the hierarchical Domain Name System. A TLD is usually located to the right of the dot “.”. For example .com and .net.

Traffic - Your website visitors are your “traffic.” Bots and spiders that crawl your website are also calculated as traffic.

Traffic Rank - Traffic rank is the measurement of how much traffic your website gets, compared to all other sites on the Internet.

Trust Authority - This is the ranking of a website based on its expertise, authoritativeness, and trustworthiness. These types of websites offer valuable content that increases the level of trust in users and search engines.

Back to top

 

U is for User Generated Content

URL - A URL is an acronym for Uniform Resource Locator. It provides a way to find a resource on the internet using a hypertext system to designate a web address. The first part of a URL identifies what type of protocol is in use (HTTP or HTTPS), while the second part identifies the domain or IP address of the resource.

User Generated Content (UGC) - As the name suggests, this is content that is generated by users. For example, various social communities, vlogs, forums, etc. heavily rely on the content created and shared by its community. UGC can be in the form of posts, videos, or images.

Back to top

 

V is for Voice Search

Voice Search - Searching by voice is a major topic for forward-thinking SEO professionals. When users speak into a device and request an online resource, they have effectively carried out a voice search. Websites must now be optimised for natural language and to accommodate voice search queries. This means that content should also cover terms that people speak and not just what people type into search engines.

Back to top

 

W is for White Hat

Web 2.0 - Web 2.0 does not refer to any update or new technical specification, but to changes in the way we design and use web pages. It is used to describe a new generation of the internet that is focused on collaboration and usability of web properties. At its core, Web 2.0 is the transition from static HTML to dynamic websites.

White Hat - SEO practices that fall into the best-practices and guidelines set forth by Google are known as White Hat. These SEO methods are safe to use and will not get a website blocked or blacklisted. Utilising white hat SEO will also help you generate genuine traffic on your website.

Widget - These are special applications on web pages that allow users to perform specific activities. A widget could be a weight loss calculator, a clock showing the date and time, a breaking news ticker, etc. It is usually not the main content on the page but acts as supplementary content.

Back to top

 

X is for XML Sitemap

XML Sitemap - An XML sitemap is a sitemap presented in XML format. The main function of this sitemap is to collect all the URLs available on a website. It is usually used by search engines to find pages for indexing.

XML Sitemap Index - A sitemap index file is a collection of individual sitemaps. It is a sitemap used to manage multiple XML sitemaps. If you have several sitemaps, you can use an index file to group them. The format of the index file is the same as a regular XML sitemap.
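A sitemap index grouping two hypothetical sitemaps follows this XML structure:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://example.com/sitemap-pages.xml</loc>
  </sitemap>
  <sitemap>
    <loc>https://example.com/sitemap-blog.xml</loc>
  </sitemap>
</sitemapindex>
```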

X Default Tag - This attribute signals to Google that a page is not targeting any specific language or locale and should be regarded as the default page. With this tag, webmasters can specify international landing pages.
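For example, a page with Irish and German alternates might declare the following hreflang annotations (the URLs are placeholders):

```html
<link rel="alternate" hreflang="en-ie" href="https://example.com/ie/">
<link rel="alternate" hreflang="de" href="https://example.com/de/">
<link rel="alternate" hreflang="x-default" href="https://example.com/">
```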

X Robots Tag - The X-Robots-Tag is a component of the HTTP header response for a URL. It can be added to a site’s HTTP responses via server configuration, such as httpd.conf or .htaccess files, and allows webmasters to specify crawling directives at a global level.
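On Apache, for instance, you could apply a noindex directive to all PDFs via .htaccess (assuming mod_headers is enabled):

```apache
<FilesMatch "\.pdf$">
  Header set X-Robots-Tag "noindex, nofollow"
</FilesMatch>
```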

Back to top


Contact Wolfgang Digital

Phone +353 1 663 8020

Email hi@wolfgangdigital.com

HQ Palmerston House Denzille Lane Dublin 2
