A/B testing is part of User Experience (UX) research to optimise conversion rates on a website. Before the testing phase, UX professionals research the target audience, competitors' websites, and the existing page to develop hypotheses. The experts then create two new versions, A and B, of the site to test each one's effectiveness before deciding which version to use. The testing and analysis phase is continuous as new technologies become available and users' behaviour changes.
An algorithm is a sequence of steps you follow when solving a problem. IT professionals use algorithms in several ways. Search engines use algorithms to find the pages to display in your search results. Two elements of Google's search process are the search string (the keywords you type into the search bar) and the search operator (used to look up related, indexed pages). The search engine uses a mixture of algorithms and various ranking factors to deliver web pages ranked by importance on its search engine results pages.
Some algorithmic changes may go entirely unnoticed. However, the influence of a significant algorithmic change can usually be seen relatively quickly, though a change occasionally takes a couple of weeks to roll out completely. Algorithmic changes reach users in three ways.
An alt attribute, also referred to as alt text, is HTML markup that provides information used by search engines and screen readers (for blind and visually impaired people) to convey an image's contents when the image cannot be displayed.
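As a practical illustration, a page's images can be audited for missing alt text programmatically. The sketch below uses Python's standard-library HTML parser; the class name and the sample markup are hypothetical.

```python
from html.parser import HTMLParser

class MissingAltChecker(HTMLParser):
    """Collects the src of every <img> tag that lacks a non-empty alt attribute."""
    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            if not attrs.get("alt"):
                self.missing.append(attrs.get("src", "(no src)"))

page = '<img src="logo.png" alt="Company logo"><img src="chart.png">'
checker = MissingAltChecker()
checker.feed(page)
print(checker.missing)  # images with no alt text: ['chart.png']
```

Accessibility audit tools apply the same check across every page of a site.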
AMP pages have been optimised for mobile browsing by stripping non-essential HTML code, allowing the page to load faster on mobile devices. An AMP page shows only the crucial information on the page, displaying less content than the desktop version.
The discipline of gathering, examining, and reporting data, used to understand what has (or hasn't) worked previously and to inform future action to increase your site's traffic.
Anchor text is the clickable text of a link to another page or website. By default, anchor text is blue and underlined, but you can change it to match your website's colours and styles.
App Store Optimization (ASO) improves an app's visibility in an app store. App stores rank each app according to specific criteria. Similar to SEO, ASO has on-page ranking factors like app title, description, and search keywords. Off-page ASO includes considerations like the number of downloads, conversion rate, reviews, and crash rate.
The average time a user spends on a website is known as the average visit duration. Across all websites, the average visit duration is between two and three minutes; an average visit duration of over three minutes is considered good. Many factors influence the average visit duration, so you should always combine it with other metrics in your website analysis.
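The metric itself is a simple mean over session lengths. A minimal sketch, using hypothetical sample data in seconds:

```python
# Session durations in seconds for one day of traffic (hypothetical sample data).
sessions = [45, 180, 320, 95, 610, 150]

average_visit_duration = sum(sessions) / len(sessions)
minutes, seconds = divmod(round(average_visit_duration), 60)
print(f"Average visit duration: {minutes}m {seconds}s")  # 3m 53s
```

Analytics platforms compute this over far larger samples, often excluding single-page ("bounce") sessions, which is why reported figures vary between tools.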
B2B is a marketing strategy where a business concentrates on selling products or services to other businesses rather than individuals.
B2C refers to a business that sells directly to the customer rather than another business.
A backlink is a link from another site, directing users to your site or page. You can attract backlinks by sharing content that is interesting and relevant. Although backlinks indicate a site's popularity, poor quality or irrelevant backlinks harm a site's rankings.
Baidu is the dominant Chinese internet search engine company, the equivalent of Google in the United States. It is similar to Google and offers similar products and services, but its main focus is China, where it controls most of the search market.
Bing is Microsoft's search engine. Bing launched in June 2009, replacing Microsoft Live Search (previously MSN Search and Windows Live Search). Since 2010, Bing has driven Yahoo's natural search results as part of the Microsoft and Yahoo search deal struck in July 2009. Bing can understand various types of media data, e.g. video and audio.
A black box is an intricate computer program whose workings require further analysis to understand. Inputs and outputs can be observed, but there is no access to the method itself because of its proprietary nature. For example, Google's algorithm is a black box.
Black hat SEO is attempting to improve a site's rankings by violating the terms and conditions set out by the search engines. Black hat SEO isn’t easy to police, but most search engines find these sites and penalise them in the search rankings.
A Web crawler, sometimes called a spider, spiderbot, or simply crawler, is an automated online robot (or bot) that systematically browses the World Wide Web, indexing and saving information, content, and data. Crawlers are typically operated by search engines for Web indexing (web spidering) so that indexed websites can appear in search engine results.
A navigational component that quickly shows users where they are within a website.
A tool that temporarily stores web page assets, such as images, to reduce the time it takes to load future pages.
A call to action is an encouragement, invitation, or request for a user to complete a certain action. The "Buy now" button is an example of a call to action. "Register", "Sign-up", or "contact us" are also examples of calls to action. An effective CTA is visible without being intrusive.
A canonical tag tells a search engine which URL is the master copy of a page, consolidating duplicate URLs. Often backlinks from social media or other outside sources create unique URLs for the same page. Search bots see these as separate pages on the same site with duplicate content. The canonical tag designates one URL as the master, preventing your site from being penalised for duplicate content.
Cascading Style Sheets describe how HTML elements (e.g., colour, fonts) should appear on Web pages and adapt when viewed on different devices.
CSS sprites combine multiple images into one image file used on a website to help with performance.
ccTLD stands for “country code top-level domain”. For example, a company based in the UK would have a website ending in .uk.
A content delivery network (CDN) refers to a distributed group of servers that provide fast Internet content delivery.
The rate (shown as a percentage) at which searchers click on a search result. Commonly used to measure the success of an online advertising campaign for a specific website.
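The calculation is clicks divided by impressions, multiplied by 100. A minimal sketch (the function name and figures are hypothetical):

```python
def click_through_rate(clicks: int, impressions: int) -> float:
    """CTR as a percentage: clicks divided by impressions, times 100."""
    if impressions == 0:
        return 0.0  # avoid dividing by zero when a result was never shown
    return clicks / impressions * 100

# A result shown 2,000 times and clicked 90 times:
print(f"{click_through_rate(90, 2000):.1f}%")  # 4.5%
```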
Cloaking sites show altered content or URLs to people and search engines. Cloaking is a violation of Google's Webmaster Guidelines.
Content is words, images, videos, or sounds (or any combination thereof) conveying information to an online audience. It helps your web pages achieve higher rankings in search engines. There are three elements to consider: site structure, keyword strategy, and copywriting.
It is the end of the user's journey through your site. Users usually return to start a new journey.
The rate (shown as a percentage) at which users complete a chosen action (call to action). It is determined by dividing the total number of conversions by the total number of visitors, then multiplying by 100.
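That formula can be sketched directly (the function name and figures below are hypothetical):

```python
def conversion_rate(conversions: int, visitors: int) -> float:
    """Percentage of visitors who completed the call to action."""
    if visitors == 0:
        return 0.0  # no traffic means no measurable rate
    return conversions / visitors * 100

# 38 sign-ups from 1,520 visitors:
print(f"{conversion_rate(38, 1520):.1f}%")  # 2.5%
```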
Cookies are text files with small pieces of data, such as a username and password, used to identify your computer as you use a network. Specific cookies known as HTTP cookies are used to identify particular users and improve your web browsing experience.
A computer program that searches for documents automatically on the web. Search engines employ crawlers to download parts of web pages, checking and validating the code and links. The crawler also indexes the content so that the search engine can retrieve it later.
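The core of crawling is discovering links on a fetched page so they can be queued for later visits. A minimal sketch of that link-discovery step, using only the standard library (the class name and sample markup are hypothetical; a real crawler would also fetch pages over HTTP, respect robots.txt, and throttle requests):

```python
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkExtractor(HTMLParser):
    """Collects the href of every <a> tag, resolved against a base URL."""
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                # Resolve relative links (e.g. "/about") to absolute URLs.
                self.links.append(urljoin(self.base_url, href))

html = '<a href="/about">About</a> <a href="https://example.org/">Out</a>'
extractor = LinkExtractor("https://example.com/")
extractor.feed(html)
print(extractor.links)
```

Each discovered URL would then be fetched in turn, and its content passed to the indexer.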
CRO stands for conversion rate optimisation, the practice of increasing the percentage of users who complete a call to action on a website.
All the numbers representing real customers. SEO professionals analyse this data to optimise a website.
When Google removes a website or webpage, either temporarily or permanently, the page is de-indexed. Google provides a Remove URLs tool in Search Console for voluntary cases. However, Google can also de-index a website as punishment for violating Google's Webmaster Guidelines.
A high number of spammy, artificial, or low-quality inbound links will harm your rankings. You can inform Google to ignore these links with the Disavow tool if there's a legitimate reason why you can't remove them.
Dofollow links allow Google (and other search engines) to follow them and reach a particular website, passing the authority SEO professionals call "link juice."
A website address, typically ending in an extension like .com, .org, or .net. For example, www.google.com is the domain of a website. The domain should have characteristics that make it easy for searchers to remember your website.
The overall "strength" of an internet site, built up over time, can help a replacement page rank well quickly, even before that content has earned links or engagement.SEO software company Moz uses to envisage a website's capability to rank in search results.
Doorway pages, also known as jump, entry, and bridge pages, are web pages created to deliberately manipulate search engine indexes for a higher ranking. A doorway page modifies a search engine's index by adding results for certain phrases while directing visitors to a similar page.
The total amount of time elapsed between a user clicking on a search result and returning to the SERP from a website. Brief dwell time is often an indicator of low-value content. Dwell time is an important ranking signal.
External links are hyperlinks that point at (target) any domain other than the domain the link exists on (source). In layman's terms, if another website links to you, this is considered an external link to your site. Similarly, if you link to another website, this is also considered an external link. External links are an important source of ranking authority.
Favicon is short for a favourite icon. It's a small icon that identifies your page if a user has several tabs open. Facebook's white F on a blue background is an excellent example of a favicon.
How easily the content on a website can be found, both inside and outside the site, by online searchers.
A footer link is a link that appears in the bottom section of a website. Footer links can influence rankings and performance in search results.
The search engine was founded by Larry Page and Sergey Brin in September 1998. Google marked a major departure from human-edited web directories, relying on web crawling technology and a complex algorithm to analyse hyperlinking patterns to rank websites. Google is the most-used search engine in nearly every country in the world.
A Google bomb is intended to make a website rank highly for an unexpected or contentious search phrase. This was achieved by having many websites link to a particular webpage with specific anchor text to help it rank for that term.
The web-crawling system Google uses to find and add new websites and web pages to its index.
Penguin is a Google algorithm first launched in April 2012, followed by a series of updates and refreshes. Penguin's goal was to reduce the visibility of over-optimised sites or sites that abused specific spammy tactics (e.g., building low-quality links, keyword stuffing). In 2016, Penguin started running in real time as part of Google's core algorithm.
Google sandbox is a name given to an observed pattern in the way Google ranks web pages in its index (never confirmed by Google): a "waiting period" that prevents new websites from seeing the full benefit of their optimisation efforts. Typically, this effect is seen with new sites targeting competitive keywords and may only be overcome when the site gains enough authority.
A grey hat is a computer hacker or computer security expert who may violate laws or accepted ethical standards but doesn't have malicious intent.
The default, or introductory, web page of a website is known as the homepage.
Stands for Hypertext Markup Language. HTML tags are specific code elements that are utilised to build web pages, and that can improve SEO effectiveness for web pages and websites.
Generally, when the URL of a page is changed, a 301 redirect permanently forwards the old URL to the new one so that link value is transferred.
302 redirect
A 302 means temporary redirection and is usually the status code given when a site is being serviced. The original page will reopen after a specified time. Link value is not passed to the new page with a 302 redirect.
The 307 redirect is used for temporary routing, when a resource or page is temporarily redirected to another resource rather than permanently.
The difference is that a 302 responds with a "Found" (moved temporarily) message, while a 307 responds with a "Temporary Redirect" message and requires the client to repeat the request with the same method.
400 Bad Request error is an HTTP status code that indicates that the request sent to the server is incorrect and the server cannot understand it.
Specifies that the request was not successful because there is no valid authentication information for the target resource.
The client should not repeat the request with the same credentials. The client can repeat the request with new or different credentials.
If the server wants to make clear that the request has been refused, it should disclose the reason for the rejection to the client.
If the server does not want to present this information to the client, it can use 404 status code instead of 403.
404 not found
Pages displayed when a user tries to reach a page that is not on the website, returned with the 404 error code, are called 404 error pages.
404 error pages must return the 404 status code so that you can identify pages that don't work on your site, and so that search bots don't waste time browsing error pages and can easily discover your original pages.
500 server error
500 server error indicates that the server is experiencing a condition that prevents it from fulfilling the request.
A general error message given when a more suitable message is not found.
503 service unavailable
Indicates that the server cannot perform the request due to overload or maintenance.
The status code 503 also indicates that search engines will return shortly, as the page or site will only be unavailable for a short time.
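The status codes above fall into families by their leading digit: 3xx redirection, 4xx client error, 5xx server error. A minimal sketch of that classification (the function name is hypothetical):

```python
def status_family(code: int) -> str:
    """Classify an HTTP status code into the families discussed above."""
    if 300 <= code < 400:
        return "redirection"
    if 400 <= code < 500:
        return "client error"
    if 500 <= code < 600:
        return "server error"
    return "other"

for code in (301, 302, 404, 503):
    print(code, status_family(code))
```

SEO crawl tools group a site's pages this way to flag broken links (4xx) and availability problems (5xx) separately from routine redirects.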
Indexing in SEO refers to search engines keeping a record of your web pages during the crawling stage.
How easily a search engine bot can understand and add a webpage to its index.
A web page discovered by a crawler that has been added to a search engine's index. It is eligible to appear in search results for relevant queries.
Keywords are ideas and topics that define what your content is about. In terms of SEO, they're the words and phrases that searchers enter into search engines, also called "search queries."
How often a word or phrase occurs in the content of a webpage. There's no ideal percentage that will help a web page rank better.
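The measure is simply occurrences of the keyword as a share of total words. A minimal single-word sketch (the function name and sample text are hypothetical; multi-word phrases would need a different approach):

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Occurrences of a single-word keyword as a percentage of total words."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for w in words if w == keyword.lower())
    return hits / len(words) * 100

sample = "SEO tips: good SEO starts with content, not with keyword stuffing."
print(f"{keyword_density(sample, 'seo'):.1f}%")  # 18.2%
```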
Refers to any webpage that a visitor can navigate to; in marketing, a standalone web page designed to capture leads or generate conversions.
Latent semantic indexing (LSI) is a notion used by search engines to ascertain how a term and content work together to mean the same thing. They don't need to share keywords or synonyms.
A connection between two websites built using HTML code. A link enables users to navigate to websites, social networks, and apps. Links play a critical role in how search engines evaluate and rank websites. A web page can also employ internal links to different pages within the same site or external links to different websites.
Intentionally provocative content that's meant to grab people's attention and attract links from other websites.
Link Equity is a search engine ranking factor based on the idea that certain links pass value and authority from one page to another.
When a group of websites links to each other, usually using automated programs, in the hope of increasing search rankings. A spam tactic. Also known as Link Network, Blog Network, Private Blog Network.
A link profile is the makeup of the links pointing to your site; a strong link profile improves rankings.
A search query is the text entered into a search bar by a user. Search queries fall into three broad categories: navigational, informational, and transactional.
Search volume is the number of times a keyword is used in a search engine over twelve months. Keyword research is a vital part of SEO optimisation. Sometimes a lower-volume, less competitive keyword gives better results than high-volume keywords.
A taxonomy is a group of URLs with a common quality which therefore share relevance with one another. A taxonomy doesn't need to follow a specific URL structure, nor does it have to sit at the same depth as the homepage.
A ballpark estimate of the time a user spent looking at a particular webpage. Pages with high exit rates can significantly skew this data.
Any form of content created by customers or users is a form of UGC. The content can include reviews, videos, comments, blog posts, etc.
Data that is being pulled from multiple speciality databases by search engines to display on the same SERP. Results can include videos, images, shopping, news, and other types of results. Also known as Blended Search.
Any links that are identified as suspicious, deceptive, or manipulative by Google. Google can take manual action on your website if an unnatural link is detected.
A Uniform Resource Locator is a specific string of characters that leads to a resource or specific website on the internet. URL is usually shorthand for the letter-based web address (e.g. www.worldweather.com) entered into a browser to access a webpage.
Information attached to a URL after a question mark to alter the contents of the page or to track information.
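URL parameters can be inspected programmatically with Python's standard library. A minimal sketch (the URL below is hypothetical):

```python
from urllib.parse import urlparse, parse_qs

url = "https://example.com/shoes?colour=blue&size=9&utm_source=newsletter"
query = urlparse(url).query        # everything after the question mark
params = parse_qs(query)           # each key maps to a list of values
print(params)
```

Analytics tools parse parameters like `utm_source` this way to attribute traffic to campaigns.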
How easy a website is to use. Disability enhancements, browser compatibility, site design, and other factors all play a role in improving usability and making your site more user-friendly to as many people as possible.
A user agent is any software that retrieves and presents Web content for end-users or is implemented using Web technologies. User agents include Web browsers, media players, and plug-ins that help retrieve, render, and interact with Web content.
The overall feeling a user is left with after interacting with a brand, its products/services, and its online presence.
The user interface is the way the user interacts with a page but also refers to the overall design of the page. UI includes the colours, sounds, and animations on a page and is also dependent on the device used to access the site.
A type of search that specialises only on a specific topic, type of content, or media. For example, Yelp (business reviews), Amazon (shopping), YouTube (video), Kayak (travel).
Understanding how important optimising your content for search engines is to your business and its online success.
A measure of the clicks a website receives from SERPs for a keyword.
Voice-activated technology allows users to speak into a device (usually a smartphone) to ask questions or conduct a web search.
A webpage is an existing document on the World Wide Web that web browsers can view.
Web pages that are hosted together on the World Wide Web.
A website connects its web pages together to help visitors navigate the site.
Methods to deceive or manipulate search engine algorithms and users. Also known as Spamdexing, Black Hat SEO, Search Spam, Spam
An ethical computer hacker who uses hacking skills to identify security vulnerabilities in hardware, software, or networks.
The number of words that appear in the copy of the content.
A blogging and content management system that is very popular.
Extensible Markup Language is a markup language search engines use to understand website data.
It is an XML file that lists the URLs of a site. This enables search engines to crawl the site more efficiently and to find lost URLs.
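A sitemap file can be generated with a few lines of standard-library Python. A minimal sketch, assuming two hypothetical page URLs (real sitemaps often also include `lastmod` and other optional tags):

```python
import xml.etree.ElementTree as ET

# The xmlns value is the standard sitemap protocol namespace.
NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
urlset = ET.Element("urlset", xmlns=NS)
for loc in ("https://example.com/", "https://example.com/about"):
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = loc

sitemap_xml = ET.tostring(urlset, encoding="unicode")
print(sitemap_xml)
```

The resulting file is typically saved as sitemap.xml at the site root and referenced from robots.txt.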
It is an American web services provider. It has a web portal, search engine and other services such as Yahoo! Mail, Yahoo! News etc.
It is Russia's most popular search engine.