SEO Basics: Forensic Examination of Search Engines For SEO




Search Engines For SEO:

Search engines are the 'SE' component of SEO; without them you just have the 'O', i.e., optimization. As a rule, SEO Results = Search Engine Understanding x Website/Content Optimization. This means that without a firm understanding of search engines, you will never be able to reach a high rank on the search engine results page (SERP). To that end, in this section of our SEO series we will focus on the 'search engine understanding' component of SEO.

SEO Formula

What Are Search Engines?

Search engines are not overly difficult to understand. As a rule, search engines are fact finders: they find facts by searching the web for information specified in textual search queries. Search results are then presented as a list of results, known as the search engine results page (SERP). In layman's terms, when you type text into Google's search bar, Google trawls through millions of web pages to bring you the most relevant results.

SEO Search Engine Results Page

How Do Search Engines Work?

Search engines work by crawling, indexing and ranking billions of pages on the internet. First, search engines use crawl bots to follow hyperlinks from pages in their existing index to discover new information. Second, information discovered during the crawl phase is added to the search engine's index. Finally, the new information is ranked and appears on SERPs following a relevant textual query. A day in the life of a search engine typically looks like this:

  1. Crawling: Crawl bots visit new or updated pages and record their URLs.

  2. Indexing: Algorithms analyze each page, taking into account user demand and performing quality checks, then store it in the index.

  3. Ranking: The end product is an upgraded or downgraded position on the SERP.

How Search Engines Work
  • Crawling: Crawling is the process of scanning the web continuously. Search engines scour the web with automated programs called crawl bots. The core aim of crawl bots is to seek out pages that are new or have been recently updated. Crawl bots then follow hyperlinks on those pages and store the page addresses (URLs) in the search engine's library to review later. Crawl bots find pages using a mixture of techniques, but the main method is following links from pages they already know about. From an SEO perspective this is critical. Why? Because when a website that search engines already like links to your website, your pages are more likely to be crawled and ranked quickly.
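The "follow links from pages you already know about" behavior can be sketched as a simple breadth-first traversal. This is a toy illustration, not a real crawler: the link graph is an in-memory dictionary standing in for live HTTP fetches, and all page names are invented.

```python
from collections import deque

def crawl(seed_urls, link_graph):
    """Breadth-first crawl: visit known pages, follow their hyperlinks.

    link_graph maps each URL to the URLs it links to (a stand-in for
    fetching a page and extracting its hyperlinks).
    """
    index = set()                 # URLs discovered so far
    frontier = deque(seed_urls)   # URLs waiting to be visited
    while frontier:
        url = frontier.popleft()
        if url in index:
            continue              # already crawled this page
        index.add(url)
        # Follow every hyperlink found on the page.
        for linked in link_graph.get(url, []):
            if linked not in index:
                frontier.append(linked)
    return index

# Hypothetical site: pages that are linked-to get discovered;
# the orphan page with no inbound links is never found.
graph = {
    "home": ["about", "blog"],
    "blog": ["post-1"],
    "post-1": ["home"],
    "orphan": [],   # nothing links here
}
print(sorted(crawl(["home"], graph)))  # 'orphan' is absent
```

Notice that "orphan" never enters the index: this is exactly why earning links from already-indexed sites matters for SEO.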


  • Indexing: Indexing is the process of organizing information into a search engine's massive library. At this stage search engines try their hardest to analyze and understand what the pages they have reviewed are about. Instead of conventional librarians, search engines use code and bots to assign each website to the right category. This information is then stored in the index, ready to be sent to the SERP as and when it is searched for. So, on that note, if your website is unclear or difficult to understand, search engines might miscategorize it and you will either get no traffic or a lot of irrelevant traffic routed to your site.
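At its simplest, an index like this is an "inverted index": a mapping from each word to the pages that contain it. The sketch below is a bare-bones illustration with made-up page names, not how any real search engine stores its index.

```python
def build_index(pages):
    """Build an inverted index: word -> set of pages containing it."""
    index = {}
    for url, text in pages.items():
        for word in text.lower().split():
            index.setdefault(word, set()).add(url)
    return index

# Hypothetical pages: clear text lands under the right words.
pages = {
    "cakes.example": "chocolate cake recipes",
    "cars.example": "electric car reviews",
}
index = build_index(pages)
print(index["cake"])   # only the baking page
```

If a page's text is vague or off-topic, it ends up filed under the wrong words, which is the toy-model version of being miscategorized and receiving irrelevant traffic.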


  • Results: When a user performs a search, the search engine tries to determine the highest-quality results it should deliver. Search engines do this by digging into the index and pulling out the best results using a ranking algorithm. As a general rule, the best results are driven by many factors, including the user's location, language, device (desktop or phone), and previous queries. The list you see is known as the SERP, i.e., the search engine results page. It's important to note that from an SEO perspective search engines do not accept payment to rank pages higher; ranking is done purely algorithmically.
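The idea of combining many ranking factors into one score can be sketched with a toy scoring function. This is purely illustrative: real ranking algorithms weigh hundreds of signals, and the pages, weights, and language boost here are invented for the example.

```python
def rank(pages, query, user_lang="en"):
    """Order candidate pages by a crude relevance score: word overlap
    with the query, plus a small boost when the page matches the
    user's language -- stand-ins for the factors described above."""
    query_words = set(query.lower().split())

    def score(page):
        overlap = len(query_words & set(page["text"].lower().split()))
        lang_boost = 1 if page["lang"] == user_lang else 0
        return overlap + lang_boost

    return sorted(pages, key=score, reverse=True)

# Hypothetical candidates pulled from the index for "pizza recipes".
pages = [
    {"url": "a.example", "text": "best pizza recipes", "lang": "en"},
    {"url": "b.example", "text": "pizza recipes and pasta recipes", "lang": "fr"},
    {"url": "c.example", "text": "car reviews", "lang": "en"},
]
results = rank(pages, "pizza recipes", user_lang="en")
print([p["url"] for p in results])  # most relevant first
```

Here the English page wins on the language boost even though both of the first two pages match the query words, mirroring how user context (location, language, device) shifts the final order.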


Google Algorithm:

Currently Google dominates the search engine landscape, with a whopping 90% market share. This is why SEO is increasingly referred to as 'Google Optimization'. However, there's no need to worry if your customer base comes from Bing, Yahoo, or other engines: what works in SEO terms for Google generally works on other search engines too.


Google uses a series of algorithms to rank web pages on its SERPs. Google's aim is to measure the importance of a website. It does this by counting the number and quality of links to a page, among many other things (see below). Doing this gives Google a good estimate of how important a website is.
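The "number and quality of links" idea can be illustrated with a toy calculation. This is a drastic simplification of link-based scoring such as PageRank; the page names and quality weights below are invented for the example, not real data.

```python
def link_score(inbound_links):
    """Estimate a page's importance from its inbound links.

    inbound_links: list of (linking_page, quality) pairs, where quality
    is a 0-1 weight for how reputable the linking page is. The score
    reflects both the NUMBER of links and their QUALITY.
    """
    return sum(quality for _, quality in inbound_links)

# One link from a trusted site can outweigh several weak links.
trusted = [("news.example", 0.9)]
spammy = [("spam1.example", 0.1),
          ("spam2.example", 0.1),
          ("spam3.example", 0.1)]
print(link_score(trusted) > link_score(spammy))  # True
```

This is the intuition behind chasing quality backlinks rather than sheer volume: three low-quality links score less than one reputable link.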



Google’s Algorithm is a term that refers to all the individual algorithms, machine learning systems and technologies Google uses to rank websites. Although no one knows the exact algorithm, we do know that minor changes occur daily, weekly and monthly. What's more, Google typically rolls out major algorithm updates once yearly.


In order to feed the algorithm, it's important that we follow some best practices:

  • Regularly publish keyword optimized content

  • Build as many legitimate quality backlinks as possible

  • Make sure that your website is both user and search engine friendly





