Google has been the most widely used search engine for over a decade and has established itself as one of the world's tech giants. In my view, the key to its success is the quality of its search results: unlike other search engines, it focuses on surfacing web pages with actual content rather than redirect sites.

As the market leader, Google is the prime target of web publishers, content writers, and bloggers who want a better rank for their webpage or blog so as to increase their audience reach. Like every other search engine, Google uses a special algorithm to find the information relevant to a search. Though the specifics are a secret kept by the company, a general overview of how the algorithm works can be shared.


Like any other search engine, Google has automated bots called "crawlers", sometimes also referred to as "spiders". Whenever a search is performed, Google looks up its existing database of content, known as the index. As spiders crawl the web, they add content to the index and keep its pages as up to date as possible; this part of the process is known as indexing.

A crawler finds new data through an algorithmic process: it begins with a list of URLs from previous crawls, visits those pages, and adds any new sites it discovers to the list of sites it has crawled.
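The crawl-and-index loop described above can be sketched as a simple breadth-first traversal. This is a minimal illustration, not Google's actual crawler; the in-memory link graph and URLs below are hypothetical stand-ins for real web pages.

```python
from collections import deque

# Hypothetical in-memory link graph standing in for the web:
# each URL maps to the links found on that page.
LINK_GRAPH = {
    "https://example.com/": ["https://example.com/a", "https://example.com/b"],
    "https://example.com/a": ["https://example.com/b", "https://example.com/c"],
    "https://example.com/b": [],
    "https://example.com/c": ["https://example.com/"],
}

def crawl(seed_urls):
    """Visit pages breadth-first, starting from URLs seen in previous crawls."""
    frontier = deque(seed_urls)   # list of URLs from previous crawls
    seen = set(seed_urls)
    index = []                    # pages added to the index, in crawl order
    while frontier:
        url = frontier.popleft()
        index.append(url)                     # "indexing": store the page
        for link in LINK_GRAPH.get(url, []):  # discover new pages via links
            if link not in seen:
                seen.add(link)
                frontier.append(link)
    return index

print(crawl(["https://example.com/"]))
# → ['https://example.com/', 'https://example.com/a',
#    'https://example.com/b', 'https://example.com/c']
```

A real crawler would fetch each page over HTTP, parse its links, and respect robots.txt, but the discover-visit-record cycle is the same.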


Google has revealed one of its algorithms, called PageRank, which assigns each web page a relevancy score. The score depends on several factors:

  1. The frequency of keywords within a web page; a page with few keywords receives a low score.
  2. A web page that has been around for some time with a good history is valued more by the algorithm.
  3. The number of web pages linking to a particular site; the more inbound links, the higher the value.
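The third factor, links, is the heart of the published PageRank idea: each page passes a share of its own score along its outgoing links, so pages that many others link to accumulate a higher score. Here is a minimal sketch of that iteration over a small made-up link graph (the damping factor of 0.85 is the commonly cited value).

```python
def pagerank(links, damping=0.85, iterations=50):
    """Iteratively distribute each page's score across its outgoing links."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}  # start with equal scores
    for _ in range(iterations):
        # every page keeps a small base score, plus shares from inbound links
        new_rank = {p: (1.0 - damping) / len(pages) for p in pages}
        for page, outlinks in links.items():
            if not outlinks:
                continue
            share = damping * rank[page] / len(outlinks)
            for target in outlinks:
                new_rank[target] += share
        rank = new_rank
    return rank

# "C" is linked to by both A and B, so it ends up with the highest score.
graph = {"A": ["C"], "B": ["C"], "C": ["A"]}
scores = pagerank(graph)
print(max(scores, key=scores.get))  # → C
```

The real algorithm runs over billions of pages and combines this score with the other factors above, but the core "a link is a vote" mechanism is what this sketch shows.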

Relevance is checked first by the indexer and then processed further by the ranking system. Relevance to a topic is determined through anchor text and other keywords, combining a mix of factors. For example, a page with more matching anchor words and keywords is considered more relevant than others.
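As a rough illustration of the idea that keyword frequency and anchor text both feed into relevance, here is a toy scoring function. The weighting and the sample texts are invented for the example; real ranking systems combine far more signals.

```python
def relevance_score(page_text, anchor_texts, query_terms, anchor_weight=2.0):
    """Naive relevance: count query terms in the page body,
    and count them again in anchor text with a higher weight."""
    body = page_text.lower().split()
    anchors = " ".join(anchor_texts).lower().split()
    score = 0.0
    for term in query_terms:
        term = term.lower()
        score += body.count(term)                     # keyword frequency on page
        score += anchor_weight * anchors.count(term)  # anchor text counts extra
    return score

page = "guide to search engine crawling and indexing"
anchors = ["search engine guide", "crawling basics"]
print(relevance_score(page, anchors, ["search", "crawling"]))  # → 6.0
```

A page whose inbound links describe it with the query's own words scores higher than one that merely mentions them, which is the intuition behind weighting anchor text.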


Another algorithm is designed to prevent manipulation of search results, enforcing the search engine's no-spam policy. The age of the content and of the domain are part of its trust metrics. One heads-up regarding these metrics: avoid having too many links from bad neighbourhoods, as they not only prove useless but also make it hard to earn a good score.


You may have noticed that results vary across devices: your search results on mobile will differ from those on a desktop because of differences in the index used. Results may also differ for localized searches and different formats, and the search engine may show you different results depending on the terms you use.

Understanding how the algorithm works should help bloggers and web publishers create a strategy to improve their rank in the search engine. If the information felt too loaded or overwhelming, feel free to reach out for a helping hand with your queries.
