Source: Medium

How Does Google Deliver Fast, Reliable Results?

There are over 70,000 searches made on Google every second, which translates to roughly 5.8 billion searches per day and about 2 trillion per year. And, as per Statista, "Google has dominated the search engine market, maintaining an 86.86% market share as of July 2020."

Google has achieved this milestone by continuously delivering the most relevant results in the shortest time, typically a fraction of a second. Since the very beginning, Google has invested in improving the quality of its answers through a complex stack of algorithms and technologies, for example RankBrain, which helps guess what you're looking for even if you don't type it in. To watch the effect on your own site, you can launch Rank Tracker, sync it with your Google Analytics account, switch to Organic Traffic, and monitor your site's traffic changes.

FAQ 1: How can I trust the information on Google?

Search engines drive the majority of online traffic, ahead of other channels like paid advertising and social media. SEO offers roughly 20x more traffic opportunity than paid advertising, on both mobile and desktop. Organic search results also cover more digital property (digital real estate), are seen as more credible, and receive far more clicks than paid ads; for example, fewer than 10% of people click on paid ads. SEO is, without doubt, one of the most cost-effective marketing channels, and if implemented with sincere intentions it pays off over time. Because no matter what you might have read online, Google is still the good guy: it rewards solid pieces of content by ranking those pages for the right keywords, and its old company motto, "Don't Be Evil," still drives its fight against websites that try to game its systems or its users.

TBH, Google is the smartest search engine out there, but it still needs a little help when it comes to properly indexing and displaying your content within search results.

How do search engines work?

Search engines primarily have three functions:

  • Crawl: Scour the internet for content, new or old.
  • Index: Store and organize the content found during crawling so that it can be served as a result for a relevant search query.
  • Rank: Order the indexed pages by relevance to the search query.
Google web crawling (Source: NeilPatel)

What is Crawling?

The first job of Google is to CRAWL the web with SPIDERS. The search engine sends out a team of spiders (crawlers or robots) to find new and old (updated) content. The format of the content can vary (webpage, video, image, PDF, etc.), but the discovery of content always happens through links. These spiders learn about what you do, who you are, and who may be interested in finding you. The loop works roughly like this (see the sketch after this list):

  • The crawler starts by fetching a few web pages
  • It follows the links on those pages to find new URLs
  • Doing this helps search engines find new content and add it to their index, called Caffeine, a massive database of discovered URLs
  • These URLs are retrieved later whenever they look like a good match for a search
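To make the loop concrete, here is a minimal crawl sketch in Python. It is illustrative only, not how Googlebot actually works: the URL queue, the politeness check, and the parsing are all heavily simplified, and the third-party requests and beautifulsoup4 packages are assumed to be installed.

```python
# A minimal crawl loop: fetch a page, harvest its links, repeat.
# Real crawlers add politeness delays, deduplication at massive scale,
# distributed queues, and far more robust parsing.
from collections import deque
from urllib.parse import urljoin
from urllib.robotparser import RobotFileParser

import requests
from bs4 import BeautifulSoup

def crawl(seed_url: str, max_pages: int = 10) -> set:
    # Honor the site's crawler directives published in robots.txt.
    robots = RobotFileParser(urljoin(seed_url, "/robots.txt"))
    robots.read()

    queue, seen = deque([seed_url]), set()
    while queue and len(seen) < max_pages:
        url = queue.popleft()
        if url in seen or not robots.can_fetch("*", url):
            continue
        seen.add(url)
        try:
            html = requests.get(url, timeout=5).text
        except requests.RequestException:
            continue
        # Discovery is always by links: queue every <a href> we find.
        for a in BeautifulSoup(html, "html.parser").find_all("a", href=True):
            queue.append(urljoin(url, a["href"]))
    return seen

print(crawl("https://example.com"))
```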

What is a Search Engine Index?

Google is the most popular search engine, holding well over 70% of the search market (86.86%, per the Statista figure quoted above). Why? It finds and records more information and delivers the most accurate results faster than its competitors. Indexing makes sure the recorded information is organized so that super-fast, relevant responses (within fractions of a second) are possible.
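The core data structure behind indexing is the inverted index: a map from each term to the documents that contain it, so a query never has to scan every page. A minimal sketch in plain Python (Caffeine itself is vastly more sophisticated; this only shows the idea):

```python
from collections import defaultdict

# A tiny corpus standing in for crawled pages.
documents = {
    1: "google crawls the web with spiders",
    2: "the index stores and organizes crawled content",
    3: "ranking orders pages by relevance to the query",
}

# Inverted index: term -> set of document IDs containing that term.
index = defaultdict(set)
for doc_id, text in documents.items():
    for term in text.lower().split():
        index[term].add(doc_id)

def lookup(query: str) -> set:
    """Answer a query with a fast set intersection instead of a full scan."""
    hits = [index.get(term, set()) for term in query.lower().split()]
    return set.intersection(*hits) if hits else set()

print(lookup("crawled content"))  # -> {2}
```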

Search Engine Ranking

When someone enters a search keyword or phrase, the search engine scours its organized index for relevant content that answers the searcher's query. The order of these results, by relevance to the search, is known as ranking: the more relevant the content, the higher the website ranks for that query (a toy scoring example follows the list below). If a website is not showing up anywhere in the SERPs, it could be the result of one or more of the following:

  1. The site is brand new and hasn't been crawled yet.
  2. Search engines couldn't find any link to the site from another genuine website through which they would have reached it.
  3. The website's navigation is hard for a crawler to crawl effectively.
  4. The website contains some basic code, called crawler directives, that blocks search engines (the crawl sketch above honors these via robots.txt).
  5. The website has been penalized by the search engine for spammy practices.
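To make "order by relevance" concrete, here is a toy TF-IDF scorer. Real ranking weighs hundreds of signals (links, freshness, page experience, and more); this sketch scores only on term frequency and term rarity, using the scikit-learn package, which is assumed to be installed.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# A stand-in index of three pages.
pages = [
    "easy pasta recipe with fresh tomatoes and basil",
    "a step by step guide to weeknight pasta dinners",
    "used car prices and financing tips for buyers",
]

# Vectorize the pages, then score each one against the query.
vectorizer = TfidfVectorizer()
page_vectors = vectorizer.fit_transform(pages)
query_vector = vectorizer.transform(["pasta recipe"])
scores = cosine_similarity(query_vector, page_vectors).ravel()

# Print pages in ranked order, most relevant first.
for rank, i in enumerate(scores.argsort()[::-1], start=1):
    print(f"{rank}. score={scores[i]:.3f}  {pages[i]}")
```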

Why Does Google Keep Changing Its Algorithm?

Google claims that it updates its search algorithm several thousand times a year. Although the majority of these updates are too minor to notice, they do disrupt the SEO tactics that experts have become comfortable with.

What does Google want? Well, the idea behind a search engine is to improve the quality of search, and Google has been doing that better than anybody else and intends to keep doing so. For details, read Google's Quality Guidelines or the Search Quality Rater Guidelines.

Below is a list of the seven most crucial search algorithm changes: why they were introduced and how they work.

7 Search Algorithm Updates

1. Panda (February 23, 2011; named after Google engineer Navneet Panda)

Problems: Plagiarized or thin content; keyword stuffing; duplicate content; user-generated spam

How it works: While ranking websites, it keeps a list of questions in mind, most of which relate to the content, as stated below:

  • Would you trust the information presented in this article?
  • Is this article written by an expert or enthusiast who knows the topic well, or is it more shallow in nature?
  • Does the site have duplicate, overlapping, or redundant articles on the same or similar topics with slightly different keyword variations?
  • Does this article have spelling, stylistic, or factual errors?
  • Does the article provide original content or information, original reporting, original research, or original analysis?
  • Does the page provide substantial value when compared to other pages in search results?
  • How much quality control is done on content?
Sources: SearchEngineJournal, infidgit

The Panda algorithm update: Panda assigns each website in the index a quality score based on the questions above, and that score then feeds into the website's ranking. The effects of the algorithm have been in full swing since 2016, when it was folded into the core algorithm, and penalties and recoveries now happen faster.
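One way to picture this is a checklist turned into a score. Google has never published Panda's actual signals or weights, so everything below is invented purely for illustration:

```python
# A hypothetical quality scorer in the spirit of Panda's checklist.
# Every signal name and weight here is invented for illustration.
QUALITY_SIGNALS = {
    "written_by_expert": 0.25,
    "original_research": 0.25,
    "no_duplicate_articles": 0.20,
    "no_spelling_errors": 0.15,
    "adds_value_over_serp": 0.15,
}

def quality_score(page_signals: dict) -> float:
    """Sum the weights of every checklist item the page satisfies."""
    return sum(weight for name, weight in QUALITY_SIGNALS.items()
               if page_signals.get(name))

page = {"written_by_expert": True, "no_spelling_errors": True}
print(f"quality score: {quality_score(page):.2f}")  # 0.25 + 0.15 = 0.40
```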

Source: Neil Patel

Note To Self: A lot of potential penalties can be avoided by analyzing the pages of the top competitors and maintaining your content accordingly, keeping plagiarism as close to zero as possible.

2. Penguin (April 24, 2012)

Problem: Spammy or irrelevant links; links with over-optimized anchor text.

How does it work? Its major objective is to downrank sites whose backlinks don't look reliable, putting an end to links bought from link farms and PBNs (private blog networks).

Penguin Google algorithm update (Source: SearchEngineJournal)

Note to self: Investigate links with a risk score above 50%; if they turn out to be malicious, add them to a disavow file, download it, and submit it to Google's Disavow Links tool.
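The disavow file format is documented by Google: one rule per line, '#' for comments, and a 'domain:' prefix to disavow an entire domain. A small helper can assemble it; the risk scores below are assumed to come from whatever link-audit tool you use:

```python
# Build a disavow.txt from a link audit. The audit records are hypothetical;
# the output format (one rule per line, '#' comments, 'domain:' prefix)
# is Google's documented disavow file format.
risky_links = [
    {"domain": "spammy-link-farm.example", "risk": 92},
    {"domain": "legit-blog.example", "risk": 12},
    {"domain": "pbn-network.example", "risk": 78},
]

RISK_THRESHOLD = 50  # investigate anything above this, per the note above

with open("disavow.txt", "w") as f:
    f.write("# Links found malicious during the link audit\n")
    for link in risky_links:
        if link["risk"] > RISK_THRESHOLD:
            f.write(f"domain:{link['domain']}\n")
```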

3. Hummingbird (August 22, 2013)

Problem: Keyword stuffing; low-quality content.

How does it work? This algorithm interprets the intent of the search query, not just the keywords, and provides results accordingly. It can rank a page for a query even if the page doesn't contain the exact keywords the searcher entered. This is possible with the help of latent semantic indexing, which can recognize that a term and a piece of content mean the same thing even when they have no keyword in common.
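Classic latent semantic indexing is a low-rank factorization of the term-document matrix: documents and queries are projected into a small "concept" space where co-occurring terms cluster together. A minimal sketch with scikit-learn (assumed installed); this is textbook LSI, not Google's actual implementation:

```python
from sklearn.decomposition import TruncatedSVD
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity
from sklearn.pipeline import make_pipeline

docs = [
    "pasta recipe with tomato sauce, an easy dinner recipe",
    "easy dinner ideas: cook pasta with tomato sauce",
    "used car prices and auto financing tips",
    "auto loan rates for a new or used car",
]

# Project documents and the query into a shared low-dimensional
# concept space; terms that co-occur end up near each other there.
lsi = make_pipeline(TfidfVectorizer(), TruncatedSVD(n_components=2))
doc_vectors = lsi.fit_transform(docs)
query_vector = lsi.transform(["dinner recipe"])

print(cosine_similarity(query_vector, doc_vectors))
```

In the concept space, the second cooking page should score close to the query even though it never contains the word "recipe", because its vocabulary co-occurs with "recipe" elsewhere in the corpus.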

Note To Self: Analyze the concepts, synonyms, and co-occurring terms behind a keyword. Create comprehensive content that matches the searcher's intent; it will win both in terms of SEO and engagement.

4. Mobile (April 21, 2015)

Problem: Poor Mobile usability

How does it work? The update (refreshed in 2018 and 2020) focuses on the mobile-friendliness of websites, ranking them based on how fast and user-friendly they are on mobile devices.

Mobile Google algorithm update (Source: CoreOnlineMarketing)

How to adjust? Optimize your pages for mobile usage, making sure they are fast and easy to use.
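A first-pass mobile-friendliness check is easy to automate. The snippet below tests only for a responsive viewport meta tag, which is just one of many signals Google's own Mobile-Friendly Test evaluates (requests and beautifulsoup4 assumed installed):

```python
import requests
from bs4 import BeautifulSoup

def has_responsive_viewport(url: str) -> bool:
    """Crude check: does the page declare a mobile viewport meta tag?"""
    html = requests.get(url, timeout=5).text
    tag = BeautifulSoup(html, "html.parser").find("meta", attrs={"name": "viewport"})
    return tag is not None and "width=device-width" in tag.get("content", "")

print(has_responsive_viewport("https://example.com"))
```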

5. RankBrain (October 26, 2015; reportedly Google's third most important ranking factor)

Problem: Lack of query-specific relevance; shallow content; poor UX.

How it works: RankBrain is part of Google's Hummingbird algorithm. Based on machine learning, it helps Google understand the meaning behind a search and then display the best-matching results, taking into account synonyms, implied words, and personal search history.
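Google hasn't published RankBrain's internals, so anything concrete is speculation. But the core idea, re-ranking candidates by learned relevance signals rather than by exact keyword match alone, can be caricatured like this (every signal and weight below is invented):

```python
# A toy re-ranker in the spirit of RankBrain. All signals and weights are
# invented for illustration; the real system is a learned model.
candidates = [
    {"url": "/pasta-guide", "keyword_match": 0.9, "historical_ctr": 0.02},
    {"url": "/dinner-ideas", "keyword_match": 0.4, "historical_ctr": 0.21},
]

def learned_score(page: dict) -> float:
    # Pretend these weights were learned from user-interaction data.
    return 0.4 * page["keyword_match"] + 3.0 * page["historical_ctr"]

# The page users actually engage with can outrank the better keyword match.
for page in sorted(candidates, key=learned_score, reverse=True):
    print(page["url"], round(learned_score(page), 3))
```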

RankBrain Google algorithm update (Source: BackLinko)

Note To Self: Optimize your pages for relevance and comprehensiveness with the help of competitive analysis.

6. BERT (October 22, 2019)

Problem: Poorly written content; lack of context, lack of focus.

How it works: This update uses natural language processing (NLP) to better understand searches, interpret text, and identify entities. BERT is the culmination of Google's effort to comprehend more of the similarity between a search and a search result.
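You can experiment with the underlying model family yourself. A minimal sketch using Hugging Face's transformers library (assumed installed along with a backend such as PyTorch); this demonstrates BERT's masked-word prediction, not Google Search's actual use of the model:

```python
from transformers import pipeline

# BERT predicts a masked word from its full left-and-right context,
# which is the skill that lets it disambiguate ambiguous queries.
fill = pipeline("fill-mask", model="bert-base-uncased")

for result in fill("I deposited the check at the [MASK]."):
    print(f"{result['token_str']!r}  score={result['score']:.3f}")
```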

BERT Google algorithm update (Source: DeepLearninganlytics)

Note To Self: See a guide on using entities in SEO.

7. Core Updates (2017 – Present)

How it works: Since as far back as 2017, Google has referred to its bigger updates as core updates. There is even less transparency about what these updates change, presumably because otherwise people would find tricks and tactics around them too. These core updates may simply be improvements on the previous ones.

Google algorithm core updates (Source: Medium)

Conclusion

When search engines were just beginning their journey, still learning user behaviour and language, some people went against the quality guidelines using tricks and tactics. Take keyword stuffing: to rank for a keyword like "food recipe", you might add the keyword a bunch of times in hopes of attracting traffic, such as:

We offer the best “food recipes”. Ask a “food recipe” we know it. Keep scrolling for amazing “food recipes”.

This trick leads to a bad user experience, and while there's a fair chance it worked in the past, it goes against the delivery of quality content to the viewer. That is exactly what Google has been fighting against and will continue to fight: a kind of war against web spam and black-hat SEO.
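Stuffing is also easy to detect mechanically, which is part of why it stopped working. A crude keyword-density check (the "suspicious" threshold is arbitrary, chosen only for illustration):

```python
def keyword_density(text: str, keyword: str) -> float:
    """Fraction of words that contain the keyword (crude, for illustration)."""
    words = text.lower().split()
    hits = sum(1 for word in words if keyword.lower() in word)
    return hits / len(words) if words else 0.0

stuffed = ('We offer the best "food recipes". Ask a "food recipe" we know it. '
           'Keep scrolling for amazing "food recipes".')

density = keyword_density(stuffed, "recipe")
print(f"density: {density:.0%}")  # natural copy is usually a few percent
if density > 0.10:  # arbitrary illustrative threshold
    print("suspiciously high keyword density")
```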

For the past few years, Google has been rolling out several core algorithm updates with the goal of delivering relevant, authoritative content to its users, especially when searches involve very sensitive topics such as medical, political, financial, and legal matters. What makes these algorithms even more interesting is E-A-T (expertise, authoritativeness, and trustworthiness).
