This week, a leak about the Google algorithm caused a significant uproar in the SEO world. Here are my perspectives on the matter.
The incident broke out after Ipullrank’s Mike King published a post revealing that internal documentation for Google Search-related APIs had been leaked.
King commented on the leaked information in detail in his article. Afterward, some accused King of speculating, until Google admitted to the leak and, true to its classic habits, said we should not pay much attention to the documents.
You have probably read about this development in many different channels.
Google is one of the most secretive, closely guarded black boxes in the search world. In the last quarter-century, a leak of this magnitude or detail has never been reported from Google’s search division.
Based on my experiences, I commented on the lessons we can learn from this leak. First of all, we need to approach the issue very cautiously.
Three things to keep in mind about the Google leak:
- This leak is not the entire algorithm. Google is a $2 trillion company, and it doesn’t keep its most important trade secrets in a single file that its nearly 200,000 employees can download. If that were the case, a rival would have long since obtained this information by bribing an employee.
- The leak contains no information on how the weights are used. These factors or signals may be included in the algorithm, but it is impossible to tell how and when they are applied.
- Search algorithms are not based on a fixed list like “200 ranking factors”. Even a perfect score on such a list won’t win you an SEO medal. Many factors and signals work together in real time to produce visibility. Therefore, even if a site copies exactly what its competitor does, there is no guarantee of success.
It is impossible to say that the Google Search API Leak will completely change our SEO methods. But if you’re not changing anything, there are only two possibilities:
- You don’t examine what’s in the documents, you don’t test it, and you don’t find reliable sources and follow their research and tests. In other words, you don’t improve based on new information, and that is a mistake I have seen repeatedly throughout my years of doing SEO.
- Before this leak emerged, we were already intuitively familiar with the thousands of potential inputs into Google’s systems.
If an SEO expert with as many years of experience as Michael King is surprised by what is written in these documents, you should be too.
SEO Lessons I Learned from the Google Leak
- Navboost Data: Click data obtained from Google Chrome appears to be important in the ranking algorithms. Google persistently denied this. However, it has now been revealed that this data may be considered when determining the rankings.
- Quality Rater Feedback: We thought Google’s Quality Raters only detected general trends. We were surprised by the possibility that these people’s feedback is used so intensively in the ranking algorithms.
- Toxic Backlinks: The Google algorithm has penalization mechanisms for bad backlinks.
- Site-Wide Title Compatibility: Page titles should be consistent across the site. Don’t dismiss this as an outdated practice; align your titles with the search queries you target.
- Likelihood of Google Ads Affecting Organic Results: If Google Chrome data is used in ranking, paid ads may also influence organic rankings.
- Authors and EEAT: According to the leaked documents, Google values authors’ credentials and experience on the subject. Establishing author credentials and working with writers who are experts in their fields can provide an important advantage.
It would be beneficial to remember this information in your future SEO efforts. Even though Google tells you to ignore these, what I shared above is an essential lesson for all of us.
Having said all this, I must also note that Google is obsessively focused on AI Overviews, developed in response to Bing and OpenAI in the artificial intelligence race, and has no intention of backing away from them. So this leak may not have bothered Google as much as we thought; according to the company, it represents an older, outdated algorithm. I will share developments on this issue in the next section.