How Many Search Factors Determine Your Organic SEO RANK?

2024 Serving Factors

When a user enters a query, our machines search the index for matching pages and return the results we believe are the highest quality and most relevant to the user’s query. Relevancy is determined by hundreds of factors, which could include information such as the user’s location, language, and device (desktop or phone). For example, searching for “bicycle repair shops” would show different results to a user in Paris than it would to a user in Hong Kong.

Based on the user’s query, the search features that appear on the search results page also change. For example, searching for “bicycle repair shops” will likely show local results and no image results; searching for “modern bicycle,” by contrast, is more likely to show image results but not local results.

In comparison: 2013 Serving Factors

Serving results

When a user enters a query, our machines search the index for matching pages and return the results we believe are the most relevant to the user. Relevancy is determined by over 200 factors, one of which is the PageRank for a given page. PageRank is the measure of the importance of a page based on the incoming links from other pages. In simple terms, each link to a page on your site from another site adds to your site’s PageRank. Not all links are equal: Google works hard to improve the user experience by identifying spam links and other practices that negatively impact search results. The best types of links are those that are given based on the quality of your content.
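The PageRank idea described above can be sketched in a few lines of code: each page’s score grows with the scores of the pages that link to it, and a page divides its “vote” among its outgoing links. The damping factor of 0.85 and the toy link graph below are illustrative assumptions for this sketch, not Google’s actual data or implementation.

```python
def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links out to.

    Returns a dict of page -> score, where a page's score reflects
    the weight of the incoming links it receives.
    """
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # start with equal scores
    for _ in range(iterations):
        # Every page keeps a small baseline score (the damping term).
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outgoing in links.items():
            if not outgoing:
                continue
            # A page splits its current score evenly among its out-links.
            share = damping * rank[page] / len(outgoing)
            for target in outgoing:
                new_rank[target] += share
        rank = new_rank
    return rank

# Toy graph: both A and C link to B, so B accumulates the most "votes".
graph = {"A": ["B"], "B": ["C"], "C": ["B"]}
scores = pagerank(graph)
```

In this toy graph, B ends up with the highest score because it receives links from two pages, which matches the “each link adds to your PageRank” intuition; real PageRank also weights links by the importance of the linking page, which the iteration above captures as well.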

In order for your site to rank well in search results pages, it’s important to make sure that Google can crawl and index your site correctly. Our Webmaster Guidelines outline some best practices that can help you avoid common pitfalls and improve your site’s ranking.
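One concrete piece of that crawlability advice is a well-formed robots.txt at your site’s root. The snippet below is a minimal illustrative example, with placeholder paths and a placeholder domain, not a recommendation for any specific site:

```
# Illustrative robots.txt -- paths and domain are placeholders.
User-agent: *
Disallow: /admin/        # keep private sections out of the crawl
Allow: /

Sitemap: https://www.example.com/sitemap.xml
```

Pointing crawlers at a sitemap and avoiding accidental `Disallow: /` rules are among the simplest ways to ensure a site can be crawled and indexed at all.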

Google’s Did you mean and Google Autocomplete features are designed to help users save time by displaying related terms, common misspellings, and popular queries. Like our google.com search results, the keywords used by these features are automatically generated by our web crawlers and search algorithms. We display these predictions only when we think they might save the user time. If a site ranks well for a keyword, it’s because we’ve algorithmically determined that its content is more relevant to the user’s query.

updated 05/27/2013

ORGANIC SEO IS NOT DEAD

What exactly is Organic SEO? When you perform a search on Google, you’re presented with a page featuring relevant results. Typically, this includes three ads (though the number can vary based on competitiveness), followed by Google’s local results, images, and then the organic search results. These organic results aren’t paid placements like AdWords; they’re determined by relevance, occupying the prime real estate on Google’s page. The higher your ranking, the more organic clicks you’re likely to receive.

How do these organic results operate? Explaining the intricate workings of organic ranking would require a tome of over 200 pages. For brevity, let’s distill it down to some fundamental principles:

  • Relevance, Relevance, Relevance: It’s all about being the most pertinent. The page that’s most relevant to the searched keyword(s) clinches the top spot.
  • Brand vs. Generic: Brands play by different rules than generic keywords. Creating relevance involves various tactics, such as registering a domain and adding content.
  • Example: Consider a fictitious word like “chiklany.” By registering the domain www.chiklany.com and crafting content, you could swiftly secure the top rank for this term.

However, it’s not all straightforward, especially when contending with highly competitive keywords like “hotels,” where millions vie for top relevancy. While Google no longer shares public data on organic results, years of evidence suggest that users overwhelmingly prefer organic results due to their superior relevance.

But how is this relevance forged? Google’s primary aim is to furnish users with accurate results efficiently. To achieve this, they employ thousands of algorithms and over 200 filters. The result must not be spam, should offer valuable content, and aim for utmost accuracy to swiftly satisfy user queries.

Google meticulously collects data about websites, storing it in a vast database. This data, alongside algorithms, determines a site’s placement in the top 100 results. If deemed spam or illegal, you won’t rank at all. However, if your data suggests maximum relevance, you could clinch the top spot.

Google’s filters, whimsically named after animals like Panda, aim to combat content farming and distinguish genuine from fake content, showcasing Google’s commitment to relevance and quality.

The significance of Google’s 200 factors cannot be overstated.

Before the advent of Panda in 2011, Google operated with only a fraction of the filters it possesses today. These factors, totaling over 200, intricately influence the organic ranking of websites, with each factor triggering thousands of algorithms per millisecond. Here’s a snapshot of this complexity, as outlined by Google in 2013:
Many of these factors directly pertain to the website itself. This raises the question: How crucial is it for website developers to grasp the intricacies of building a site that meets the criteria of Google, as well as other search engines like Bing, Yahoo, and China’s Baidu?

Each “factor” encompasses both on-page and off-page attributes of the site. For instance:

  • Factor 1: Domain name.
  • Factor 2: Domain age.
  • Factor 3: Age of the brand.
  • Factor 4: Reputation of the brand.
  • Factor 5: Competition level for new brands.
  • Factor 6: Relevance in relation to competitors.

And the list goes on, delving deeper into the fundamental components of your platform.

RANK’s 200-Factor Research and Development

In the early 2000s, RANK pioneered a groundbreaking DOS-based application capable of simulating Google’s 200-factor algorithm of that era. Operating within a sandbox environment, the application analyzed 1000 industry-specific niche sites, akin to Google’s search results. The program, dubbed the SEONATOR, meticulously scrutinized which factors influenced the ranking of specific keywords. Upon success, the SEONATOR would pivot to another niche, repeating the process to validate the findings. This iterative approach refined the accuracy of the algorithm, culminating in a 75%+ accuracy rate by 2011. RANK emerged as a dominant force across all targeted niches.

Fast forward to 2024, and Google’s algorithms have evolved into even greater complexity. While the SEONATOR now yields up to a 55%+ accuracy rate in competitive niches, RANK’s expertise, not yet incorporated into the software, continues to achieve comparable results. By comprehending the inner workings of Google’s bots, RANK has empowered numerous businesses to thrive and prosper.