
Google Explains Why a Site Might Gradually Lose Ranking


In a recent Webmaster Hangout, Google's John Mueller explained the possible reasons why a website could gradually lose ranking.

A publisher reported a gradual month-to-month drop in search traffic and wondered what was responsible. The individual wasn't sure whether the backlinks (50 percent of which came from a single domain) or the site's auto-generated content was to blame.

However, Mueller pointed out that links and content are not always to blame. He explained that this kind of decline tends to be something more general, especially when the gradual drop persists for an extended period.

Mueller said:

“And that generally wouldn’t be a sign that there’s this one thing that you’re doing wrong, which kind of made everything blow up if you see these kinds of granular step by step changes over a longer period.”

So, what’s responsible for the search traffic decline?

Why Websites Gradually Lose Ranking

John Mueller’s answer suggested that four issues can lead to a gradual decline in search engine ranking. These include:

  1. Changes in the ecosystem
  2. Algorithm changes
  3. Changes in how users search
  4. Changes in what users expect from search results

John Mueller did not explain what “ecosystem change” means. However, he may have been implying that something outside of the site is causing the traffic decline.

Link rot is an example of such an external cause. It refers to the constant, natural disappearance of links, which happens when a site goes offline or when a site owner removes a webpage.
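As an illustration only (not something Mueller described), here is a minimal Python sketch of how a site owner might check known backlink source pages for link rot. The URLs are placeholders; in practice, the list would come from a backlink report such as a Search Console export.

    import requests

    # Hypothetical list of pages that are known to link to the site.
    backlink_sources = [
        "https://example.com/old-roundup-post",
        "https://example.org/resources-page",
    ]

    for url in backlink_sources:
        try:
            # HEAD keeps the check lightweight; some servers reject HEAD,
            # so a fuller script might fall back to GET on a 405 response.
            response = requests.head(url, allow_redirects=True, timeout=10)
            if response.status_code >= 400:
                print(f"Possible link rot ({response.status_code}): {url}")
        except requests.RequestException as exc:
            print(f"Unreachable ({type(exc).__name__}): {url}")

Pages that return errors or never respond are candidates for lost links, which could explain part of a gradual traffic decline.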

Increased competition could also change the ecosystem. Competitors that step up their promotional activities will enjoy higher search rankings, and that means someone else's site has to move down.

Algorithm changes, meanwhile, gradually improve how the search engine understands and classifies search queries and webpages. From the August 2018 core update to BERT, recent updates have focused on user intent and on whether webpages provide relevant results.

In the end, John Mueller advised the publisher to take a step back and look at the site as a whole to identify areas for improvement.

“So, that to me would be something where I try to take a step back and try to take a look at the website in general overall. And find areas where you can make significant improvements to kind of like turn the tide around a little bit and make sure that your site becomes more relevant or becomes significantly more relevant for the kinds of users you’re trying to target.”

Read More: Google Is Testing Ways to Highlight Image Licensing Information

Sumbo Bello

Sumbo Bello is a creative writer who enjoys creating data-driven content for news sites. In his spare time, he plays basketball and listens to Coldplay.
