
Google on Partial and Total Site Deindexing

Google’s Mueller discusses site deindexing issues.

Google’s John Mueller answered a question from someone whose site had been deindexed, losing all of its rankings. Mueller provided a list of the technical issues that can lead to Google degrading a site and removing it from the search results, and he discussed two kinds of deindexing: slow and fast.

The question:

It’s understandable that Mueller did not give the person a direct answer specific to their website; SEO office-hours hangouts are not the place to ask for a diagnosis of a particular site. This was the question:

“I own a site and it was ranking good before 23rd of March. I upgraded from Yoast SEO… free to premium. After that the site got deindexed from Google and we lost all our keywords.”

The person asking the question noticed that their keywords would reappear for several hours and then vanish again. They said they had checked the robots.txt file and the sitemaps, and had verified that there were no manual penalties. One thing they neglected to mention was checking for a robots noindex meta tag, so a reasonable place to begin is the Yoast plugin and its configuration.
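Since a stray noindex directive is a common culprit in cases like this, it can help to verify what the live pages actually serve. Below is a minimal sketch in Python (standard library only; the URL is a hypothetical placeholder) that scans a page’s HTML for a robots meta tag carrying a noindex directive:

```python
# Minimal sketch: fetch a page and scan its HTML for a robots noindex
# meta tag. The URL below is a hypothetical placeholder.
import urllib.request
from html.parser import HTMLParser

class RobotsMetaFinder(HTMLParser):
    """Collects the content attribute of any <meta name="robots"> tag."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and (attrs.get("name") or "").lower() == "robots":
            self.directives.append(attrs.get("content") or "")

url = "https://example.com/some-page/"  # placeholder: page to check
html = urllib.request.urlopen(url).read().decode("utf-8", errors="replace")

finder = RobotsMetaFinder()
finder.feed(html)

if any("noindex" in d.lower() for d in finder.directives):
    print("Robots noindex meta tag found:", finder.directives)
else:
    print("No robots noindex meta tag on this page.")
```

If a noindex turns up right after a plugin upgrade, the plugin’s search appearance settings are the natural first place to look.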

Many site owners have experienced situations where the free version of the Yoast SEO plugin caused pages to be set to noindex. Mueller responded to this by saying: “I don’t know… it sounds kind of tricky… I would say offhand it probably doesn’t have to do with the updating of your plugin.”

Mueller’s response:

Mueller went on to discuss the different ways that Google removes websites from its search results, including a slow deindexing scenario in which parts of a site are gradually dropped from the index because Google no longer considers them relevant.

Mueller said: “But it could very well be a technical issue somewhere. Because usually… when we reduce the indexing of a site, when we say we don’t need to have as many URLs indexed from a website, we tend to keep the… URLs that are more relevant for that site and that tends to be something that happens over… I don’t know… this longer period of time where it like slowly changes the indexing.”

Mueller then explained the possible reasons a site might experience a complete deindexing.

The takeaway:

Further, Mueller suggested using Google Search Console to help diagnose the specific problem, which could stem from a technical issue, site quality issues, spam issues, or possibly a hack: “So if you’re seeing something where like the whole site disappears from indexing, it almost sounds like something that might be related to a technical issue… something along those lines.”

You should check not only the robots.txt file but also the source code of the individual pages themselves, to be sure a rogue noindex robots meta tag isn’t blocking Google from indexing them.
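As one way to make those checks concrete, here is a minimal sketch in Python (standard library only; the site and page URLs are hypothetical placeholders) that tests whether robots.txt permits Googlebot to crawl a URL and whether the response carries a noindex in the X-Robots-Tag HTTP header, another channel through which the directive can be delivered:

```python
# Minimal sketch: two deindexing checks beyond the page source.
# Both URLs are hypothetical placeholders.
import urllib.request
import urllib.robotparser

site = "https://example.com"      # placeholder: site root
page = site + "/some-page/"       # placeholder: page to check

# 1. robots.txt: is Googlebot allowed to crawl the page at all?
rp = urllib.robotparser.RobotFileParser()
rp.set_url(site + "/robots.txt")
rp.read()
print("Googlebot allowed by robots.txt:", rp.can_fetch("Googlebot", page))

# 2. X-Robots-Tag: a noindex can also be sent as an HTTP response header.
response = urllib.request.urlopen(page)
x_robots = response.headers.get("X-Robots-Tag", "")
if "noindex" in x_robots.lower():
    print("X-Robots-Tag noindex found:", x_robots)
else:
    print("No noindex in the X-Robots-Tag header.")
```

Google honors a noindex sent via the X-Robots-Tag header just as it does the meta tag, so a clean robots.txt and clean page source alone don’t rule it out.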

As Mueller noted, a website can also be deindexed for reasons beyond an accidentally changed robots.txt file or robots meta tag: causes such as a hack or another technical issue affecting Google shouldn’t be dismissed without investigation.
