Cloaking is a web design trick in which a page appears differently depending on whether a search engine bot is crawling it or a human is viewing it.
In the past, adding the same keywords to a page multiple times could help it rank better. This was called “keyword spamming.”
It was a spammer tactic: feed search engines the exact search terms they were looking for so the page would rank.
Meanwhile, the page human users saw looked clean and normal, so it converted better because it didn’t look spammy.
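The trick described above hinges on inspecting the User-Agent header of the incoming request. A minimal sketch of how such cloaking worked, purely for illustration (this violates Google’s spam policies; all names here are hypothetical, not from any real site):

```python
# Illustrative sketch of classic user-agent cloaking. Do NOT do this:
# it is exactly the behavior Google's guidelines prohibit.

SPAM_PAGE = "buy cheap widgets buy cheap widgets buy cheap widgets"
NORMAL_PAGE = "<h1>Widgets</h1><p>A clean page for human visitors.</p>"

def serve_page(user_agent: str) -> str:
    """Return keyword-stuffed text to crawlers, the clean page to humans."""
    crawler_tokens = ("Googlebot", "Bingbot")
    if any(token in user_agent for token in crawler_tokens):
        return SPAM_PAGE    # the bot sees the keyword-spammed version
    return NORMAL_PAGE      # the human sees the normal, converting page

print(serve_page("Mozilla/5.0 (compatible; Googlebot/2.1)"))
print(serve_page("Mozilla/5.0 (Windows NT 10.0)"))
```

The point of the sketch is the asymmetry: the content served depends only on who is asking, which is what makes it cloaking.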
Google’s guidelines and cloaking:
Asked whether excluding Google from an ad-block detection script would be regarded as cloaking, Google’s John Mueller explained what cloaking is and why this setup isn’t it.
Google Search Central’s guidelines give examples of what cloaking is:
“Serving a page of HTML text to search engines, while showing a page of images to users. Inserting text or keywords into a page only when the user agent that’s requesting the page is a search engine, not a human visitor.”
What does this mean?
The question that John Mueller was asked was this:
“We have a site that is considering adding ad blocker detection to prevent users from accessing the site whenever the ad blocker is on. The question here is, if we decide to exclude Googlebot from seeing the ad block detection, will we be flagged for cloaking in that situation?”
This situation is not really about showing users different content than Google sees.
It is about establishing different tiers of site access for visitors.
Users without ad blockers get full access to the content; users with ad blockers are denied access.
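The access-tier logic described above can be sketched in a few lines. This is a hypothetical illustration, not code from the site in question; the actual detection would run client-side in JavaScript, and here its result is simply passed in as a flag:

```python
# Sketch of the access tiers described above: Googlebot is exempt from
# the ad-block gate, humans are gated on whether a blocker was detected.
# All names are hypothetical.

def access_decision(user_agent: str, ad_blocker_detected: bool) -> str:
    """Decide whether to serve the content or block the visitor."""
    if "Googlebot" in user_agent:
        return "serve"   # the crawler never runs the detection script
    if ad_blocker_detected:
        return "deny"    # visitor with an ad blocker loses access
    return "serve"       # regular visitor reads the content

print(access_decision("Mozilla/5.0 (compatible; Googlebot/2.1)", False))
print(access_decision("Mozilla/5.0 (Windows NT 10.0)", True))
```

Note that the content itself never changes per visitor; only access to it does, which is why Mueller does not treat this as cloaking.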