Google penalties are extremely unpleasant for website owners, especially those who rely on organic search for income. This guide summarizes the major reasons for algorithmic and manual Google penalties, offers recovery tips, and outlines prevention measures.
Google penalties are sanctions that Google imposes on websites for bad SEO practices. They lead to lower organic rankings and reduced online visibility, or even the total removal of a site’s pages from the search results (SERPs). Google penalties can be applied at the level of a single query (or a group of queries), a single URL (or several URLs, an entire directory, a subdomain, etc.), or sitewide.
Google penalties can be sorted into the following types:
By the scope of impact, Google penalties are split into:
Google monitors search quality with the help of algorithms as well as human quality raters. In addition, there is a tool anyone can use to report malicious content, phishing, paid links, or spammy content.
If you happen to do any black-hat or gray-hat SEO and get a penalty, it is urgent to fix the issue. Even if only a part of your pages has been affected, the overall authority of the domain will decrease, and it will struggle to rank for the rest of your queries.
However, even after you remove the content or links that led to a sanction from Google, there is no guarantee that all your rankings will be restored to their pre-penalty level. So, it is better to be safe than sorry: learn why Google penalties occur and prevent them.
Detecting manual penalties is dead simple: check the Manual actions box in Google Search Console. On a healthy website, you will see a statement saying everything is fine. To find out whether there have been any penalty notices in the past, review the messages under the bell icon in the top right corner.
If you’re out of luck, you’ll see a red penalty notice stating which issues have been detected.
As is often the case with manual actions, SEOs know who to blame. The problem can occur because of paid links, keyword stuffing, auto-generated content – anything that has been added to pages on purpose to manipulate rankings.
To detect partial or algorithmic penalties (for which no notifications are handed down), you need to observe rankings regularly. Let’s see how to set up regular tracking to observe anomalies in Rank Tracker (note that you will need a Professional or Enterprise license).
Step 1. Launch the tool and add your site’s URL to create a project.
Step 2. Add your target keywords to Rank Tracking and map the ranking URLs as landing pages.
Step 3. Turn on Recording the SERP history to record the top 30 results with each ranking check.
Step 4. Additionally, you can track the rankings of your major competitors for the target keywords. This way, you can quickly see whether the ranking impact is industry-wide or occurs only on your website.
Step 5. Add an automatic task to check rankings regularly. The tool will stay on standby and check rankings automatically on the set schedule.
Rank tracking will give you a sufficient set of data to diagnose Google penalties.
In the tracking tool, examine if there is a sharp drop in rankings for your target keywords.
If yes, compare those with your closest competitors to check if they are experiencing the same.
Also, check out the Fluctuation Graph under SERP details. When you notice high volatility in search results, most likely, Google is updating something in its algos.
If it does not look like a broad Google update, audit your site to spot technical issues that might be affecting its performance.
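If you export your tracked positions as per-keyword history, sharp drops can even be flagged automatically. Here is a minimal sketch, assuming rankings are stored as plain lists of positions per keyword; the data shapes and the 10-position threshold are illustrative, not Rank Tracker’s actual export format:

```python
def detect_rank_drops(history, threshold=10):
    """Flag keywords whose latest position dropped sharply vs. the previous check.

    history maps keyword -> list of positions, oldest first; a larger
    number means a worse rank (e.g. 101 can stand for "not in the top 100").
    Returns (keyword, previous_position, latest_position) tuples.
    """
    drops = []
    for keyword, positions in history.items():
        if len(positions) < 2:
            continue  # need at least two checks to compare
        previous, latest = positions[-2], positions[-1]
        if latest - previous >= threshold:
            drops.append((keyword, previous, latest))
    return drops

# Illustrative tracking data: one keyword fell off a cliff on the last check
tracked = {
    "buy red shoes": [3, 4, 3, 35],      # sharp drop: possible penalty signal
    "red shoes sale": [8, 7, 9, 8],      # stable
    "shoe store nyc": [12, 11, 48, 50],  # dropped earlier, latest change is small
}
print(detect_rank_drops(tracked))  # [('buy red shoes', 3, 35)]
```

Comparing the flagged keywords against your competitors’ histories (step 4 above) then tells you whether the drop is yours alone.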
Recovering from a manual penalty depends on the particular case, its complexity, and the types of fixes required. It may take from a couple of weeks to several months for a site to recover.
The penalty notice in the Console usually gives a hint about what the issues are and how they violate Google’s Search Essentials (formerly Webmaster Guidelines). However, SEO specialists need time to investigate them in detail first.
Next, they submit a request to review the sanctions in the Search Console. And it will take time for Google to consider the request, so it is better to make it simple. A good request:
Then, the Console will show a review status message to let you know whether your request has been considered and the penalty revoked.
Most requests submitted to Google are either approved or rejected. Sometimes, the message says that the request is being processed, or that the penalty remains because of other violations that have not been fixed yet.
With algorithmic penalties, things are less clear because it is sometimes hard to identify when a specific penalty is in action. You will first need to audit your site and content to find out the causes.
Google updates often cause ranking issues. When your rankings don’t bounce back shortly after an update, it is probably an algorithmic penalty being applied for a certain issue on the website.
As John Mueller said in Google SEO office hours:
“It’s generally a good idea to clean up low-quality content or spammy content that you may have created in the past. For algorithmic actions, it can take us several months to reevaluate your site again to determine that it’s no longer spammy.”
According to Google, if a site has lost rankings after a core update, this might be linked to E-E-A-T issues. That is, the website generally lacks experience, expertise, authoritativeness, or trust. And it may take months to establish these signals.
Let’s take a closer look at what bad practices may end up with a Google penalty and why you should avoid them.
Keyword stuffing is an outdated SEO tactic of cramming a page with keywords to rank it artificially. Stuffed key phrases often appear on a page unnaturally or out of context, sometimes in the form of lists, and this is a confirmed negative SEO ranking factor.
Keyword stuffing often goes together with deceptive SEO techniques, such as cloaking, content spinning, and so on, which altogether are a surefire way to sanctions.
The Panda algorithm penalizes over-optimized content: instead of ranking a stuffed page higher, it does just the opposite. If you are sure there are no black-hat techniques on your pages, carefully review your content. You can get help from Content Editor to create properly optimized pages.
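A crude way to flag suspicious pages during that review is a keyword-density check: anywhere a single phrase makes up a large share of the text deserves a closer look. A minimal sketch (the density you act on is a judgment call, not a Google-published number):

```python
import re

def keyword_density(text, phrase):
    """Share of the page's words taken up by occurrences of `phrase` (0..1)."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    target = phrase.lower().split()
    if not words:
        return 0.0
    # Count non-overlapping-agnostic phrase matches over the word sequence
    hits = sum(
        1 for i in range(len(words) - len(target) + 1)
        if words[i:i + len(target)] == target
    )
    return hits * len(target) / len(words)

stuffed = "Cheap flights, cheap flights: book cheap flights today."
print(keyword_density(stuffed, "cheap flights"))  # 0.75 -> clearly stuffed
```

Normal editorial copy rarely pushes a single phrase above a few percent of the text, so values like the one above stand out immediately.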
Thin content is content that gives little or no value to the user. Having a couple of poor-quality pages is not a problem. But if a site has many pages that are not useful for visitors, algorithms will affect their rankings.
The reasons for thin content can be different. For example, a site is hastily generated with the help of automatic tools. Another example is an e-commerce website with thin product pages that lack item descriptions. Affiliate websites or blogs may also be treated as thin content when they lack E-E-A-T signals.
First, identify pages with thin content. The easiest way is to find pages with a low word count, which may mean that the pages give little value to users.
In WebSite Auditor, in the Site Audit module, filter all Pages by Word count and examine those with the fewest words. Actually, there is no strict range for an ideal word count: it depends on the type of page and the goal it was created for.
Second, find poorly written blog posts that don’t bring any value. For instance, in Google Analytics, look for pages that bring no organic traffic or get the highest bounce rates.
Rewrite the affected pages to create an in-depth post meeting the search intent of your visitors. Follow the optimization advice from the Content Audit section.
You can also use Content Editor here to find popular related topics and frequently asked questions that add more information to your topic. Among other things, this tool shows the minimum and maximum word counts of the best-ranking pages for your target query.
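The word-count filter described above can also be approximated with a short script run over a crawl dump. A rough sketch, assuming pages are already fetched as HTML strings; the 300-word threshold is purely illustrative, since, as noted, there is no universal ideal count:

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collects visible text, skipping <script> and <style> contents."""
    def __init__(self):
        super().__init__()
        self.skip_depth = 0
        self.chunks = []
    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self.skip_depth += 1
    def handle_endtag(self, tag):
        if tag in ("script", "style") and self.skip_depth:
            self.skip_depth -= 1
    def handle_data(self, data):
        if not self.skip_depth:
            self.chunks.append(data)

def word_count(html):
    parser = TextExtractor()
    parser.feed(html)
    return len(" ".join(parser.chunks).split())

def find_thin_pages(pages, min_words=300):
    """pages maps URL -> raw HTML; returns URLs below the word threshold."""
    return [url for url, html in pages.items() if word_count(html) < min_words]

pages = {
    "/guide": "<p>" + "word " * 400 + "</p>",                     # long-form page
    "/product-17": "<p>Red shoes.</p><script>track();</script>",  # thin
}
print(find_thin_pages(pages))  # ['/product-17']
```

Treat the output as a review queue, not a verdict: a short but useful contact page is fine, while a 200-word “article” probably is not.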
The thing about affiliate websites is that sometimes they are just copies of the merchant website, adding no value for users. When the search engine finds such similar templated websites, it clusters them together and ranks only the main one out of the bunch.
If you are doing affiliate marketing, stick to the best practices:
It may come as a surprise, but there is no penalty for duplicate content as such, "at least, not in the way most people mean when they say that", says Google.
The real problem with duplicates is that they waste your site’s crawl budget. Search bots may get confused about which URL to present to the user. As a result, unnecessary pages appear in searches, visitors get frustrated and, consequently, your website’s rankings and traffic suffer.
Things are rather simple with external duplication. Google penalizes scraped or syndicated content when the whole website is a copy of the original.
“The only time we would have something like a penalty or an algorithmic action or manual action is when the whole website is purely duplicated content… if it’s one website that is scraping other websites, for example.”
Thus, to prevent duplication penalties, watch out for cases when someone creates a copy of your website somewhere on the web. Otherwise, keep your site free of duplicates by following best practices.
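Internal duplicates (for example, the same product reachable under several URLs) can be caught by fingerprinting extracted page text. A minimal sketch; note that exact-hash matching only catches identical copies, not near-duplicates, and the URLs are hypothetical:

```python
import hashlib
import re

def fingerprint(text):
    """Hash of lowercased, whitespace-normalized text."""
    normalized = re.sub(r"\s+", " ", text.strip().lower())
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

def duplicate_groups(pages):
    """pages maps URL -> extracted text; returns URL groups with identical content."""
    by_hash = {}
    for url, text in pages.items():
        by_hash.setdefault(fingerprint(text), []).append(url)
    return [urls for urls in by_hash.values() if len(urls) > 1]

pages = {
    "/shoes?color=red": "Red running shoes, size 42.",
    "/shoes/red":       "Red running  shoes, size 42. ",  # same content, new URL
    "/about":           "We are a small shoe store.",
}
print(duplicate_groups(pages))  # [['/shoes?color=red', '/shoes/red']]
```

Each group found this way is a candidate for a canonical tag or a redirect so that one URL consolidates the signals.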
For sitewide duplication prevention that will not cause a manual penalty but may impact your rankings, make sure that:
Spammy auto-generated content is content that has been programmatically generated and lacks coherence or adds little value for users. Such content is created with the sole purpose of manipulating search rankings, so it’s no wonder Google has put much effort into eliminating spam from SERPs.
Examples of spammy auto-generated content that may get penalized include:
The story of automatically generated content took an interesting twist with the arrival of GPT-3 technology and OpenAI. These tools can generate impressive texts that look very close to human-written content. But does that ensure such auto-generated content will not be treated as spam?
There are two problems here. First, it is so far doubtful that we can rely on the accuracy of AI-generated content. The question is whether website editors who resort to AI content generation will pay due attention to fact-checking (this especially concerns recent facts, emerging trends, news, or anything the AI does not “know” about yet).
And second, with AI tools, content quantity might grow exponentially. Processing it will require much more capacity, and Google will have to face that challenge.
If you have a lot of auto-generated content that has resulted in a Google penalty, there are several steps to handle it:
Usually, user spam appears in comments under blog posts, on forums, in popular social media accounts, etc. Such comments are often generated by automatic tools only to acquire SEO backlinks.
User-generated spam harms a site’s quality because it dilutes PageRank and is often irrelevant to the main content. Too many such comments sitewide may land the whole site under a Google penalty.
If you detect unnatural comments on your pages, simply delete them. But it’s easier to prevent spam than to clean it up later. Some platforms even prefer to close down comment sections if they lack the resources to moderate them.
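A first line of defense can be a simple heuristic filter that holds suspicious submissions for moderation. A toy sketch; the blocklist and link threshold below are invented for illustration, and real platforms typically combine this with services such as CAPTCHAs or dedicated anti-spam APIs:

```python
import re

BLOCKLIST = {"casino", "viagra", "payday"}  # illustrative, not exhaustive

def looks_spammy(comment, max_links=2):
    """Heuristic: too many links or blocklisted words suggest automated spam."""
    links = len(re.findall(r"https?://", comment, flags=re.IGNORECASE))
    words = set(re.findall(r"[a-z]+", comment.lower()))
    return links > max_links or bool(words & BLOCKLIST)

print(looks_spammy("Nice post! http://a.example http://b.example http://c.example"))  # True
print(looks_spammy("Thanks, the Search Console tip helped a lot."))                   # False
```

Flagged comments go to a moderation queue rather than straight to the page, which keeps spam off your URLs without blocking legitimate visitors outright.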
Yet, if you need comments on your website, there are a few best practices to prevent user-generated spam:
Doorways (sometimes called jump pages or bridge pages) are a spamdexing technique in which intermediate pages are optimized to rank for specific similar queries while being less useful than the final destination. In other words, these are doors that take up a lot of SERP real estate and funnel users to one website.
From this description, truly harmful doorways meet the following criteria:
Doorways disrupt the search experience and mislead users, so search engines try to spot and penalize them.
For intentional doorways, the only advice is: don’t do it. So, if you created doorway pages and got hit by a penalty, deindex them with noindex tags and remove them from the sitemap. Instead, create valuable content that meets searchers’ intent and apply all the best SEO practices to rank high.
Also, audit your redirects to make sure that they lead to the right destination. You can quickly get a list of all your 301/302 redirects in the Site Audit section in WebSite Auditor.
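After noindexing doorway pages, it is easy to forget to prune them from the sitemap, which sends Google mixed signals. A small sketch cross-checking the two lists (the example.com URLs are hypothetical):

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_urls(sitemap_xml):
    """Extract all <loc> entries from a sitemap document."""
    root = ET.fromstring(sitemap_xml)
    return [loc.text.strip() for loc in root.findall(".//sm:loc", SITEMAP_NS)]

def urls_to_drop(sitemap_xml, noindexed_urls):
    """Sitemap entries that are noindexed and should be removed from the file."""
    noindexed = set(noindexed_urls)
    return [url for url in sitemap_urls(sitemap_xml) if url in noindexed]

sitemap = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/doorway-red-shoes-nyc</loc></url>
</urlset>"""

print(urls_to_drop(sitemap, ["https://example.com/doorway-red-shoes-nyc"]))
```

Running a check like this after each cleanup keeps the sitemap consistent with what you actually want indexed.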
Cloaking is another black-hat SEO technique in which users and search engine bots see different content on the same page. Simply put, a page ranks for one set of keywords that is easier to rank for, but shows users something else.
An old-school cloaking method is to hide keywords or links on a page with the help of color, size, CSS styles, etc. Some more sophisticated forms of cloaking are implemented by identifying the user agent, the IP, the referrer, etc., to serve different webpage versions to a human and Googlebot.
Since cloaking violates Google’s Search Essentials, it might lead to a manual penalty. Occasionally, cloaked pages still appear in searches. How come?
Back in 2018, and again in 2023, Google reconfirmed that it is able to recognize invisible text and ignore it. According to John Mueller, if a page with hidden text ranks, it is probably for other reasons.
And experiments show that hidden text would not boost rankings for those hidden words, so it is simply useless.
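One way to catch user-agent-based cloaking on your own site is to diff what an ordinary browser receives against what a crawler’s user agent receives. A simplified sketch of just the comparison step, assuming you have already fetched both versions; it compares text nodes only and does not detect CSS-based hiding:

```python
from html.parser import HTMLParser

class VisibleText(HTMLParser):
    """Collects text nodes from an HTML document (a simplification:
    CSS is not evaluated, so styling-based hiding goes unnoticed)."""
    def __init__(self):
        super().__init__()
        self.parts = []
    def handle_data(self, data):
        if data.strip():
            self.parts.append(data.strip())

def words_of(html):
    parser = VisibleText()
    parser.feed(html)
    return set(" ".join(parser.parts).lower().split())

def crawler_only_words(user_html, bot_html):
    """Words served to the crawler but absent from the user-facing version."""
    return words_of(bot_html) - words_of(user_html)

user_version = "<h1>Our shoe store</h1><p>Welcome!</p>"
bot_version = "<h1>Our shoe store</h1><p>Welcome!</p><p>cheap pills casino bonus</p>"
print(sorted(crawler_only_words(user_version, bot_version)))
```

A non-empty result for your own pages is a strong hint that a plugin, template, or injected script is serving crawler-specific content.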
There are cases when serving a slightly different version of content is appropriate and is not considered as cloaking. For example, paywalls, in essence, show different stuff to users and search engines. Google provides flexible sampling guidelines for paywalled content and supports structured data to differentiate it from cloaking.
First of all, in Search Console, you can examine your page with the URL Inspection tool to check how Googlebot sees it.
Alternatively, use WebSite Auditor to analyze your pages and detect invisible elements. In the advanced Project Settings, you can pick the Googlebot or Googlebot-Mobile crawler to examine the contents of your pages. It may be best to ask your developer to find and delete inappropriate scripts.
For over a decade, Google has been finding ways to penalize manipulative link practices; that is how the Penguin and SpamBrain algorithms appeared. Yet, here and there, we still hear someone say that such links do work, including paid links.
Google tells us that it can differentiate and ignore bad links. Yet, link spam updates as well as the potential of earning a manual action serve as a good reminder: webmasters need to watch out for the quality of their websites' link profiles.
Here is a list of signals implying that a site might be involved in manipulative link-building practices:
You can use SEO SpyGlass to assess backlinks pointing to a website and evaluate the potential risk of getting a penalty. Besides, you can integrate other link sources, including Search Console.
The backlink checker evaluates the quality of a link and calculates the Penalty Risk score that considers factors like domain age, anchor text, the number and quality of incoming backlinks, sitewide links, Page/Domain Authority, IP diversity, etc.
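To make the idea of such a composite score concrete, here is a toy heuristic combining a few of those factors. The field names and weights are invented purely for illustration; the real Penalty Risk formula is proprietary:

```python
def backlink_risk(link, money_anchors=("buy", "cheap", "casino")):
    """Toy 0..1 risk score for one backlink; the weights are illustrative
    only and are NOT SEO SpyGlass's actual Penalty Risk formula."""
    score = 0.0
    if link["domain_age_years"] < 1:
        score += 0.3   # very young referring domains are riskier
    if any(w in link["anchor"].lower() for w in money_anchors):
        score += 0.3   # exact-match "money" anchors look manipulative
    if link["sitewide"]:
        score += 0.2   # sitewide placements often indicate paid links
    if link["domain_authority"] < 20:
        score += 0.2   # many low-authority sources add up to a spammy profile
    return round(score, 2)

suspicious = {"domain_age_years": 0.4, "anchor": "buy cheap pills",
              "sitewide": True, "domain_authority": 7}
print(backlink_risk(suspicious))  # 1.0
```

The point of any such score is triage: links near the top of the list get manually reviewed first, then removed or disavowed.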
So, to fix a penalty for suspicious links, do the following:
The search engine monitors both incoming and outgoing links. Citing reliable sources can be an additional signal of trust for people. Meanwhile, excessive outbound links (especially from the same directory to irrelevant websites) may mean that the website is selling links.
Audit your links in the All Pages > InLink Rank tab in WebSite Auditor. InLink Rank is a metric that estimates the importance of a page based on the number and quality of its links, both incoming and outgoing. URLs with a low InLink Rank may signal an issue with links on the page.
You can select each URL (especially those with the largest number of links) and examine Links from page in the lower half of the workspace. There is a quick filter to sort all External links.
If you’ve found suspicious outgoing links which are irrelevant to content, too numerous, etc., tag them as nofollow or remove them.
Mind that there is nothing wrong with having paid links as such, but in that case, you should mark them with rel="sponsored".
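A quick script can list external anchors that carry neither nofollow, sponsored, nor ugc, so you can review them by hand. A simplified sketch; the domain check is a naive substring match, and the example URLs are hypothetical:

```python
from html.parser import HTMLParser

SAFE_RELS = {"nofollow", "sponsored", "ugc"}

class LinkAudit(HTMLParser):
    """Collects external <a> hrefs whose rel attribute lacks
    nofollow, sponsored, or ugc."""
    def __init__(self, own_domain):
        super().__init__()
        self.own_domain = own_domain
        self.unmarked = []
    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href") or ""
        rels = set((attrs.get("rel") or "").lower().split())
        # Naive "external" test: absolute URL not containing our own domain
        external = href.startswith("http") and self.own_domain not in href
        if external and not rels & SAFE_RELS:
            self.unmarked.append(href)

def unmarked_external_links(html, own_domain):
    audit = LinkAudit(own_domain)
    audit.feed(html)
    return audit.unmarked

page = (
    '<a href="/about">About us</a>'
    '<a href="https://partner.example/deal" rel="sponsored">Partner</a>'
    '<a href="https://randomsite.example/buy">Buy here</a>'
)
print(unmarked_external_links(page, "mysite.example"))
```

Each URL in the output is then a manual decision: keep it followed (for a genuine editorial citation), add rel="nofollow"/"sponsored", or remove the link.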
Structured data is a powerful tool used to enhance results and claim more real estate on SERPs. Yet, mistakes in structured data, whether purposeful or accidental, may lead to manual penalties.
Google documentation states that a structured data manual action means that a page loses eligibility for appearance as a rich result. However, Google has recently clarified that a manual action involving structured data might also impact rankings.
It is not always easy to pinpoint schema mistakes: even if a structured data testing tool validates your markup, it does not mean that everything is fine.
So, the typical markup errors that may end up with penalties include the following:
Search Console shows markup errors in the Enhancements section. These errors can impact your SERP appearance and lead to losing rich features and tons of traffic, so they need a fix.
Each type of structured data has its own technical and search guidelines, so make sure to read them carefully before implementing the markup. And you can consult our Schema markup guide for more details.
If you’ve faced a penalty, review your markup and fix the issues. WebSite Auditor will help you collect a list of all pages with markup on them. Go to Site Structure > Pages and see the Open graph & structured data tab.
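A basic sanity check for missing fields can even be scripted before you reach a validator. A minimal sketch; the expected-property lists below are illustrative only, so consult Google’s documentation for the actual required and recommended fields of each type:

```python
import json

# Illustrative property lists, NOT Google's official requirements
EXPECTED = {
    "Article": ["headline", "datePublished", "author"],
    "Product": ["name"],
}

def missing_properties(jsonld_str):
    """Report expected properties absent from each item in a JSON-LD block."""
    data = json.loads(jsonld_str)
    items = data if isinstance(data, list) else [data]
    problems = {}
    for item in items:
        item_type = item.get("@type")
        gaps = [p for p in EXPECTED.get(item_type, []) if p not in item]
        if gaps:
            problems[item_type] = gaps
    return problems

snippet = '{"@context": "https://schema.org", "@type": "Article", "headline": "Google penalties guide"}'
print(missing_properties(snippet))  # {'Article': ['datePublished', 'author']}
```

A check like this run across all marked-up pages quickly surfaces templates that emit incomplete schema.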
The first penalties for intrusive ads appeared with the Page Layout algorithm update in 2012. The algorithm penalized websites for excessive static ads above the fold. Later on, the Intrusive Interstitials algorithm added popups and overlay ads to the list of no-go design practices.
The penalties are applied algorithmically once intrusive interstitials appear on a page. Generally, you will notice a dip in both impressions and clicks – and a rollback after you remove the interfering element.
First and foremost, avoid intrusive interstitials on your site. Stick to the best practices in your page design, both for desktop and mobile devices. For example, delay pop-ups and other interaction buttons until the visitor decides to leave.
Also, check your Page Experience report in Google’s Console (all your pages must be Good URLs, ideally). And use WebSite Auditor to audit all pages in bulk and ensure that each page gets the highest page Core Web Vitals scores possible.
A large chunk of News and Discover content may get penalized for misinformation and policy violations. This mostly concerns sensitive YMYL topics that require more evidence and accuracy, such as news, healthcare, finance, etc.
The only way to fix misleading content issues is to remove what caused the issue and request reconsideration.
To prevent penalties on your YMYL sites, make sure you do not publish:
There may also appear penalty notices stating that a site is on a spammy free hosting service. Even if the website itself does nothing wrong, it may get a notice saying that the free host is being abused by third-party spammers.
As a result, the website will lose visibility, rankings, and search features. However, it will still stay indexed.
Opt for secure hosting. Here, we’ve got a brief roundup of all aspects of choosing reliable hosting.
Hacked content means that some content or code has been added to a site without the knowledge of the owner because of breaches in the website’s security. As a result, the site may contain hidden links, sneaky redirects, deceptive pages, etc.
Hacked websites are used in different forms of cybercrime. The problem is that such breaches harm not only the hacked website but also its visitors.
Websites with the hacked content penalty are not delisted on Google, but they have a warning about a potential threat next to the URL in the SERP. After clicking on the URL, the user will see a full-screen warning saying that the site is potentially harmful.
Google sends a message to the site owner about the hacked content along with the URLs affected. Unlike all other manual penalties, the warning appears in the Security issues tab.
Besides, anyone can check a site's status regarding unsafe content in Google’s Transparency Report. Remarkably, the overall report shows that the number of search security warnings has drastically dropped as compared to five years ago.
If your site has got a security issue alert, first, you will need to scan the website to detect the vulnerability and eliminate it.
Next, to level up your website’s security, consider the following tips:
This list of Google penalties is not meant to be exhaustive. Rather, it shows the different kinds of issues that may cause a sanction and how to fix them. So, if you happen to get a penalty, you’ll know what to do (but I hope you won’t get any, because white-hat SEO is the best, right?)