The penalties that the Google search engine can impose on a site play a very important role in SEO. A few changes to the search algorithm, as applied to your project, can destroy any prospects for your online business. This is entirely serious: many site owners do not take the risk seriously enough until they experience it first-hand. And by then, as practice shows, it is already too late.
In this article we will mainly talk about Google, as it leads the Ukrainian search market, but almost all of the points discussed also apply to promotion in Yandex.
The main types of filters
There are two types of penalties: automatic and manual. Manual penalties are imposed by people, called assessors, who check the search results for various queries and personally review top-ranking sites for quality.
In some cases a specific resource may be deemed “unworthy” of the top and moved from, say, 5th to 15th place. It is much worse if the assessor finds convincing evidence that the site uses prohibited promotion methods or is of generally poor quality, and decides to impose a penalty. That leads to a global collapse of positions and loss of traffic, and can effectively destroy a business.
Automatic filtering occurs when Google’s algorithms determine that the site violates the webmaster quality guidelines, for example by actively buying low-quality paid links in order to climb to first place in the search results.
The main point we want to convey in this article is that it is important to do everything necessary to avoid incurring penalties and to minimize such risks in the future. Recovery can be very difficult and time-consuming.
Google has two main algorithms responsible for imposing penalties that work in automatic mode:
- Penguin, designed to fight link spam;
- Panda, which punishes duplicate and low-quality content.
These are the two main reasons that can create serious problems when promoting a site, although in fact there are a number of others that add to the risks. It is important to know them as well, and we will discuss them below.
What can Google filter for?
For ordinary people, a panda is just a cute animal; in the world of SEO it is a harsh diagnosis. It is unlikely that any search engine optimization specialist who has been through it once would want to meet this “terrible beast” again.
Low quality content
This is a fairly broad definition, since the concept of quality covers a number of points. But in almost all cases the search engine means low-quality texts with the following characteristics:
- Auto-generation. This mostly concerns online stores, whose sites have a large number of pages that need to be filled with content somehow. Someone may come up with the “brilliant” idea of simply generating text from a template, changing a few variables. Google is likely to give such efforts the assessment they deserve;
- Duplicates. This includes content that is duplicated across several pages of the same site, or simply taken from other sources without changes. The practice of using such texts should definitely be avoided.
It is also worth avoiding texts with many errors, for example documents translated with machine services such as Google Translate.
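Templated and copied pages of the kind described above can be caught in-house before a search engine does. A minimal sketch of one common approach, comparing pages by word-shingle (Jaccard) similarity; the shingle size and review threshold are illustrative assumptions, not values Google publishes:

```python
def shingles(text, k=5):
    """Split a text into overlapping k-word shingles (lowercased)."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def similarity(a, b, k=5):
    """Jaccard similarity of two texts' shingle sets, from 0.0 to 1.0."""
    sa, sb = shingles(a, k), shingles(b, k)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

# Hypothetical product descriptions generated from one template
page_a = "our store sells quality shoes for men women and children at low prices"
page_b = "our store sells quality shoes for men women and children at great prices"

# A high score flags the pair as near-duplicates worth rewriting
print(round(similarity(page_a, page_b), 2))
```

Pages whose similarity exceeds some chosen threshold (say, 0.5) are candidates for rewriting or consolidation.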
Poor behavioral factors
Search engines have not only learned to track user behavior on site pages, but also take it into account as one of the ranking factors in search results. Such signals are called behavioral ranking factors.
The main user signals, the quality of which can affect the site, include the following:
- Low CTR of snippets in the SERP. In simple terms, links to your site’s pages are rarely clicked in the search results. High positions do not guarantee that people will actually visit the site: its description may simply be unattractive and uninteresting. Google takes such signals into account, and over time this can affect the ranking;
- High bounce rate. If a user lands on your site and after a few seconds closes the tab or, even worse, returns to the search results, this counts as a bounce. You can read more about improving this indicator in the article “Bounce rate, and ways to reduce it”;
- Short time on site. If a site contains little useful information, and what there is happens to be of relatively low quality, it is no surprise that people leave quickly. Such is the logic of the search algorithms, and it is hard to disagree with it;
- Low return rate. This does not apply to all topics and types of sites, but if the content is interesting and users come back to the site again and again, Google takes this into account.
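The first of these signals, snippet CTR, is easy to monitor yourself from per-page click and impression counts (as found, for example, in a Search Console export). A small sketch; the page list and the 2% review threshold are made-up illustrations, not Google values:

```python
def ctr(clicks, impressions):
    """Click-through rate as a percentage; 0 if the page had no impressions."""
    return 100.0 * clicks / impressions if impressions else 0.0

# Hypothetical per-page stats: (page, clicks, impressions)
stats = [
    ("/shoes", 120, 1500),
    ("/boots", 4, 900),
    ("/sandals", 0, 0),
]

LOW_CTR = 2.0  # assumed threshold for flagging a snippet for rework, in percent
for page, clicks, impressions in stats:
    rate = ctr(clicks, impressions)
    flag = " <- rework title/description" if impressions and rate < LOW_CTR else ""
    print(f"{page}: {rate:.1f}%{flag}")
```

Pages that rank well but fall under the threshold are the ones whose titles and descriptions are worth rewriting first.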
Low-quality purchased links
The debate about whether or not to buy links has been going on for many years. As practice shows, the conclusion is this: links are good if they are genuinely high-quality and their growth dynamics look as natural as possible. That is why it is worth avoiding the purchase of rented links when promoting a site; otherwise the probability of incurring a penalty becomes very high.
Those who intend to ignore Google’s rules should be prepared for the fact that there will be no mercy. And do not believe the “experts” who claim that paid links cannot be distinguished from natural ones: if link building is done carelessly, it is entirely possible.
Over-optimization
Just like insufficient work on a site and its pages, excessive optimization can adversely affect search visibility, traffic and positions. Usually this concerns on-page work, in particular oversaturating texts with keywords (a density of 7–10%), as well as stuffing large numbers of keywords into the Title/Description meta tags and H1 headings. And the problem is not only how the robot perceives it: such a text is simply impossible for a person to read.
Would you like to read something like that? Then it is not surprising that search engines punish such practices. There is no need to oversaturate the text with keywords; it is enough to mention the necessary queries once each in the text, the headings, and the meta description.
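Keyword density of the kind discussed above is straightforward to measure before publishing. A minimal sketch that counts what share of a text’s words is taken up by a key phrase; the sample text is a deliberately stuffed illustration:

```python
import re

def keyword_density(text, keyword):
    """Percentage of words in `text` occupied by occurrences of `keyword`."""
    words = re.findall(r"\w+", text.lower())
    if not words:
        return 0.0
    phrase = keyword.lower().split()
    n = len(phrase)
    hits = sum(1 for i in range(len(words) - n + 1) if words[i:i + n] == phrase)
    return 100.0 * hits * n / len(words)

# A deliberately over-optimized snippet
text = ("Buy cheap shoes online. Cheap shoes are our specialty: "
        "cheap shoes, cheap shoes and more cheap shoes.")
print(round(keyword_density(text, "cheap shoes"), 1))
```

Anything approaching the 7–10% range mentioned above (let alone the figure this sample produces) is a signal to rewrite the text in normal human language.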
The process of restoring a site’s positions and traffic after falling under search engine filters can be a long and expensive undertaking. It will take considerable perseverance and patience: in the case of manual penalties you will need to submit reconsideration requests, and there is no guarantee that you will not be refused. So it is always better to avoid penalties than to deal with the consequences later.