By analogy, SEO is a tool that can be used to successfully optimize the visibility of websites. Which in itself is a wonderful thing, isn’t it? Unfortunately, not everyone uses it with good intentions. While SEO should be used to improve performance, some people use it to harm the visibility of other websites: this is known as Negative SEO, or NSEO. Its victims generally have direct competitors in the same niche who will stop at nothing to damage their visibility.
As you can imagine, “negative” SEO is the opposite of “positive” SEO.
Rather than improving a website’s performance in the SERPs, Negative SEO is a malicious attempt against a site with the aim of damaging its ranking.
The goal is to see the website lose positions, or be removed from the search results entirely following a search engine penalty. The practice is generally used to undercut the competition, and the idea behind it is quite simple: if competitors lose positions, the attacker’s own website is all the more likely to rise.
Like positive SEO, Negative SEO includes several techniques, namely:
- The creation of hundreds or thousands of spammy links to the website;
- Sending fake link-removal requests;
- Writing fake negative reviews of competitors;
- Republishing a site’s content wholesale;
- Website hacking.
These are just some of the methods used by unscrupulous SEOs, practices we would personally describe as plainly dishonest.
Why does Negative SEO exist?
We won’t try to list every reason that might motivate someone to resort to NSEO. Instead, we’ll focus on competition, which in our opinion explains most of these attacks. It is worth remembering that competition on the web is ferocious.
How to protect yourself?
Preventing a negative SEO attack outright is hardly possible: short of predicting the future, you can’t stop one before it starts. What is entirely feasible is spotting an attempt early enough to neutralize it. To do this, you need to monitor the growth of your backlink profile regularly. Majestic and Ahrefs, for example, provide growth charts for both the number of links in your profile and the number of referring domains. An unusual spike in either of these graphs is reason enough to examine the links your site has suddenly acquired.
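As a minimal illustration of this kind of monitoring, the sketch below counts newly discovered backlinks per day from a CSV export (of the kind Ahrefs or Majestic let you download) and flags days far above the average. The file name and date column are assumptions; adapt them to your tool’s actual export format.

```python
# Flag days on which the number of newly discovered backlinks is far
# above the recent average. Assumes a CSV export of new backlinks with
# a "first_seen" date column; file and column names are hypothetical.
import csv
from collections import Counter
from datetime import datetime

def daily_new_links(path, date_column="first_seen"):
    counts = Counter()
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            day = datetime.fromisoformat(row[date_column]).date()
            counts[day] += 1
    return counts

def flag_spikes(counts, factor=5):
    """Report days with more than `factor` times the average daily volume."""
    if not counts:
        return []
    average = sum(counts.values()) / len(counts)
    return [(day, n) for day, n in sorted(counts.items()) if n > factor * average]

if __name__ == "__main__":
    for day, n in flag_spikes(daily_new_links("new_backlinks.csv")):
        print(f"{day}: {n} new links, worth a manual review")
```

A flagged day is not proof of an attack; it is simply a prompt to look at those new links by hand.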
Scrapers
Another negative off-page SEO technique is based on generating duplicate content. It involves scraping the content of your site, which is then copied to other websites; in the worst cases, those sites are part of the link farms discussed above. Web scraping is simply the automated extraction of data from a website by a piece of software.
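To make the term concrete, here is roughly what a scraper does, as a minimal sketch using the third-party requests and beautifulsoup4 packages (the URL and selector are hypothetical):

```python
# Fetch a page and extract its article text, as a content scraper would.
import requests
from bs4 import BeautifulSoup

response = requests.get("https://example.com/blog/some-article", timeout=10)
response.raise_for_status()

soup = BeautifulSoup(response.text, "html.parser")
article = soup.select_one("article")  # the main article element, if any
if article is not None:
    text = article.get_text(separator="\n", strip=True)
    print(text[:500])  # a scraper would republish this elsewhere
```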
How to protect yourself?
First of all, try not to publish complete feeds (RSS): include only an excerpt of each article, perhaps the opening paragraph. This way, RSS scrapers will only receive a small portion of your content. And if you manage to insert a link to your site in that first paragraph, a careless scraper will copy it verbatim, giving Google one more clue about the true authorship of the content.
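Here is a minimal sketch of the idea using Python’s standard xml.etree module: the feed item carries only the opening paragraph plus an explicit link back to the original, so even a verbatim copy keeps an attribution link. The title and URLs are illustrative; in WordPress, publishing excerpts instead of full text is a built-in Reading setting.

```python
# Build an RSS item whose description is just the excerpt plus a link
# back to the original article. Values here are illustrative.
import xml.etree.ElementTree as ET

def feed_item(title, url, first_paragraph):
    item = ET.Element("item")
    ET.SubElement(item, "title").text = title
    ET.SubElement(item, "link").text = url
    # Excerpt only, with a self-referencing link a lazy scraper will copy.
    ET.SubElement(item, "description").text = (
        f'{first_paragraph} <a href="{url}">Read the full article at the source.</a>'
    )
    return item

item = feed_item(
    "Negative SEO explained",
    "https://example.com/negative-seo",
    "Negative SEO is a malicious attempt to damage a site's ranking...",
)
print(ET.tostring(item, encoding="unicode"))
```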
If you really can’t truncate the feed, you will have to monitor for copies constantly. There are some excellent tools designed for exactly this purpose.
Fake reviews
In Local SEO, reviews mean a lot. An influx of negative reviews damages local visibility and hurts your business in general. Reviews are easy to manipulate, and they may be the first thing a competitor uses to try to hamper your business. The spammer probably doesn’t realize that, in doing so, they are risking a criminal conviction.
How to protect yourself?
You should always keep tabs on your Google My Business listing and watch for new reviews of your company. Fake reviews may violate Google’s guidelines, which state that you should never “publish reviews on behalf of third parties or misrepresent your identity or connection with the place you are reviewing”.
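If you’d rather not check by hand, your listing’s reviews can also be polled programmatically. The sketch below uses the Google Places API Place Details endpoint, which returns a handful of a place’s reviews per request; the API key and place ID are placeholders, and you would schedule something like this to run daily:

```python
# Poll the Google Places API "Place Details" endpoint for reviews of
# your own listing and flag low ratings. Key and place ID are placeholders.
import requests

API_KEY = "YOUR_API_KEY"
PLACE_ID = "YOUR_PLACE_ID"  # your Google Business Profile place ID

def fetch_reviews():
    response = requests.get(
        "https://maps.googleapis.com/maps/api/place/details/json",
        params={"place_id": PLACE_ID, "fields": "reviews,rating", "key": API_KEY},
        timeout=10,
    )
    response.raise_for_status()
    return response.json().get("result", {}).get("reviews", [])

for review in fetch_reviews():
    if review.get("rating", 5) <= 2:  # a new one-star or two-star review
        print(f"Low rating from {review.get('author_name')}: "
              f"{review.get('text', '')[:120]}")
```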
Invasive crawling
When nothing else works, a desperate competitor can try to crash your site by hammering it with a forced crawl. Excessive load can bring the web server down, generating the dreaded 5xx errors. If your web server can’t handle many concurrent requests, even a simple multi-threaded crawl with Screaming Frog can bring it to its knees.
How to protect yourself?
If you notice that your site is getting slower, or worse, freezing altogether, the best thing you can do is analyze the web server log. Alternatively, you can contact the webmaster or hosting company, who can tell you where the load is coming from. If necessary, use robots.txt and .htaccess to shut the door on the most invasive spiders and scrapers. Also check Google Search Console for crawl errors every week, watching in particular for new errors with status code 500.
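A quick way to identify the culprit in a raw access log is to count requests per client IP and user agent. Here is a minimal sketch, assuming the common Apache/Nginx “combined” log format (the log path is a placeholder):

```python
# Count requests per (IP, user agent) in a "combined"-format access
# log to spot an aggressive crawler. The log path is hypothetical.
import re
from collections import Counter

LINE = re.compile(r'^(\S+) \S+ \S+ \[[^\]]+\] "[^"]*" \d{3} \S+ "[^"]*" "([^"]*)"')

hits = Counter()
with open("/var/log/apache2/access.log", encoding="utf-8", errors="replace") as log:
    for line in log:
        match = LINE.match(line)
        if match:
            hits[match.groups()] += 1  # key is the (ip, user_agent) pair

for (ip, user_agent), count in hits.most_common(10):
    print(f"{count:>7}  {ip}  {user_agent}")
```

Once the offending user agent is known, a polite crawler such as Screaming Frog can be kept out via robots.txt, and a stubborn one can be refused at the Apache level. The user-agent strings below are examples; match them against what your log actually shows:

```
# robots.txt, honoured by well-behaved crawlers:
User-agent: Screaming Frog SEO Spider
Disallow: /
```

```apache
# .htaccess: refuse a matching user agent outright (Apache mod_rewrite)
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} "Screaming Frog" [NC]
RewriteRule .* - [F,L]
```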
Fraudulent clicks
Clicks in the SERPs are a controversial SEO signal; not everyone believes they can be a ranking factor. Even so, experiments have shown that an unusually high click-through rate (CTR) on a given search result can improve its ranking, while a low CTR has the opposite effect. The attacker’s fake-click behaviour is meant to signal to Google that the competitor’s page is not relevant. In some cases the technique has worked temporarily: Google downgraded the result until the artificial searches ended.
How to protect yourself?
There is little you can do to defend against this attack, but for it to take effect it has to be sustained over time: the spammer must invest days and resources, and the effects disappear once the automated queries stop. What you can do is carefully monitor the CTR of your main keywords in Google Search Console, where you’ll find both your site’s overall CTR across all keywords and the click-through rate of each individual keyword.
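This monitoring can also be automated through the Search Console API. Below is a minimal sketch using google-api-python-client and a service account that has been granted access to the property; the credentials file, dates, and site URL are placeholders:

```python
# Pull per-query CTR from the Search Console API for a date range.
# Requires the google-api-python-client and google-auth packages, and a
# service account added as a user on the Search Console property.
from google.oauth2 import service_account
from googleapiclient.discovery import build

SITE_URL = "https://example.com/"  # your verified property (placeholder)

credentials = service_account.Credentials.from_service_account_file(
    "service-account.json",  # placeholder path
    scopes=["https://www.googleapis.com/auth/webmasters.readonly"],
)
service = build("searchconsole", "v1", credentials=credentials)

response = service.searchanalytics().query(
    siteUrl=SITE_URL,
    body={
        "startDate": "2024-01-01",
        "endDate": "2024-01-31",
        "dimensions": ["query"],
        "rowLimit": 25,
    },
).execute()

# A sudden CTR drop on a keyword that used to perform well is the
# signal to investigate.
for row in response.get("rows", []):
    print(f"{row['keys'][0]}: CTR {row['ctr']:.1%}, "
          f"{row['clicks']} clicks, position {row['position']:.1f}")
```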