Google sanctions and filters imposed on sites
Google's main crawler has the user agent Googlebot; it is the primary robot that scans page content for the search index. Alongside it there are several specialized robots:
- Googlebot-Mobile - a robot that indexes websites for mobile devices
- gsa-crawler - the search robot of the Google Search Appliance hardware and software system
- Googlebot-Image - a robot that scans pages for the image index
- Mediapartners-Google - a robot that scans page content to determine relevant AdSense ads
- Adsbot-Google - a robot that scans content to evaluate the quality of AdWords landing pages
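Because each of these robots identifies itself with its own user agent token, you can address them individually in robots.txt. A minimal illustrative sketch (the /private/ path is just a placeholder, not something from this article):

```
# Keep the AdSense crawler out of a hypothetical private section,
# while letting the main Googlebot crawl the whole site.
User-agent: Mediapartners-Google
Disallow: /private/

User-agent: Googlebot
Disallow:
```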
Although I have experience in website promotion and know a lot not just in theory but in practice, I have no first-hand experience with search engine sanctions, simply because I have never received any. So, to refresh and structure all this information for myself and for you, friends, I decided to write up the most common types of sanctions, both manual and automatic, from the Google search engine.
There are two types of sanctions: manual ones, where Google's specially assigned employees review sites and apply penalties by hand, and automatic ones, where the system itself imposes sanctions through its algorithms and lifts them once the site's problems are corrected.
Now let's talk in more detail about each type of sanction and how to avoid or prevent them on your site.
Google manual sanctions
As I mentioned above, these are sanctions imposed manually by specially assigned Google staff. How are sites selected for review? It may be competitors' complaints about low-quality promotion, and it may also be sites the algorithms flagged as dishonestly promoted but with doubt, roughly 50/50 on whether a ban is deserved or it is just an algorithmic coincidence; in that situation a human reviewer decides whether the site gets sanctions or not.
Paradoxical as it may sound, manual sanctions have one advantage over automatic ones: you are notified about the imposed sanctions in the webmaster panel. You can see it by going to the Search Traffic section and opening "Manual Actions".
If you see a penalty message there, take corrective action immediately. What exactly to do depends on each site's individual case. My goal is to describe the symptoms; your goal is to deal with the realities if they suddenly appear.
Sandbox Google filter
Every second newly created site falls under this filter. Let's talk about its essence and how it is applied. According to Google, every young website should go through a maturation stage: smooth development, building stable behavioral factors, a steadily growing natural link mass, and so on. There are no exact terms for the sandbox, but the filter is believed to last from 4 to 6 months.
How do you avoid this filter, and how do you develop the site properly so the sandbox causes no problems? Strictly speaking, every site passes through the sandbox; only how long the search engine holds your particular site back depends on you.
The term depends solely on the quality of the resource itself and, most importantly, on the queries it is promoted for. The more competitive the query, the longer the site will have to sit in the sandbox. As a rule, this concerns commercial resources.
However, middle-aged sites that for one reason or another have lost Google's trust may also end up in the sandbox. This can happen because of non-unique content, because of many direct-anchor, off-topic links (both "natural-looking" and purchased), and because of links to questionable resources.
The most effective way to get through the sandbox faster is to optimize the site exclusively for mid- and low-frequency queries for the first 2-4 months, and only then take on high-frequency ones. But of course this is my own opinion and experience; what you do is up to you.
Penguin Google filter
I have to say that in addition to the automatic mode, this filter is sometimes applied manually. The Penguin filter's job is to combat a site's low-quality, I would say unnatural, link profile. You could write a whole book about this filter, covering case studies of escaping Penguin and how to build link mass in today's realities. I will analyze only a couple of nuances that will help you better understand this cunning little animal from Google:
- Avoid a large number of links with an exact-match promoted keyword in the anchor
- Do not place links just anywhere, and be especially wary of temporary (rented) links
- Beware of cross-links from footers and site headers
- Do not point links only at promoted pages; distribute them throughout the site
- Grow the link mass smoothly and steadily, without spikes
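The first point above is easy to sanity-check yourself. Here is a minimal sketch in Python that measures each anchor's share of your link profile; the 30% threshold is my own illustrative assumption, not an official Google number:

```python
from collections import Counter

def anchor_distribution(anchors, max_share=0.3):
    """Return each anchor's share of the link profile, plus a list of
    anchors whose share exceeds max_share (an assumed risk threshold)."""
    counts = Counter(a.strip().lower() for a in anchors)
    total = sum(counts.values())
    shares = {anchor: n / total for anchor, n in counts.items()}
    risky = [anchor for anchor, share in shares.items() if share > max_share]
    return shares, risky
```

For example, a profile of ten links where eight anchors are the same exact-match keyword would flag that keyword as risky, while a mix of brand names, bare URLs, and "here"-style anchors would pass.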
Panda Google filter
This filter was launched in 2011, as spamming of page content could no longer be allowed to continue on that scale. Here is what this filter punishes sites for:
- Non-unique (stolen) content
- Keyword spam in the text
- Duplicate pages on the site
- Off-topic advertising on the site's pages
- Excessive repetition of the keyword across the page (title, h1, h2, alt, strong, em)
It is believed that broken links on the site, i.e. 404 errors, can also provoke Panda. To avoid this filter, all that is required of you is a high-quality, comprehensive audit of your site. If you lack the experience or understanding to do this, contact specialists who do.
Supplemental Results Google filter
The principle of this filter is to find pages of your site and exclude them from Google's main results. What can it be connected with?
- The page does not answer or fully cover the query for which it is promoted or was written
- All the signs listed for the Panda filter
- Frequent moving of the page between sections, or replacing its URL with another one
How do you find out whether your site's pages are in the supplemental results or not? It is very easy to check: enter this into the search engine: site:www.site.com/& (just do not forget to replace site.com with your own domain ;-). Pages that appear for a plain site:www.site.com query but not for this one are the ones sitting in the supplemental index.
Bombing Google filter
This filter is imposed on a site for using a large number of identical anchor texts while building link mass. Even if you buy 100 links from a hundred different domains, if the anchor is the same keyword in 80% of them, you are very likely to catch this filter. Such links are no longer counted; their weight is zeroed out, and your website simply does not rise in the results.
To avoid the risk of falling under this filter, I recommend you buy link mass wisely and keep an eye on your anchor list.
Florida Google filter
This filter punishes sites for over-optimization. One could say Panda is a continuation of this filter, since Florida itself was launched back in 2003 and by 2010 could no longer cope with low-quality content.
Mostly the punishment followed keyword spam in title and h1-h6 tags, which once again emphasizes how important it is to do internal site optimization properly. No one says articles should not be optimized; just do it without fanaticism, and think first of all about your readers and their convenience.
Broken Links Google filter
This filter is imposed for a large number of broken (404) links and their active growth. I had a case where a site under development changed its address and a link in the footer began returning 404; as a result, every new page created yet another broken link on my site. Fortunately, a technical audit helped find the cause and quickly eliminate it.
To avoid a similar situation, periodically check the webmaster panel, where messages about broken links appear, or crawl the site with the free Xenu program, which will show the broken links on your site and more.
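Besides the webmaster panel and Xenu, a basic broken-link check is easy to script. A sketch using only the Python standard library (the "link-audit" user agent string is just an example I made up; note that some servers answer HEAD requests with 405, in which case you would fall back to GET):

```python
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import Request, urlopen
from urllib.error import HTTPError, URLError

class LinkExtractor(HTMLParser):
    """Collect absolute URLs from every <a href=...> on a page."""
    def __init__(self, base):
        super().__init__()
        self.base = base
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(urljoin(self.base, href))

def extract_links(html, base):
    parser = LinkExtractor(base)
    parser.feed(html)
    return parser.links

def check_link(url, timeout=5):
    """Return the HTTP status code for url, or None on a network error."""
    try:
        req = Request(url, method="HEAD",
                      headers={"User-Agent": "link-audit"})
        return urlopen(req, timeout=timeout).status
    except HTTPError as e:
        return e.code  # e.g. 404 for a broken link
    except URLError:
        return None
```

Feed each page's HTML through extract_links, then run check_link on every URL and flag anything that returns 404 (or None).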
Links Google filter
This filter targets too many outbound links from your site's pages. When the number of outgoing links exceeds the number of incoming ones, it marks the site as selling links, or as not very useful to people given the lack of incoming link mass.
The way out of the "Links" filter is very simple. If a page has more than 25 outbound links, remove those links or hide such pages from indexing. As for incoming links being fewer than outgoing ones, you can check this indicator with the RDS browser plugin or other similar checkers.
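The 25-link threshold is easy to audit page by page. A stdlib-only sketch that treats any link whose host differs from your own domain as outbound (the threshold and domain names below are illustrative):

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class OutboundCounter(HTMLParser):
    """Count <a href=...> links pointing to a host other than our own."""
    def __init__(self, own_host):
        super().__init__()
        self.own_host = own_host
        self.outbound = 0

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href", "")
        host = urlparse(href).netloc
        # Relative links have an empty host and are internal by definition.
        if host and host != self.own_host:
            self.outbound += 1

def count_outbound(html, own_host):
    counter = OutboundCounter(own_host)
    counter.feed(html)
    return counter.outbound
```

Run count_outbound over each page's HTML and review any page where the result climbs past 25.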
Page Load Time Google filter
The name speaks for itself: this filter hits sites whose server responds very slowly when the robot requests a page. As a result your site is pushed down in the results in favor of nimbler sites. The webmaster panel now includes a service for checking site loading speed, with recommendations for improving it; do not neglect it. Here is the link to that service: PageSpeed Insights.
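Before turning to PageSpeed Insights, you can get a rough response-time number yourself. A minimal sketch with a generic timer; this is a single sample, so a real audit should average several runs:

```python
import time
from urllib.request import urlopen

def time_call_ms(fn):
    """Time a zero-argument callable and return elapsed milliseconds."""
    start = time.perf_counter()
    fn()
    return (time.perf_counter() - start) * 1000.0

# Usage example (needs network access; the URL is a placeholder):
# ttfb = time_call_ms(lambda: urlopen("https://example.com", timeout=10).read(1))
# print(f"time to first byte: {ttfb:.0f} ms")
```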
These are not all the filters that exist in the Google search engine, but I chose the most common ones and those that most site owners forget about or do not know. To avoid getting hit by a filter, you need to know what it can be applied for; ignorance means moving forward blind, and it can backfire!
I also did not write about things like checking the validity of your layout, site adaptability, microdata markup, and so on, which also directly or indirectly affect a site's position in the results. I hope you have gathered useful information for yourself and will be more careful when promoting your own or others' resources. Happy promoting, and here is a video on filters to consolidate your knowledge )))
Via sozdaj-sam.com & wiki