Google sanctions and filters imposed on sites
Googlebot is Google's main crawler, with the User-Agent string "Googlebot"; it scans page content for the search index. In addition, there are several specialized robots:
- Googlebot-Mobile - a robot that indexes sites for mobile devices
- gsa-crawler - the search robot of the Google Search Appliance hardware-software complex
- Googlebot-Image - a robot that scans pages for the image index
- Mediapartners-Google - a robot that scans page content to select relevant AdSense ads
- AdsBot-Google - a robot that crawls content to evaluate the quality of AdWords landing pages
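The user-agent names above are exactly what you target in robots.txt if you want to give each crawler its own rules. A minimal sketch (the paths here are placeholders, not recommendations):

```text
# Keep AdsBot-Google out of a hypothetical test section.
# Note: AdsBot ignores the "*" group, so it must be named explicitly.
User-agent: AdsBot-Google
Disallow: /landing-test/

# Keep a hypothetical private folder out of the image index
User-agent: Googlebot-Image
Disallow: /private-images/

# Default group for everything else, including the main Googlebot
User-agent: *
Disallow:
```

Each crawler obeys only the most specific group that matches its name, so the two named groups above override the default for those bots.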
Although I have practical, not merely theoretical, experience in promoting sites, I have no first-hand experience with search engine sanctions, simply because I have never received any. So, to refresh and structure all this information for myself and for you, friends, I decided to describe the most common types of Google sanctions, both manual and automatic.
There are two types of sanctions: manual, when Google's own employees review sites and impose penalties, and automatic, when the system itself imposes sanctions using its algorithms and also lifts them once the site's problems are fixed.
Now let's talk in more detail about each type of sanction and how to avoid or prevent them on our sites.
Manual Google Sanctions
As I mentioned above, these are sanctions imposed manually by Google employees. Sites can be selected for review in several ways: from competitors' complaints about low-quality promotion, or when the algorithms flag a site as unfairly promoted but without certainty, a 50/50 case where it is unclear whether to ban the site or whether it is just an algorithmic coincidence. In such situations, a person makes the final decision about the site.
Paradoxical as it may sound, manual sanctions have one advantage over automatic ones: you receive a notification about the imposed sanction in the webmaster panel. To view it, open the "Search Traffic" section and then "Manual Actions".
If you see a penalty report there, immediately take measures to eliminate the cause. Exactly how depends on the individual case of each site. My goal is to tell you about the symptoms; yours is to fight the realities, if they suddenly appear.
Sandbox filter from Google
Every other newly created site falls under this filter. But let's talk about its essence and how it is applied. According to Google, every young site must go through a growth stage, which includes smooth development, the buildup of stable behavioral factors, a steadily and naturally growing link mass, and so on. There are no exact terms for the sandbox, but the common opinion is that the filter lasts from 4 to 6 months.
How do you avoid this filter and develop the site properly so as not to have problems with the sandbox? Strictly speaking, every site passes through the sandbox; only how long the search engine holds your site back depends entirely on you.
That term depends solely on the quality of the resource itself and, most importantly, on the queries it is promoted for. The more competitive the query, the longer the site will have to sit in the "sandbox". Typically, this applies to commercial resources.
However, older sites can also end up in the "sandbox" if, for one reason or another, they have lost Google's trust. This can happen because of non-unique content, because of many off-topic direct links, both natural and purchased, or because of links to questionable resources.
The most effective way to get out of the sandbox faster is to optimize the site exclusively for mid- and low-frequency queries for the first 2-4 months, and only then take on high-frequency ones. But of course this is my own opinion and experience; how to proceed is up to you.
Penguin Google filter
I will say right away that, in addition to automatic mode, this filter is sometimes applied manually. The job of the Penguin filter is to combat a site's low-quality, unnatural link profile. You could write a whole book about this filter, covering cases of getting out from under Penguin and how to build link mass correctly in today's realities. I will only analyze a few nuances that will help you better understand this insidious little creature from Google.
- Avoid a large number of links whose anchor is an exact-match entry of the promoted keyword
- Do not post links just anywhere, and be especially wary of temporary (rented) links
- Beware of site-wide links from footers and headers
- Do not point links only at promoted pages; distribute them throughout the site
- Build up the link mass smoothly and steadily, without sudden jumps
Panda Google filter
This filter was launched in 2011, when content spam on pages could no longer be allowed to continue at such a scale. What is this filter applied to sites for:
- Non-unique (stolen) content
- Overuse of keywords in the text
- Duplicate pages on the site
- Off-topic ads on the site's pages
- Excessive repetition of a keyword throughout the page (title, h1, h2, alt, strong, em)
There is an opinion that broken links on the site, those same 404s, can also trigger Panda. To avoid this filter, all that is required of you is a thorough, comprehensive audit of your site. If you lack the experience or understanding to do this, turn to specialists who have it.
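One of the Panda triggers above, excessive keyword repetition, is easy to spot-check yourself. Here is a rough sketch of a keyword-density counter; it is an illustration, not Panda's actual algorithm, and the sample text is invented:

```python
import re


def keyword_density(text: str, keyword: str) -> float:
    """Return the share of words in `text` equal to `keyword` (case-insensitive)."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)


# Hypothetical over-optimized page copy
page_text = (
    "Buy cheap widgets. Our cheap widgets are the best cheap widgets. "
    "Cheap widgets shipped fast."
)
density = keyword_density(page_text, "cheap")
print(f"'cheap' density: {density:.0%}")  # about 27% (4 of 15 words)
```

There is no official safe threshold, but a single keyword making up a quarter of the visible text, as here, is exactly the kind of page Panda was built to catch.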
Supplemental Results Google filter
The principle of this filter is to find pages of your site and exclude them from Google's main results. What can this be connected with?
- The page does not answer or disclose the query it is promoted or written for
- Any of the Panda filter's triggers listed above
- Frequent moving of the page between categories, or changing its URL
How do you know whether your site's pages are in the supplemental results? It's very easy to check: enter this in the search engine: site:www.site.com/& (just do not forget to replace the site with your own domain ... ;-)
Bombing Google filter
This type of filter is applied to a site for using a large number of identical anchor texts while building up link mass. Even if you buy 100 links from one hundred different domains, if 80% of the anchors are the same keyword, you are likely to get this filter. Such links are no longer taken into account, their weight is reset to zero, and your site is not raised in the results for that query.
To avoid the risk of falling under this filter, I recommend buying link mass sensibly and monitoring your anchor list.
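Monitoring the anchor list can be as simple as computing each anchor's share of the backlink profile. A minimal sketch, assuming you have exported the anchor texts from a link-checking tool (the data and the 30% warning threshold are my own illustration, not a documented Google limit):

```python
from collections import Counter


def anchor_shares(anchors: list[str]) -> dict[str, float]:
    """Return each anchor text's share of the total backlink profile."""
    counts = Counter(a.strip().lower() for a in anchors)
    total = sum(counts.values())
    return {anchor: n / total for anchor, n in counts.most_common()}


# Hypothetical backlink anchors: 80% exact-match, the rest varied
anchors = ["buy widgets"] * 80 + ["example.com"] * 12 + ["this article"] * 8
for anchor, share in anchor_shares(anchors).items():
    flag = "  <-- risky concentration" if share > 0.30 else ""
    print(f"{anchor:12s} {share:5.0%}{flag}")
```

A profile like this, with one key at 80%, is precisely the situation the article warns about: diluting it with branded and generic anchors should come before buying any new links.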
Florida Google filter
Sites get punished with this filter for over-optimization. You could say that Panda is the continuation of this filter, since the Florida filter itself was launched in 2003 and by 2010 could no longer cope with low-quality content.
Mostly, the punishment followed keyword spam in the title and h1-h6 tags, which once again emphasizes that internal site optimization must be done sensibly. No one says you should not optimize your articles; just do it without fanaticism. And think, first of all, about your readers and their convenience.
Broken Links Google Filter
The filter is applied for a large number of broken (404) links and their active growth. I had a case when a site under development changed its address and the footer links began returning a 404 response; as a result, every newly created page produced yet another broken link on my site. A good technical audit helped find the cause and quickly remove it.
So that you do not end up in such a situation, periodically check the webmaster panel, which has reports on broken links, or run the free Xenu program, which will point out the broken links on your site, and more.
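Between full Xenu runs, a small script can do the same kind of check on a single page. This sketch extracts links with the standard library and takes the status lookup as a pluggable function, so the demo runs on stubbed data; in real use you would pass a function that issues an HTTP request (all names and sample HTML here are invented):

```python
from html.parser import HTMLParser
from typing import Callable


class LinkCollector(HTMLParser):
    """Collect href values from <a> tags on a page."""

    def __init__(self):
        super().__init__()
        self.links: list[str] = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def find_broken(html: str, get_status: Callable[[str], int]) -> list[str]:
    """Return links whose HTTP status is 400 or above.

    `get_status` is pluggable: in real use it could issue a request with
    urllib; here a dict lookup lets us demo without touching the network.
    """
    parser = LinkCollector()
    parser.feed(html)
    return [url for url in parser.links if get_status(url) >= 400]


# Demo with a stubbed status lookup instead of live requests
page = '<a href="/about">About</a> <a href="/old-footer">Old</a>'
statuses = {"/about": 200, "/old-footer": 404}
print(find_broken(page, statuses.get))  # ['/old-footer']
```

Run against every page of the site, this catches exactly the footer-404 situation described above, where each new page silently adds another broken link.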
Links Google filter
This filter targets too many outbound links from the pages of your site, that is, when the number of outgoing links exceeds the number of incoming ones, which suggests the site sells links or is of low quality for people, given the lack of incoming link mass.
The ways out from under the "Links" filter are very simple. If a page has more than 25 outbound links, you just need to remove them or hide those pages from indexing. Also keep the number of outbound links below the number of incoming ones; this indicator can be viewed in the RDS browser plug-in or other similar checkers.
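Counting a page's outbound links is straightforward to automate. A sketch with the standard library, assuming an outbound link is any href whose host differs from your own (the sample page and host names are invented, and the 25-link threshold is the article's rule of thumb, not an official limit):

```python
from html.parser import HTMLParser
from urllib.parse import urlparse


class OutboundCounter(HTMLParser):
    """Count links pointing to hosts other than `own_host`."""

    def __init__(self, own_host: str):
        super().__init__()
        self.own_host = own_host
        self.outbound = 0

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href") or ""
        host = urlparse(href).netloc
        if host and host != self.own_host:
            self.outbound += 1


page = (
    '<a href="/internal">in</a>'
    '<a href="https://partner.example/offer">out</a>'
    '<a href="https://ads.example/banner">out</a>'
)
counter = OutboundCounter("www.mysite.com")
counter.feed(page)
print(counter.outbound)  # 2
if counter.outbound > 25:  # the article's suggested threshold
    print("Consider trimming links or noindexing this page")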
Page Load Time Google filter
The name speaks for itself: the filter is applied to a site whose server gives a very slow response when the robot requests pages for scanning. This leads to your site being lowered in the results in favor of nimbler sites. The webmaster panel now includes a service for checking site load speed, with recommendations for improving it; do not neglect it. Here is a link to the PageSpeed Insights service.
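Alongside PageSpeed Insights, you can track the raw server response time yourself. A minimal sketch that times any fetch callable; the demo stands in a short sleep for a real request, and the one-second threshold is purely illustrative, not an official Google limit:

```python
import time
from typing import Callable


def measure_seconds(fetch: Callable[[], object]) -> float:
    """Time a single fetch. In real use `fetch` could be, e.g.,
    lambda: urllib.request.urlopen("https://www.mysite.com")."""
    start = time.perf_counter()
    fetch()
    return time.perf_counter() - start


# Demo with a stand-in fetch that just sleeps briefly
elapsed = measure_seconds(lambda: time.sleep(0.05))
print(f"server responded in {elapsed:.2f}s")
if elapsed > 1.0:  # illustrative threshold only
    print("worth investigating hosting or caching")
```

Logging this number periodically makes it easy to notice the slow server responses this filter punishes before your positions drop.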
These are not all the filters that exist in the Google search engine, but I chose the most common ones and those that most site owners forget about or do not know. Remember: to avoid falling under a filter, you need to know what it can be imposed for. Ignorance is walking in blind, and it may cost you!
I also did not write about things such as markup validation, site adaptability, micro-markup and so on, which also directly or indirectly affect a site's position in the results. I hope you will find useful information here and be more careful when promoting your own or others' resources. Successful promotion, and here is a video on the topic of filters to consolidate your knowledge)))
Via sozdaj-sam.com & wiki