Deceiving search engines, or how to increase your site's relevance
Finding the right information on the Internet gets harder every day. The reason is the explosive growth in the number of sites and the even faster growth of information junk and advertising. How often have you typed a query into your favorite search engine looking for your favorite artist's latest song or the best xxx site, only to find that the results have nothing to do with the query? Instead of a new song you are invited to join a financial pyramid, and instead of the best xxx site you are offered girls at rock-bottom prices with home delivery. All of this is the result of deceiving search robots. This article describes the tricks used to cheat search engines and to raise a site's ranking.
Here is how a typical search engine works. A program module (a spider) crawls from link to link, reads the contents of each page, and for every word makes an entry in an index file. For the word "freebie", for example, it will create roughly this entry: "freebie — 1". In a separate file that stores links, it will record "1 — URL of the page"; here 1 is the number that ties the record in the index file (table) to the record in the link file. The spider then crawls to another page and again stumbles on the word "freebie". Now the index entry becomes "freebie — 1 2", and the link table gains the record "2 — URL of the page". When a user types "freebie" into the search box, the engine looks in the index file, finds the line for "freebie", reads the numbers 1 and 2, resolves them to addresses in the link table, and returns those addresses to the user. This is the basic principle behind search engines, and it is called indexing. What, then, determines the position of a site in the results? Relevance, i.e. how well the document matches the user's query. And what does relevance depend on? The exact scoring algorithms differ between engines and are kept strictly secret, but the main parameters are:
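The scheme described above can be sketched in a few lines of Python. This is a toy illustration of the idea, not any real engine's code; all names and the example URLs are invented:

```python
# Toy sketch of the indexing scheme described above: an inverted index
# maps each word to the numeric ids of the documents that contain it,
# and a separate table maps those ids back to page URLs.

def build_index(pages):
    """pages: dict mapping URL -> page text."""
    index = {}   # word -> list of document ids, e.g. "freebie" -> [1, 2]
    urls = {}    # document id -> URL
    for doc_id, (url, text) in enumerate(pages.items(), start=1):
        urls[doc_id] = url
        for word in set(text.lower().split()):
            index.setdefault(word, []).append(doc_id)
    return index, urls

def search(word, index, urls):
    """Look the word up in the index table, then resolve ids to URLs."""
    return [urls[i] for i in index.get(word.lower(), [])]

pages = {
    "http://a.example/page1": "get your freebie here",
    "http://b.example/page2": "another freebie page",
}
index, urls = build_index(pages)
print(search("freebie", index, urls))
# -> ['http://a.example/page1', 'http://b.example/page2']
```

A real spider would also store word positions and follow links it finds, but the two-table structure is the core of indexing.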
- The number of times a query word is repeated in the document.
- Keywords in prominent HTML tags such as the title and heading tags. That is, if a page is about freebies, it is better to put the word "freebie" inside those tags and to highlight it in the body text.
- The distance between keywords in the document: the smaller the distance, the higher the relevance.
- The citation index: the number of links from other resources to the site. The more sites link to a resource, the higher its citation index. The popularity of the linking site also matters.
- A no less important parameter: the thickness of the resource owner's wallet. Search engines are built by people who also want to eat, drink beer, and buy "Hacker" magazine, and they show ads right in the search results. The paid links at the top of the results quite often have little to do with the query.
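A toy scoring function can make the interplay of these parameters concrete. Real engines keep their formulas secret, so every weight and name below is invented purely for illustration:

```python
# Purely illustrative relevance score combining the parameters above:
# keyword frequency, distance between query words, and citation index.
# The weights are invented; query_words are assumed to be lowercase.

def relevance(text, query_words, citations,
              freq_w=1.0, dist_w=5.0, cite_w=0.5):
    words = text.lower().split()
    score = 0.0
    positions = {}
    # 1. Frequency: each occurrence of a query word adds to the score.
    for qw in query_words:
        pos = [i for i, w in enumerate(words) if w == qw]
        positions[qw] = pos
        score += freq_w * len(pos)
    # 2. Proximity: the smaller the gap between two query words, the
    #    higher the score (only when both words actually occur).
    if len(query_words) == 2 and all(positions[q] for q in query_words):
        gap = min(abs(a - b)
                  for a in positions[query_words[0]]
                  for b in positions[query_words[1]])
        score += dist_w / gap
    # 3. Citation index: links pointing at the page from elsewhere.
    score += cite_w * citations
    return score

print(relevance("free stuff here free", ["free", "stuff"], citations=2))
# -> 9.0  (frequency 3.0 + proximity 5.0 + citations 1.0)
```

The wallet parameter, of course, does not fit into a formula.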
Naturally, the higher the relevance, the higher the site appears in the results, and the more likely the user is to visit it. So the question becomes: how do you increase relevance in the eyes of search engines?
Deceiving a modern search engine is rather difficult, and it gets harder every day. Let me start with what you should not do:
- Do not use keywords unrelated to the topic of the page. Keywords were invented to make indexing easier: the idea was that webmasters would put into the META KEYWORDS tag the words that best described the content of their pages. Back then spam was not as widespread as it is now, and at first keywords really did help people find the right information. But site owners soon began to stuff the tag with the most popular search terms, hoping visitors would come to their sites. Most search engines have learned to fight this: spiders now analyze the full text of the page and compare it against the keywords, and if there are no matches, the page is not indexed. So there is no point in listing keywords that do not appear on the page;
- Do not repeat any one keyword many times: the spider will treat it as spam and stop indexing the page;
- Do not rely on hugely popular keywords such as Internet, programs, computer, photo: the robot often simply ignores them, because tens of millions of other pages already carry them;
- Do not make the text color the same as the background color. Spammers used to use this trick a lot: it hid the words from the visitor while leaving them visible to the spider. Most search engines can now deal with it: they compare the background color declared in the BODY tag with the text color, and if the values match, indexing stops. Some spammers go further: they declare a blue background color in the BODY tag, make the text white, and then add a background image attribute pointing at fon.gif, a small white picture. The browser uses fon.gif to draw the background, so the page is actually white and the white words are invisible, while to the spider the background color appears blue. The trick has a downside: many users in this country cannot boast of a fast connection and often disable image loading. For them the background is rendered in the color declared in the tag, blue in our example, and the visitor sees all the words intended for the search robot.
- Do not put keywords in a separate layer (Layer) and make it invisible.
- Do not use forwarding pages built on the META Refresh tag. Such a tag redirects the visitor to http://ca1.dax.ru/ five seconds after the page loads. Most search engines treat this as spam. The trick is popular with xxx sites: they stuff a page with keywords, and the visitor is almost immediately forwarded to another resource.
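The anti-spam check mentioned in the first point above (the spider comparing declared keywords against the visible text) can be sketched as follows. The function name and the all-or-nothing rule are my simplification; real engines use more elaborate heuristics:

```python
# Sketch of the anti-spam check described above: a spider compares the
# declared keywords against the words actually visible on the page and
# refuses to index the page when none of them match.

def keywords_match_body(meta_keywords, body_text):
    declared = {w.strip().lower() for w in meta_keywords.split() if w.strip()}
    visible = set(body_text.lower().split())
    return bool(declared & visible)

# A page whose keywords appear in the text passes the check...
print(keywords_match_body("freebie download", "get your freebie today"))   # True
# ...while pure keyword spam is rejected.
print(keywords_match_body("britney mp3 games", "we sell garden furniture")) # False
```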
Now, what you actually should do to increase the relevance of your resource:
- Write the keywords for each page of the site by hand, making sure they match the topic of that page;
- Do not put commas after keywords. First, they increase the file size; second, most search engines read only the first 200-250 characters of the tag;
- Order the words by importance: the most important words should come first;
- It is better if the words used in the title and heading tags, as well as in the ALT attribute of images, also appear among the keywords;
- Do not repeat keywords on different pages of the site;
- Some search engines display the page description from the META DESCRIPTION tag, others show the first lines of the document. Design the description so that the user wants to visit the site. If you do not want to adapt the first lines of the page text to serve as the description, you can resort to a trick: make an invisible layer using cascading style sheets (CSS) and place it right at the top of the page body. An engine that displays the first lines of the document will then show the text from the invisible layer. Note that the description should not be long: search engines usually display only the first 170 characters or so.
- Search robots are not good at parsing tables.
- On each page, include as many links as possible to other pages of your own resource and as few as possible to pages of other sites.
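The character limits mentioned in the list above (roughly 250 characters of keywords read, roughly 170 characters of description shown) lend themselves to a small pre-publication check. The limits below are the article's own figures, not any engine's published specification, and the helper is purely illustrative:

```python
# Small helper reflecting the limits mentioned above: engines were said
# to read only the first ~250 characters of the keywords tag and to
# display only ~170 characters of the description.

KEYWORDS_LIMIT = 250
DESCRIPTION_LIMIT = 170

def check_meta(keywords, description):
    """Return a list of human-readable warnings; empty means all good."""
    warnings = []
    if len(keywords) > KEYWORDS_LIMIT:
        warnings.append("keywords beyond %d chars will be ignored" % KEYWORDS_LIMIT)
    if "," in keywords:
        warnings.append("commas waste characters; separate keywords with spaces")
    if len(description) > DESCRIPTION_LIMIT:
        warnings.append("description will be cut at %d chars" % DESCRIPTION_LIMIT)
    return warnings

print(check_meta("freebie, mp3, games", "Fresh freebies daily."))
# -> ['commas waste characters; separate keywords with spaces']
```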
As I said, relevance scoring differs between search engines. A couple of dozen engines handle more than 90% of all queries in the world, so it makes sense to look at the technology of the most popular ones.
Yandex
Yandex is currently the most popular search engine in Russia. The number of answers it has given to search queries has exceeded 147 million; as the company puts it, Yandex has given every resident of Russia one answer per month. The Yandex spider is famous for its speed: a page becomes available in the search results within a few hours of being indexed.
Tips for optimizing pages for Yandex
Pages should be of medium size and rich in text; there should not be too many keywords. Words in the ALT attribute are valued an order lower than keywords. Pages generated by scripts are indexed well: guest books, message boards, forums, as well as pages hosted on the free hosting at narod.ru.
Rambler
One of the youngest search engines, built with the experience of other engines taken into account. Every day the Rambler robot downloads about 2 million pages. During indexing, keywords are ignored; only the text the visitor can actually see on the screen is processed. In the search results, preference is given to sites registered in the Rambler Top100 directory.
Aport
From the moment a site is submitted to Aport until it appears in the search database takes anywhere from two or three days to two weeks. Aport does not index pages with a "?" character in the address. Besides the text the visitor sees, Aport indexes the document title (TITLE), keywords (META KEYWORDS), the page description (META DESCRIPTION), and picture captions (ALT). It also indexes the text of hypertext links pointing to the document from other pages, both inside and outside the site, as well as site descriptions compiled (or verified) by the editors of its catalog.
Google
Relevance in Google depends on:
- citation index;
- keywords in links;
- highlighted (emphasized) words.
The Google search robot is distinguished by how deeply it indexes a site, i.e. it tries to cover the maximum number of links from each page.
AltaVista
Peculiarities of searching on AltaVista: keywords in the title tag and picture captions (ALT) play a big role, as do keywords within the first 1000 characters of the page.
That's it. Let me note that optimizing pages for search engines is perhaps the most important stage of site promotion. You can deceive a search engine, but think about whether you really need to: you will achieve nothing but a negative reaction from visitors. A properly optimized site attracts a far more active audience, and such traffic is high-quality and highly valued, because users arrive at your site with a specific purpose and intent.