Cheating search engines, or how to increase the relevance of a site
Today, finding the right information on the Internet is getting harder and harder. The reason is the enormous growth in the number of sites and an even greater growth in informational garbage and advertising of every kind. How often have you typed a query into your favorite search engine looking for the latest song by your favorite artist or the best xxx site? And how often have you found that the results have nothing to do with the query? Instead of a new song you are invited to join a financial pyramid, and instead of the best xxx site you are offered girls at rock-bottom prices with home delivery. All of this is the result of deceiving search robots. This article describes methods of cheating search engines and of improving a site's ranking in them.
Here is how a typical search engine works. A software module (a spider) wanders from link to link, reads the contents of pages, and makes an entry in an index file for each word. For example, for the word "freebie" it would create something like "freebie: 1" in the index file, and "1: URL of the page" in the file where links are stored. Here 1 is the number that ties together the records of the index file (table) and the link file. Then the spider crawls to another page and stumbles on the word "freebie" again. Now the index entry becomes "freebie: 1, 2", and the link table gets a record "2: URL of the page". When a user types the word "freebie" into the search box, the engine finds the "freebie" line in the index file, reads the numbers 1 and 2, looks up the addresses corresponding to them in the link table, and returns those addresses to the user. This is the basic principle behind search engines, and it is called indexing. What, then, determines the position of sites in the results? The answer is relevance, i.e. how well a document matches the user's query. And what does relevance depend on? The exact scoring algorithms differ from engine to engine and are kept in the strictest confidence, but here are the basic parameters:
- The number of times a word is repeated in the document.
- Keywords enclosed in prominent tags, such as the document title (TITLE) and headings. I.e., if the page is devoted to a freebie, it is better to put the word "freebie" in the TITLE and to highlight that word in the text that follows.
- The distance between the keywords in the document. The smaller the distance, the higher the relevance.
- The citation index: a value indicating how many links from other resources point to the site. The more sites that link to the resource, the higher the citation index. The popularity of the site the link comes from also matters.
- A no less important parameter: the thickness of the resource owner's wallet. Search engines are made by people who also want to eat, drink beer, and buy "Hacker" magazine, and they show ads directly in the search results. The paid links shown at the top of the results quite often have little relevance to the query.
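The indexing scheme described earlier, together with the simplest of the parameters above (word frequency), can be sketched in a few lines of Python. This is a toy model with made-up URLs, not how any real engine is implemented; production scoring formulas are, as noted, secret and far more elaborate.

```python
# Toy model: an inverted index (word -> document IDs with occurrence counts)
# plus a link table (ID -> URL), with results ordered by a crude relevance
# score based on how often the query word occurs. All URLs are hypothetical.

index = {}      # word -> {doc_id: occurrence count}
links = {}      # doc_id -> URL
next_id = [1]   # mutable counter shared by index_page calls

def index_page(url, text):
    """What the spider does: record each word of the page in the index."""
    doc_id = next_id[0]
    next_id[0] += 1
    links[doc_id] = url
    for word in text.lower().split():
        index.setdefault(word, {}).setdefault(doc_id, 0)
        index[word][doc_id] += 1

def search(word):
    """Look the word up and return URLs, most frequent occurrences first."""
    hits = index.get(word.lower(), {})
    ranked = sorted(hits, key=hits.get, reverse=True)
    return [links[i] for i in ranked]

index_page("http://example.com/1", "a freebie page")
index_page("http://example.com/2", "freebie freebie freebie galore")
print(search("freebie"))  # page 2 ranks first: the word occurs more often there
```

A real spider would also store word positions so that the distance between query words could feed into the score.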
Naturally, the higher the relevance, the higher the site appears in the search results, and the more likely it is that the user will visit that particular site. Hence the natural question: how do you make search engines consider your site more relevant?
In general, deceiving a modern search engine is quite difficult, and it becomes harder every day. Let me start with what you should not do:
- Use keywords unrelated to the subject of the page. Keywords were devised to ease the indexing process: the idea was that webmasters would put into the META KEYWORDS tag the words that most fully reflected the content of their pages. Spam was not yet as widespread on the network as it is now, and at first keywords really did help find the necessary information. But then site creators began cramming into this tag the most popular search words, in the hope that visitors would come to their site. Most search engines have learned to fight such spam: spiders now analyze all the text on a page and compare it with the text in the keywords, and if there is no match, the page is not indexed. Therefore, it makes no sense to include keywords that do not appear on the page;
- Use a keyword many times. The spider will treat this as spam and stop indexing the page;
- Stuff the page with popular keywords such as Internet, software, computer, photo. The search robot will often simply ignore these words, since tens of millions of other pages already carry them;
- Use a text color equal to the background color. Spammers used to favor this method: it hid the words from the visitor while leaving them visible to spiders. Most search engines can now deal with it: they compare the background color declared in the BODY tag with the text color, and if the values are equal, indexing stops. Some spammers go further: they declare, say, a blue background color in the BODY tag, make the text white, and then add a background image attribute pointing to fon.gif, a small all-white picture. The browser uses fon.gif to draw the page background, so the background turns white and the white words become invisible, while to the spider the declared background color still appears blue. This method has a downside: many users in our country cannot boast of a fast connection and therefore often disable image loading, so their browser shows the background color declared in the tag (blue, in our example), and the visitor sees all the words that were intended for the search robot.
- Place keywords in a separate layer (Layer) and make it invisible;
- Use forwarders, i.e. a META Refresh tag (something like <META HTTP-EQUIV="Refresh" CONTENT="5; URL=http://ca1.dax.ru/">), which redirects the visitor to http://ca1.dax.ru/ five seconds after the page loads. Most search engines treat this as spam. The trick is often used by xxx sites: they stuff a page with keywords, and the visitor is almost immediately forwarded to another resource.
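Two of the tricks above, text colored the same as the background and an instant META refresh forward, can be caught with simple heuristics. The sketch below is illustrative only: real spiders parse HTML properly rather than with regexes, and the function name is made up.

```python
# Hedged sketch of two anti-spam checks described in the text:
# 1) foreground color equal to the declared background color (hidden text);
# 2) a META refresh that forwards the visitor within a few seconds.
import re

def looks_like_spam(html):
    """Flag pages that hide text or instantly forward the visitor."""
    # Hidden text: declared font color equals the declared background color.
    body = re.search(r'<body[^>]*bgcolor="?(\w+)"?', html, re.I)
    font = re.search(r'<font[^>]*color="?(\w+)"?', html, re.I)
    if body and font and body.group(1).lower() == font.group(1).lower():
        return True
    # Forwarder: a META refresh redirecting within 5 seconds or less.
    refresh = re.search(r'http-equiv="?refresh"?[^>]*content="?(\d+)', html, re.I)
    if refresh and int(refresh.group(1)) <= 5:
        return True
    return False

print(looks_like_spam('<body bgcolor="white"><font color="white">freebie</font>'))  # True
print(looks_like_spam('<body bgcolor="blue"><font color="white">hello</font>'))     # False
```

Note that, exactly as the article says, this naive color check is fooled by the fon.gif trick: the regex sees the blue bgcolor, not the white background image.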
Now about what needs to be done to really increase the relevance of the resource:
- Write the keywords for each page of the site yourself, making sure they match the theme of that page as closely as possible;
- Do not put commas after keywords. First, commas increase the file size; second, most search engines read only the first 200-250 characters;
- Arrange the words in order of importance: the most important words should come first;
- It is better if the words used in the TITLE and heading tags, as well as in the ALT attribute, also appear among the keywords;
- Do not repeat keywords on different pages of the site;
- Some search engines display the page description from the META DESCRIPTION tag, others display the first lines of the document. Write the description so that the user wants to visit the site. If you do not want to tailor the first lines of the page's text to serve as its description, you can resort to a trick: create an invisible layer using Cascading Style Sheets (CSS) and place it immediately after the BODY tag. A search engine that displays the first lines of the document will then show the text from the invisible layer. Note that the description should not be too long, since search engines usually display only the first 170 characters.
- Search robots handle tables poorly.
- On each page, use as many links to other pages of your own resource as possible, and as few links to pages of other sites as possible.
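Several of the recommendations above (no commas between keywords, importance order, the roughly 250-character keyword limit and 170-character description limit) can be folded into a small helper. This is a sketch under the article's own estimates; the function name and limits are illustrative, not any standard.

```python
# Build META tags following the article's checklist: keywords joined by
# spaces (no commas), most important first, truncated to ~250 characters;
# description capped at ~170 characters. The limits are the article's
# estimates, not fixed rules of any particular engine.

def meta_tags(keywords, description, kw_limit=250, desc_limit=170):
    """Return META keywords/description tags for one page."""
    kw = " ".join(keywords)[:kw_limit]   # importance order, no commas
    desc = description[:desc_limit]      # engines show ~170 characters at most
    return (f'<meta name="keywords" content="{kw}">\n'
            f'<meta name="description" content="{desc}">')

print(meta_tags(["freebie", "download", "music"],
                "Fresh freebies and music downloads, updated daily."))
```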
As I said, relevance scoring differs from one search engine to another. A couple of dozen search engines account for more than 90% of all queries in the world, so it makes sense to look at the technology of the most popular ones.
Yandex
Today it is the most popular search engine in Russia. The number of answers Yandex has given to search queries has exceeded 147 million; according to the company, that is enough to give every resident of Russia one answer per month. The Yandex spider is famous for its speed: a newly indexed page becomes available in the search results within a few hours.
Tips on optimizing pages for Yandex
Pages should be of medium size and rich in text, and there should not be too many keywords. Words in the ALT attribute are valued an order of magnitude lower than keywords. Pages generated by scripts are indexed well: guest books, message boards, forums, as well as pages hosted on the free hosting service narod.ru.
Rambler
One of the youngest search engines, developed with the experience of other engines taken into account. The Rambler robot downloads about 2 million pages daily. During indexing, keywords are not processed; only the words a visitor can actually see on the screen are. In the search results, preference is given to sites registered in the Rambler Top100 catalog.
Aport
From the moment a site is added to Aport until it appears in the search database, anywhere from two or three days to two weeks may pass. Aport does not index pages whose address contains the "?" character. In addition to the text the visitor sees, Aport also indexes the document title (TITLE), keywords (META KEYWORDS), page descriptions (META DESCRIPTION), and image captions (ALT). It also indexes the texts of hyperlinks pointing to the document from other pages, both inside the site and outside it, as well as the descriptions of sites compiled (or verified) by the editors of the catalog.
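The fields the paragraph above says Aport indexes besides visible text (TITLE, META KEYWORDS, META DESCRIPTION, ALT captions) can be pulled out of a page like this. A real indexer parses HTML properly; these regexes and the sample page are purely illustrative.

```python
# Hedged sketch: extract the page fields that the article says Aport
# indexes in addition to visible text. Regex-based and illustrative only.
import re

def extract_fields(html):
    """Return the title, meta keywords/description, and ALT captions."""
    def first(pattern):
        m = re.search(pattern, html, re.I | re.S)
        return m.group(1).strip() if m else ""
    return {
        "title": first(r"<title>(.*?)</title>"),
        "keywords": first(r'<meta\s+name="keywords"\s+content="(.*?)"'),
        "description": first(r'<meta\s+name="description"\s+content="(.*?)"'),
        "alt": re.findall(r'alt="(.*?)"', html, re.I),
    }

page = ('<title>Freebie</title>'
        '<meta name="keywords" content="freebie music">'
        '<meta name="description" content="Free stuff daily">'
        '<img src="x.gif" alt="a freebie">')
print(extract_fields(page)["title"])  # Freebie
```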
Google
Relevance in Google depends on:
- citation index;
- keywords in links;
- highlighted words.
The Google search robot is notable for its ability to index a site deeply, i.e. it tries to cover the maximum number of links from each page it visits.
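Deep indexing, following every discovered link from page to page, amounts to a breadth-first traversal of the site's link graph. In the sketch below the site is simulated as an in-memory dict with made-up paths; a real spider would fetch pages over HTTP and extract links from their HTML.

```python
# Illustrative sketch of "deep" indexing: starting from one page, follow
# every discovered link breadth-first. The site graph is hypothetical.
from collections import deque

site = {  # URL -> links found on that page
    "/": ["/a", "/b"],
    "/a": ["/c"],
    "/b": ["/c", "/"],
    "/c": [],
}

def crawl(start):
    """Visit every page reachable from start, breadth-first, once each."""
    seen, queue, order = {start}, deque([start]), []
    while queue:
        page = queue.popleft()
        order.append(page)
        for link in site.get(page, []):
            if link not in seen:
                seen.add(link)
                queue.append(link)
    return order

print(crawl("/"))  # ['/', '/a', '/b', '/c']
```

The `seen` set is what keeps the spider from looping forever on pages that link back to each other, like "/b" linking back to "/" here.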
AltaVista
Features of searching on AltaVista: a large role is played by the presence of keywords in the TITLE tag, as well as by image captions (ALT). Keywords within the first 1000 characters also play an important role.
Well, that's all. I want to note that optimizing pages for search engines is probably the most important stage of promoting a site. You can deceive a search engine, but think about whether you really need to: you will achieve nothing but a negative reaction from visitors. A properly optimized site attracts a far more active audience; such traffic is of very high quality and is highly valued, because the user arrives at your site with a specific goal and intention.