
Cheating Search Engines or Increasing Website Relevance

Introduction

Principle of operation

Search engine cheating

Increase relevance

Conclusion

Introduction

Today, finding the right information on the Internet is becoming harder and harder. The reason is the enormous growth in the number of sites and an even greater growth in information junk and advertising of every kind. How often have you typed a query into your favorite search engine looking for the latest song by a favorite artist or the best xxx site? And how often have you found that the results have nothing at all to do with the query? Instead of a new song you are invited to join a financial pyramid, and instead of the best xxx site you are offered a girl at rock-bottom prices with home delivery. All of this is the result of cheating search robots. This article describes methods both of cheating search engines and of raising a site's ranking in them.

Principle of operation

Here is how a typical search engine works. A software module (a spider) wanders through links, reads the contents of pages, and makes an entry in the index file for each word. For example, for the word "freebie" an entry like "freebie — 1" is created in the index file, and in the file where links are stored the entry "1 — page URL" is made. Here 1 is the number that ties the entries in the index file (table) to the link file. The spider then crawls to another page and again stumbles on the word "freebie". Now it creates the entry "freebie — 1, 2" in the index table and "2 — page URL" in the link table. When a user types the word "freebie" into the search bar, the engine looks at the index file, finds the line for "freebie", reads the numbers 1 and 2, looks up the corresponding addresses in the link table, and returns them to the user. This is the basic principle of search engines, known as indexing. What, then, determines the position of a site in the results? Relevance, i.e. how well a document matches the user's query. And what does relevance depend on? Relevance-scoring algorithms differ between search engines and are kept in the strictest confidence, but the main factors are:

  • The number of repeated words in the document.
  • Keywords enclosed in prominent tags such as TITLE, H1–H6, and B. That is, if the page is about freebies, it is better to put the word "freebie" inside the TITLE and heading tags and to highlight it in the rest of the text.
  • The distance between the keywords in the document. The smaller the distance, the higher the relevance.
  • Citation index - a value indicating the number of links from other resources to this site. The more sites link to the resource, the higher the citation index. The popularity of the sites the links come from also matters.
  • An equally important parameter: the thickness of the resource owner's wallet. Search engines are made by people who also want to eat, drink beer, and buy the Hacker magazine, and they show ads directly in the search results. Paid links that appear in the top lines of the results quite often turn out to be irrelevant to the query.

Naturally, the higher the relevance, the higher the site appears in the search results, and the more likely a user is to visit it. Hence the question: how do you increase a site's relevance in the eyes of search engines?
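The indexing scheme described above can be sketched as a toy inverted index. This is a simplified illustration of the idea, not any real engine's implementation; the page texts and URLs are made up:

```python
# Toy inverted index: word -> list of page ids, page id -> URL.
# A much-simplified sketch of the indexing scheme described above.

index = {}   # word -> [page ids that contain it]
links = {}   # page id -> URL

def crawl(page_id, url, text):
    """Index every word of a page, as the spider would."""
    links[page_id] = url
    for word in text.lower().split():
        ids = index.setdefault(word, [])
        if page_id not in ids:
            ids.append(page_id)

def search(word):
    """Look the word up in the index and return matching URLs."""
    return [links[pid] for pid in index.get(word.lower(), [])]

crawl(1, "http://site-one.example/", "get your freebie here")
crawl(2, "http://site-two.example/", "another freebie page")

print(search("freebie"))   # both pages mention the word
```

The two dictionaries play the roles of the index file and the link file from the description: the word maps to numbers, and the numbers map to addresses.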

Search engine cheating

In general, it is quite difficult to deceive a modern search engine, and it becomes harder every day. First, here is what you should not do:

  • Use keywords that are unrelated to the subject of the page. Keywords were invented to make indexing easier: webmasters were supposed to put into the META KEYWORDS tag the words that best reflected the content of their pages. Back then spam was not as widespread on the network as it is now, and at first keywords really did help find the necessary information. But then site creators began stuffing this tag with the most popular search words in the hope of attracting visitors. Most search engines have learned to deal with such spam: spiders now analyze all the text on the page and compare it with the text in the keywords, and if there is no match, the page is not indexed. It therefore makes no sense to list keywords that do not appear on the page;
  • Use a keyword many times. The spider will take this as spam and stop indexing the page;
  • Place popular but generic keywords on the page, for example: Internet, programs, computer, photo. The search engine often simply ignores these words, since tens of millions of other pages already use them;
  • Use a text color equal to the background color. Spammers often used this method in the past: it hid the words from the visitor while leaving them visible to the spiders. Most search engines can now deal with this: they compare the background color declared in the BODY tag with the color of the text, and if the values are equal, indexing stops. Some spammers go further: in the BODY tag they set the background color to, say, blue and the text color to white, and then add a background image, e.g. a small white picture fon.gif. The browser uses fon.gif to draw the page background, making it white so that no words are visible, while the spider sees the declared background color as blue. This method has a flip side: many users in our country cannot boast of a fast connection and therefore often disable the loading of graphics. Their browser then displays the background color as declared in the tag, i.e. blue in our example, and the visitor sees all the words that were intended for the search robot;
  • Place keywords in a separate layer (Layer) and make it invisible;
  • Use redirects such as a META Refresh tag that sends the visitor to the site http://ca1.dax.ru/ 5 seconds after the page loads. Most search engines perceive this as spam. The trick is often used by xxx sites: they cram many keywords onto the page, and the visitor is almost immediately sent to another resource.
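Some of the countermeasures above can be sketched as toy checks a spider might run. These are hypothetical, much-simplified heuristics invented for illustration; real engines use far more elaborate filters:

```python
# Two simplified spam checks a spider might apply (hypothetical heuristics).
import re

def keyword_stuffed(text, threshold=0.25):
    """Flag a page where any single word dominates the text."""
    words = text.lower().split()
    if not words:
        return False
    top = max(words.count(w) for w in set(words))
    return top / len(words) > threshold

def hidden_text(body_tag):
    """Flag a BODY tag whose text color equals its background color."""
    bg = re.search(r'bgcolor="([^"]+)"', body_tag, re.I)
    fg = re.search(r'text="([^"]+)"', body_tag, re.I)
    return bool(bg and fg and bg.group(1).lower() == fg.group(1).lower())

print(keyword_stuffed("freebie freebie freebie get freebie"))   # True
print(hidden_text('<body bgcolor="white" text="white">'))       # True
print(hidden_text('<body bgcolor="blue" text="white">'))        # False
```

Note that the second check would miss the background-image trick described above, which is exactly why spammers used it.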

Increase relevance

Now about what needs to be done to really increase the relevance of a resource:

  • Write the keywords for each page of the site yourself, making them match the subject of that page as closely as possible;
  • Do not put commas after keywords. First, commas increase the file size; second, most search engines read only the first 200-250 characters;
  • Prioritize words according to their importance. The most important words should come first;
  • It is better if the words used in the TITLE and heading tags, as well as in the ALT attribute, also appear among the keywords;
  • Do not repeat keywords on different pages of the site;
  • Some search engines display the page description from the META DESCRIPTION tag, while others take the first lines of the document. The description should make the user want to visit the site. If you do not want to adapt the first lines of the page text to serve as the description, you can resort to a trick: make an invisible layer using cascading style sheets (CSS) and place it right after the opening BODY tag. Thus a search engine that displays the first lines of a document will display the text of the invisible layer. Note that the description should not be too long, since search engines usually display only the first 170 characters.
  • Search robots do not handle tables well.
  • On each page, use as many links as possible to other pages of your own resource and as few as possible to pages of other sites.
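The keyword advice above (no commas, most important words first, a roughly 200-250 character limit) can be sketched as a small helper. This is an illustrative sketch, not a tool any search engine provides:

```python
# Build a META KEYWORDS value following the advice above:
# most important words first, space-separated (no commas),
# truncated to the ~250 characters most engines actually read.

def build_keywords(words_by_priority, limit=250):
    out = []
    length = 0
    for word in words_by_priority:
        extra = len(word) + (1 if out else 0)  # +1 for the separator space
        if length + extra > limit:
            break  # later (less important) words are dropped, not earlier ones
        out.append(word)
        length += extra
    return " ".join(out)

print(build_keywords(["freebie", "download", "free", "software"]))
```

Because the list is ordered by importance, anything cut off by the length limit is the least valuable material, which matches the advice to put the most important words first.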

As I said before, relevance scoring varies across search engines. A couple of dozen search engines account for more than 90% of all queries in the world, so it makes sense to look at the technology of the most popular ones.
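Purely as an illustration of how an engine might combine factors such as keyword frequency and keyword proximity, here is an invented scoring function. The formula and weights are made up for the example; as noted above, the real formulas are kept secret:

```python
# Toy relevance score: keyword frequency plus a proximity bonus.
# The weighting is arbitrary; real engines keep their formulas secret.

def relevance(words, query_terms):
    words = [w.lower() for w in words]
    terms = [t.lower() for t in query_terms]

    # Factor 1: how many times the query terms occur in the document.
    count = sum(words.count(t) for t in terms)

    # Factor 2: the smaller the distance between query terms, the better.
    positions = [i for i, w in enumerate(words) if w in terms]
    if len(positions) >= 2:
        gap = min(b - a for a, b in zip(positions, positions[1:]))
        proximity = 1.0 / gap
    else:
        proximity = 0.0

    return count + proximity

doc_a = "free freebie download freebie now".split()
doc_b = "freebie mentioned once here".split()
print(relevance(doc_a, ["freebie", "download"]) >
      relevance(doc_b, ["freebie", "download"]))   # True
```

A document that repeats the query terms and keeps them close together outscores one that mentions a term only once, mirroring the first and third factors from the list earlier in the article.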

Yandex

Today it is the most popular search engine in Russia. The number of Yandex responses to search queries has exceeded 147 million; as the company puts it, Yandex is ready to give every Russian citizen one answer per month. The Yandex search spider is famous for its speed: a page becomes available in the search results just a few hours after being indexed.

Page optimization tips for Yandex

Pages should be of medium size and rich in text, without too many keywords. Words in the ALT attribute are valued an order of magnitude lower than keywords. Script-generated pages are indexed well: guest books, message boards, forums, as well as pages hosted on the free hosting narod.ru.

Rambler

One of the youngest search engines, whose development took the experience of other search engines into account. The Rambler robot downloads about 2 million pages daily. During indexing the META KEYWORDS tag is ignored; only the text the visitor can actually see on the screen is processed. In the search results, preference is given to sites registered in the Rambler Top100 catalog.

Aport

From the moment a site is submitted to Aport until it appears in the search base, anywhere from two or three days to two weeks may pass. Aport does not index pages whose address contains the "?" character. Besides the text the visitor sees, Aport indexes the document title (TITLE), keywords (META KEYWORDS), the page description (META DESCRIPTION), and image captions (ALT). It also indexes the text of hypertext links pointing to the document from other pages, both inside and outside the site, as well as site descriptions compiled (or verified) by the catalog's editors.

Google

Google relevance depends on:

  • citation index;
  • keywords;
  • keywords in links;
  • highlighted words.

The Google search robot is distinguished by its deep indexing of sites, i.e. it tries to follow the maximum number of links from a single page.

Altavista

Search features on AltaVista: the presence of keywords in the TITLE tag, as well as in image captions (ALT), plays a large role. Keywords appearing within the first 1000 characters carry the most weight.

Conclusion

That's all for now. I want to note that page optimization for search engines is perhaps the most important stage of website promotion. You can fool a search engine, but think about whether you really need to: you will achieve nothing but a negative reaction from visitors. A properly optimized site attracts a much more active audience; such traffic is of high quality and is highly valued, since users arrive at your site with a specific purpose and intention.