The .htaccess configuration file for Apache servers

.htaccess (with a dot at the beginning of the name) is an Apache server configuration file that lets you configure the server for individual directories (folders) without giving access to the main configuration file: for example, set access permissions for files in a directory, change the names of the index files, or handle Apache errors yourself by redirecting visitors to special error pages. .htaccess is a plain text file whose full name is .htaccess. It is usually placed in the root of the site, but you can create additional .htaccess files for other directories of your site.
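
For illustration, here is a small sketch of an .htaccess file combining the features just mentioned (the file names and the IP address are arbitrary examples, not taken from the article):

  # custom page for "404 Not Found" errors
  ErrorDocument 404 /errors/404.html
  # use a different index file in this directory
  DirectoryIndex start.html
  # allow access to this directory from one address only
  Order Deny,Allow
  Deny from all
  Allow from 192.168.0.1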

mod_rewrite is an Apache module used to rewrite (transform) requested URLs; its directives are listed below, followed by a short combined sketch.

Mod_rewrite module directives

  • RewriteBase
  • RewriteCond
  • RewriteEngine
  • RewriteLock
  • RewriteLog
  • RewriteLogLevel
  • RewriteMap
  • RewriteOptions
  • RewriteRule
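
A short sketch of how these directives work together (the domain is a placeholder; detailed recipes follow below):

  # switch the rewriting engine on and set the base URL for per-directory rewrites
  RewriteEngine On
  RewriteBase /
  # condition: the host was requested without www
  RewriteCond %{HTTP_HOST} ^example\.ru$ [NC]
  # rule: redirect to the www version of the same URL
  RewriteRule ^(.*)$ http://www.example.ru/$1 [R=301,L]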

Variants of redirect implementation using the .htaccess file

  1. Simple redirect:
      Redirect 301 / http://www.domainname.ru/
    
    or
      Redirect /secret http://www.site.ru/nosecret
    
    It is placed in the .htaccess or httpd.conf file for Apache. The first "/" means that everything from the top level of the site, including all subdirectories, will be redirected (do not forget the trailing "/"). If you want to redirect only a single page while keeping the PR of the old page, you can do this:

    Redirect 301 /old/old.htm http://www.you.ru/new.htm where:
    /old/old.htm - path and name of the old page
    http://www.you.ru/new.htm - new path and new name of the moved page

  2. Redirect to a page based on the user's IP or on the requested page (also by name mask).
    If the user has the IP 192.152.37.125, they will be redirected to the page user.php:
      SetEnvIf REMOTE_ADDR 192.152.37.125 REDIR="redir"
     RewriteCond %{ENV:REDIR} redir
     RewriteRule ^/$ /user.php
    
    
  3. Redirect when certain files are requested. If a requested file's extension is not among those listed in the rule (here gif and jpg), redirect it:
      RewriteEngine On
     RewriteRule !\.(gif|jpg)$ index.php
    
    

  4. Redirect from the domain without www to the www domain using mod_rewrite:
      Options +FollowSymLinks
     RewriteEngine on
     RewriteCond %{HTTP_HOST} ^yourdomain\.ru
     RewriteRule ^(.*)$ http://www.yourdomain.ru/$1 [R=permanent,L]
    
    

  5. Redirect with a regular expression:
      RedirectMatch 301 (.*) http://www.yourdomain.ru$1
    
    It is written in the .htaccess file. RedirectMatch matches the regular expression against the part of the URL that follows the domain name, so you cannot match a pattern like ^/yourdomain.com. You can, however, redirect pages with an .html extension to files of the same name with a .php extension:
      RedirectMatch 301 (.*)\.html$ http://www.yourdomain.ru$1.php
    
    If you need a different redirect for individual pages, you can use the following:
      RedirectMatch Permanent ^/html/resources.html$ http://www.newdomain.com/resources.php
     RedirectMatch Permanent ^/html/other_page.html$ http://www.newdomain.com/other_page.php
     RedirectMatch Permanent ^/(.*)$ http://www.newdomain.com/
    
    "RedirectMatch Permanent" is equivalent to "RedirectMatch 301"; the line with the (.*) wildcard must be the last one in this list.

  6. Creating legible URLs
    To make, for example, www.site.ru/product.php?id=123 available as www.site.ru/product/123, use:
      RewriteEngine on
     RewriteRule ^product/([^/\.]+)/?$ product.php?id=$1 [L]
    
    In the following example, www.site.ru/script.php?product=123 is served as www.site.ru/cat/product/123/:
      RewriteRule ^cat/(.*)/(.*)/$ /script.php?$1=$2
    

  7. Redirection in PHP:
      Header ("HTTP / 1.1 301 Moved Permanently"); 
     Header ("Location: http://www.newdomain.ru/newdir/newpage.htm"); 
     Exit (); 
    
    Naturally, we need to create a page, when referring to that and will occur Redirect, and post it on the server. And better specify HTTP / 1.1 (and not HTTP / 1.0 or HTTP / 0.9, which do not support virtual hosting)

  8. Redirect all files in a folder to one file.
    For example, you no longer need the "Super discount" section of the site and want to redirect all requests to the /superdiscount folder to the single file /hot-offers.php. To do this, add the following code to .htaccess:
      RewriteRule ^superdiscount(.*)$ /hot-offers.php [L,R=301]
    

  9. Redirect a whole folder except one file
    In the following example, all files from the /superdiscount folder will be redirected to the file /hot-offers.php, EXCEPT the file /superdiscount/my-ebook.html, which should be redirected to /hot-to-make-million.html:
      RewriteRule ^superdiscount/my-ebook.html /hot-to-make-million.html [L,R=301]
     RewriteRule ^superdiscount(.*)$ /hot-offers.php [L,R=301]
    

  10. Redirect dynamic URL to a new file.
    This option is useful if you want to redirect a dynamic URL with parameters to a new static file.
      RewriteCond %{QUERY_STRING} ^id=
     RewriteRule ^article\.jsp$ /latestnews.htm? [L,R=301]
    
    That is, a request for a URL like http://www.kass.ws/article.jsp?id=8632 or http://www.kass.ws/article.jsp?id=1245 will now be redirected to http://www.kass.ws/latestnews.htm. (The query string cannot be matched in the RewriteRule pattern itself, so the id parameter is checked in a RewriteCond; the trailing "?" in the substitution drops the old query string.)

  11. Mass redirection of new files.
    Now let's move on to the trickiest case: redirecting a large number of URLs, for example after changing your CMS. Several problems arise at once. First, entering all the changed addresses into the .htaccess file takes a very long time and is an unpleasant job in itself. Second, too many entries in the .htaccess file slow down the Apache server. And third, with that amount of data you are likely to make a mistake somewhere. So the best way out is to have a programmer write a dynamic redirect for you.
    The following example is written in PHP, but the same can be done in any language. Suppose you switched to a new link scheme on your site and all files ending with the old id have to be redirected. First, create a table in the database that stores the old id and the new URL for the redirect (old_id INT, new_url VARCHAR(255)). Then write the code that maps your old ids to the new URLs.
    After that, add the following line to .htaccess:
      RewriteRule ^product-(.*)_([0-9]+)\.php /redirectold.php?productid=$2
    
    Then create a PHP file redirectold.php, which will support 301 redirects:
      <?php
      function getRedirectUrl($productid) {
          // Connect to the database
          $dServer = "localhost";
          $dDb = "mydbname";
          $dUser = "mydb_user";
          $dPass = "password";

          $s = @mysql_connect($dServer, $dUser, $dPass)
              or die("Could not connect to database server");

          @mysql_select_db($dDb, $s)
              or die("Could not connect to database");

          $query = "SELECT new_url FROM redirects WHERE old_id = " . $productid;
          $result = mysql_query($query);
          $hasRecords = mysql_num_rows($result) == 0 ? false : true;
          if (!$hasRecords) {
              $ret = 'http://www.yoursite.com/';
          } else {
              while ($row = mysql_fetch_array($result)) {
                  $ret = 'http://www.yoursite.com/' . $row["new_url"];
              }
          }
          mysql_close($s);
          return $ret;
      }

      $productid = $_GET["productid"];
      $url = getRedirectUrl($productid);

      header("HTTP/1.1 301 Moved Permanently");
      header("Location: $url");
      exit();
      ?>
    

    Now all requests to your old URLs will be handled by redirectold.php, which will look up the new URL and return a 301 response with your new link.

    Redirects based on time

    When tricks like time-dependent content are needed, many webmasters still use CGI scripts that redirect to specialized pages. How can this be done with mod_rewrite?

    There are many variables named TIME_xxx that can be used in rewrite conditions. In conjunction with the special lexicographic comparison patterns <STRING, >STRING and =STRING, we can do time-dependent redirects:
     
      RewriteEngine on
      RewriteCond %{TIME_HOUR}%{TIME_MIN} >0700
      RewriteCond %{TIME_HOUR}%{TIME_MIN} <1900
      RewriteRule ^foo\.html$ foo.day.html
      RewriteRule ^foo\.html$ foo.night.html

    This serves the contents of foo.day.html when the URL foo.html is requested between 07:00 and 19:00, and the contents of foo.night.html the rest of the time.


  12. Removing "www." from the beginning of all requests
     # announce that we want to use mod_rewrite
     RewriteEngine on
     RewriteCond %{HTTP_HOST} ^www\.(.*) [NC]
     RewriteRule ^/?(.*) http://%1/$1 [L,R=permanent]
    
    

  13. Change the .html extension to .php
    Sometimes you have a static website and need a PHP script to run on it. To do this, tell the server to process such pages as PHP files:
     AddHandler application/x-httpd-php .html
    
    The same technique can be used for other file extensions:
     AddHandler application/x-httpd-php .xml
     AddHandler application/x-httpd-php .asp
    

Denying access to a specific directory

  1. For everyone, to all files in the directory:
      Deny from all
    
  2. To a specific file (the file is named in a Files container; the file name here is just an example):
      <Files private.html>
      Deny from all
      </Files>
    
  3. By user IP:
      Order Deny,Allow
     Deny from all
     Allow from 192.152.37.125
    
    Access to this directory will be allowed only to the user with IP 192.152.37.125.

    If, on the contrary, you want to deny individual IP addresses access to your site, write the following lines:

     Order Allow,Deny
     Allow from all
     Deny from 192.152.37.125
     Deny from 123.456.177
    
  4. The Options -Indexes directive: disable directory listing when there is no index file. Sometimes you need to ensure that, if a directory contains no default (index) file, the list of files in that directory is not displayed. In that case add this line to .htaccess:
      Options -Indexes
    
    In this case, instead of the list of files in the directory, the visitor will receive HTTP error 403 - access forbidden.
  5. Deny access to files with several extension types:
      <Files ~ "\.(inc|conf|cfg)$">
      Deny from all
      </Files>
    
    This denies access to files with the extensions *.inc, *.conf and *.cfg. By default the Files directive does not work with regular expressions, but they can be enabled by placing a tilde (~) between the directive name and the quoted pattern: <Files ~ "regular_expression">. You can also block direct access to the .htaccess file itself:
      RewriteRule ^\.htaccess$ - [F]
    
    This rule translates as follows:
    If someone tries to access the .htaccess file, the system must produce the error code 'HTTP response of 403' or '403 Forbidden - You do not have permission to access /.htaccess on this server'.

    The construct ^\.htaccess$ in this regular expression means:
    ^ - the start-of-string anchor
    $ - the end-of-string anchor
    \. - in regular expressions the dot '.' is a metacharacter, so it must be escaped with a backslash when you mean a literal dot.

    The file name must sit exactly between the start and end anchors. This ensures that only this particular file name, and no other, will generate the error code.
    [F] is a special 'forbidden' flag.
    [NC] - case insensitive.
    [OR] - means 'or the next condition'; a combined sketch using these flags follows this list.
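
    A small combined sketch of the [NC], [OR] and [F] flags (the protected file names here are just examples):
      RewriteCond %{REQUEST_URI} \.htaccess$ [NC,OR]
      RewriteCond %{REQUEST_URI} \.htpasswd$ [NC]
      RewriteRule ^(.*)$ - [F]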

Definition of encoding

Determine the encoding in which the server "returns" files

  AddDefaultCharset windows-1251
Variants: KOI8-R , UTF-8 , Windows-1251

Determining the encoding for uploaded files

  CharsetSourceEnc windows-1251

Set a password for the directory using .htaccess

To set a password for the directory, you can use the basic authorization system provided in the Apache web server. We create a .htaccess file in the directory to which we want to restrict access by password, with the following directives:
  AuthType Basic
 AuthName "Some Name"
 AuthUserFile /www/some_login/www/htdocs/some_dir/.htpasswd
 Require valid-user
The path /www/some_login/www/htdocs/some_dir/.htpasswd is the full path to the password file on our server's disk. If, for example, you put the .htpasswd file (it will hold the passwords) in the home directory you land in when connecting to the server via FTP, the path to this file will look like /www/some_login/www/htdocs/some_dir/.htpasswd, where some_login is your login. In the AuthUserFile directive we specify the absolute path to the file with logins/passwords, which we will create a little later. If you create the .htaccess file on your own computer with a text editor rather than directly on the server, keep in mind that .htaccess must be uploaded via FTP strictly in text (ASCII) mode.

Create the password file. The password file must contain lines of the form login:password, with the password encrypted using the MD5 algorithm. One way to create such a file is with the htpasswd program included in the Apache distribution (on our server it is located in the /usr/local/apache/bin directory; the full path is /usr/local/apache/bin/htpasswd).

Consider how to create a password file in the unix shell directly on the server. We go into the shell and execute the following commands:

  htpasswd -mbc .htpasswd user1 7B1safkir
- creates a new .htpasswd file and adds an entry for the user user1 with the password given on the command line.
  htpasswd .htpasswd user2
- adds the user user2 to the existing .htpasswd file; the password is entered manually in response to the program's prompt.

After all the accounts have been created, the file must be uploaded to the server. Other ways of setting a password for a page are described separately.

Define your own error pages

You can set your own error page as follows:
  ErrorDocument 404 http://www.site.ru/404.php
Note that IE ignores error pages smaller than 512 bytes (it shows its own page instead).

Indexing directories and subdirectories

To prevent search engines from indexing directories and subdirectories, add a line like this:
  DirectoryIndex index.php
This directive specifies the file to be called when accessing the directory without specifying a file name.

You can specify more than one index page. When a directory is requested, they will be searched in the order listed in the DirectoryIndex directive. If the index.html file is not found, the index.php file will be searched, etc.

  DirectoryIndex index.html index.php index.shtml

Personally, I prefer to redirect from empty directories either to the main page of the site, or to some other suitable page. For example, the directory www.site.ru/pic/ can be redirected to www.site.ru.
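
For example, such a redirect might look like this (a sketch; the /pic/ directory is taken from the example above):
  RedirectMatch 301 ^/pic/$ http://www.site.ru/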

Protect images from downloading

Very often webmasters brazenly copy content from your site together with the images, and the images are then loaded from your own server (hotlinking). This creates extra traffic, which often leads to problems. How can you protect yourself from such webmasters without preventing search engines from indexing the images? It's simple:
  RewriteEngine on
 RewriteCond %{HTTP_REFERER} .
 RewriteCond %{HTTP_REFERER} !^http://([^.]+\.)?site\. [NC]
 RewriteCond %{HTTP_REFERER} !google\. [NC]
 RewriteCond %{HTTP_REFERER} !search\?q=cache [NC]
 RewriteCond %{HTTP_REFERER} !msn\. [NC]
 RewriteCond %{HTTP_REFERER} !yahoo\. [NC]
 RewriteCond %{REQUEST_URI} !^/hotlinker\.gif$
 RewriteRule \.(gif|jpg|png)$ /hotlinker.gif [NC,L]
hotlinker.gif is the image that will be displayed instead of the real images. I recommend putting your logo and your site address in this image.

Another variant of denying access to images from unauthorized sites:

  SetEnvIfNoCase Referer "^$" local_ref=1
 SetEnvIfNoCase Referer "^http://(www\.)?htmlweb\.ru" local_ref=1
 SetEnvIfNoCase Referer "^http://(www\.)?images\.yandex\.ru" local_ref=1
 SetEnvIfNoCase Referer "^http://(www\.)?hghltd\.yandex\.com" local_ref=1
 Order Allow,Deny
  Allow from env=local_ref
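
As written, this denies visitors arriving from any other site access to everything in the directory. If the restriction should apply only to images, the Order/Allow pair can be wrapped in a FilesMatch container, for example (this wrapper is an assumption, not part of the original listing):
  <FilesMatch "\.(jpg|jpeg|gif|png)$">
  Order Allow,Deny
  Allow from env=local_ref
  </FilesMatch>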

Search engines and all kinds of scanners can create colossal traffic on your site. The following block of code denies such robots access to the site (the long RewriteCond pattern is wrapped here for readability; in the actual .htaccess file it must be written on a single line).

  RewriteCond %{HTTP_USER_AGENT} (Googlebot|Slurp|spider|Twiceler|heritrix|
	 Combine|appie|boitho|e-SocietyRobot|Exabot|Nutch|OmniExplorer|
	 MJ12bot|ZyBorg/1|Ask\ Jeeves|AskJeeves|ActiveTouristBot|
	 JemmaTheTourist|Agadine3|BecomeBot|Clustered-Search-Bot|
	 MSIECrawler|freefind|galaxy|genieknows|INGRID|grub-client|
	 MojeekBot|NaverBot|NetNose-Crawler|OnetSzukaj|PrassoSunner|
	 Asterias\ Crawler|THUNDERSTONE|GeorgeTheTouristBot|
	 VoilaBot|Vagabondo|fantomBrowser|stealthBrowser|cloakBrowser|
	 FantomCrew\ Browser|Girafabot|Indy\ Library|Intelliseek|Zealbot|
	 Windows\ 95|^Mozilla/4\.05\ \[en\]$|^Mozilla/4\.0$) [NC]
 RewriteRule ^(.*)$ - [F]
 #
 RewriteCond %{HTTP_USER_AGENT} ^Mozilla.* [NC,OR]
 RewriteCond %{HTTP_USER_AGENT} ^Opera.* [NC,OR]
 RewriteCond %{HTTP_USER_AGENT} ^Firefox.* [NC,OR]
 RewriteCond %{HTTP_USER_AGENT} ^Netscape.* [NC]
 RewriteRule ^(.*)$ - [L]
 RewriteRule ^(.*)$ - [F]

Tracking calls to your robots.txt file

To learn more about search engines visiting your site, it is useful to have detailed information about requests for the robots.txt file. To arrange this, put the following entries in '.htaccess':
  RewriteEngine on
  Options +FollowSymlinks
  RewriteBase /
  RewriteRule ^robots\.txt$ /robot.php?%{REQUEST_URI}
Now when the 'robots.txt' file is requested, our RewriteRule redirects the visitor (robot) to the robot.php script. A variable is also passed to the script and can be processed according to your needs: 'REQUEST_URI' holds the name of the requested file, in this example 'robots.txt'. The script reads the contents of 'robots.txt' and sends it to the web browser or search engine robot. This way we can count visitor hits and keep log files.
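
A possible minimal sketch of such a robot.php (the log file name and format are assumptions; the original article does not show the script):

  <?php
  // Log who requested robots.txt: date, IP address and User-Agent
  $line = date('Y-m-d H:i:s') . "\t" . $_SERVER['REMOTE_ADDR'] . "\t"
        . (isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '-') . "\n";
  file_put_contents(dirname(__FILE__) . '/robots_access.log', $line, FILE_APPEND);

  // Return the real robots.txt contents to the browser or robot
  header('Content-Type: text/plain');
  readfile(dirname(__FILE__) . '/robots.txt');
  ?>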

PHPSESSID

To disable adding PHPSESSID to the URL, put this at the beginning of index.php:

  ini_set("session.use_trans_sid", 0);

Or write in .htaccess:

  php_flag session.use_trans_sid Off

If all this seems difficult, you can use a ready-made service for converting dynamic URLs to static ones using .htaccess.

Cache directives

Caching for all file types by access time
  ExpiresActive on
 ExpiresDefault "access plus 600 seconds"
Caching for all file types by time of change
  ExpiresActive on
 ExpiresDefault "modification plus 600 seconds"
Caching for certain file types
  ExpiresByType text/css "modification plus 600 seconds"
 ExpiresByType image/jpeg "modification plus 600 seconds"
 ExpiresByType image/gif "modification plus 600 seconds"
 ExpiresByType image/x-ico "modification plus 600 seconds"
 ExpiresByType image/png "modification plus 600 seconds"

Disable caching with the Apache server

Open the Apache server configuration file httpd.conf and uncomment the following lines:

 LoadModule expires_module modules/mod_expires.so
 LoadModule headers_module modules/mod_headers.so
 ...
 AddModule mod_expires.c
 AddModule mod_headers.c

Write in .htaccess the following:

  # Disable caching in this folder
 # Requires the modules
 # mod_headers.c and mod_expires.c to be enabled
 #
 # Cache-Control header
 Header append Cache-Control "no-store, no-cache, must-revalidate"
 # Expires header
 ExpiresActive On
 ExpiresDefault "now"

The necessary headers will be sent automatically, and you do not need to output them from PHP - caching is already disabled!
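
For reference, this is roughly what you would otherwise have to send from a PHP script yourself (a sketch, not from the article):

  <?php
  header("Cache-Control: no-store, no-cache, must-revalidate");
  header("Expires: " . gmdate("D, d M Y H:i:s") . " GMT");
  ?>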

See also the description of the Cache-Control HTTP header.

Caching with the .htaccess file

  # Allow caching in this folder
 # Requires the modules
 # mod_headers.c and mod_expires.c to be enabled
 #
 # Cache-Control header
 Header append Cache-Control "public"
 # Expires header
 ExpiresActive On
  ExpiresDefault "access plus 1 hours"
  #ExpiresDefault "access plus 10 years"

Caching javascript files using the .htaccess file

 ExpiresDefault "access plus 3 days"

Be careful with caching: after a file is changed, the user may get the new version only after 3 days!

How do I get html pages to handle php code?

Write the following lines in your .htaccess file:
  RemoveHandler .php .htm .html
 AddHandler application/x-httpd-php .php .htm .html

How to place several sites on one virtual hosting?

To host two or more sites on one shared-hosting account, regardless of the number of domains assigned to you, add the following lines to the ".htaccess" file:

  RewriteEngine On
 RewriteRule ^newdirectory/ - [L]
 RewriteCond %{HTTP_HOST} (www.)?newdomain.ru [NC]
 RewriteRule (.*) newdirectory/$1 [L]

Where:
newdirectory/ - the folder in which the second site will live
newdomain.ru - the domain for which we are doing the redirect

Please note that you will have a single mail account. That is, if you have a mailbox [email protected], then after the domain newdomain.ru is connected the mailbox [email protected] gets a second name - [email protected]. And when you create any new mailbox (for example, info), it is automatically given two names - [email protected] and [email protected].

Search for pages in more than one directory

Sometimes it is necessary to let the web server search for pages in more than one directory.

  RewriteEngine on

 # First, try to find the file in /your/docroot/dir1/...
 # ...and if it is found, finish the search:
 RewriteCond /your/docroot/dir1/%{REQUEST_FILENAME} -f
 RewriteRule ^(.+) /your/docroot/dir1/$1 [L]

 # Second, try to find it in /your/docroot/dir2/...
 # ...and if it is found, finish the search:
 RewriteCond /your/docroot/dir2/%{REQUEST_FILENAME} -f
 RewriteRule ^(.+) /your/docroot/dir2/$1 [L]

 # Otherwise fall through to other directives
 RewriteRule ^(.+) - [PT]

Virtual user hosts

If you want to provide www.subdomain.domain.com-style addresses for user pages, you can use the following set of rules to convert http://www.subdomain.domain.com/path into the internal path /home/subdomain/path:

  RewriteEngine on
 RewriteCond %{HTTP_HOST} ^www\.[^.]+\.ru$
 RewriteRule ^(.+) %{HTTP_HOST}$1 [C]
 RewriteRule ^www\.([^.]+)\.ru(.*) /home/$1$2

Files are damaged when they are uploaded to the server

If binary data gets corrupted when files are transferred through forms (with enctype="multipart/form-data" specified), write this directive in /cgi-bin/.htaccess:
  CharsetRecodeMultipartForms Off

Errors loading SWF files.
Errors when accessing pages that contain keywords
such as $_REQUEST

This can happen because of the mod_security module installed in Apache. By default it blocks request strings containing SQL arguments and other potentially dangerous commands.

Possible error messages:

Forbidden

You do not have permission to access /adm/index.php on this server. Additionally, a 404 Not Found error was encountered while trying to use an ErrorDocument to handle the request.
or
The request is unsafe and was rejected.
Add to .htaccess
 SecFilterEngine Off
 SecFilterScanPOST Off
For the message:
"POST /wp-admin/async-upload.php HTTP/1.1" 406 354 "-" "Shockwave Flash"
you can disable the protection only for uploading files to the server:
 SecFilterEngine Off
 SecFilterScanPOST Off

It is best to remove the protection only from the folder where it is needed, without removing it from the entire site.

Server Variables

These are variables of the form %{NAME_OF_VARIABLE}

Where NAME_OF_VARIABLE can be a string from the following list:

HTTP headers:
HTTP_USER_AGENT
HTTP_REFERER
HTTP_COOKIE
HTTP_FORWARDED
HTTP_HOST
HTTP_PROXY_CONNECTION
HTTP_ACCEPT

Connection & request:
REMOTE_ADDR
REMOTE_HOST
REMOTE_USER
REMOTE_IDENT
REQUEST_METHOD
SCRIPT_FILENAME
PATH_INFO
QUERY_STRING
AUTH_TYPE

Internal server:
DOCUMENT_ROOT
SERVER_ADMIN
SERVER_NAME
SERVER_ADDR
SERVER_PORT
SERVER_PROTOCOL
SERVER_SOFTWARE

System:
TIME_YEAR
TIME_MON
TIME_DAY
TIME_HOUR
TIME_MIN
TIME_SEC
TIME_WDAY
TIME

Special:
API_VERSION
THE_REQUEST
REQUEST_URI
REQUEST_FILENAME
IS_SUBREQ

These variables completely correspond to the similarly named MIME HTTP headers, and to Apache server variables or to the struct tm fields of Unix systems. Those that are special for mod_rewrite include:

IS_SUBREQ - contains the text "true" if the request is currently being processed as a subrequest, "false" otherwise. Subrequests can be generated by modules that need to resolve additional files or URIs in order to complete their own work.

API_VERSION - the version of the Apache API (the internal interface between the server and the modules) in the current server build, as defined in include/ap_mmn.h. The module API version corresponds to the version of Apache in use (for Apache 1.3.14, for example, it is 19990320:10), but this is mostly of interest to module authors.

THE_REQUEST - the full HTTP request line sent by the browser to the server (e.g., "GET /index.html HTTP/1.1"). It does not include any additional headers sent by the browser.

REQUEST_URI - The resource requested in the HTTP request string.

REQUEST_FILENAME - Full path in the server's file system to a file or script corresponding to this query.

Notes:

  1. The variables SCRIPT_FILENAME and REQUEST_FILENAME contain the same value, namely the value of the filename field of the internal request_rec structure of the Apache server. The first is simply the widely known CGI variable name, while the second is the consistent counterpart of REQUEST_URI (which contains the value of the uri field of request_rec).
  2. There is a special format: %{ENV:variable}, where variable can be any environment variable. It is looked up in the internal Apache structures and (if not found there) via getenv() from the Apache server process.
  3. There is a special format: %{HTTP:header}, where header can be any HTTP MIME header name. It is looked up in the HTTP request. Example: %{HTTP:Proxy-Connection} is the value of the HTTP header "Proxy-Connection:". A short sketch using formats 2 and 3 follows this list.
  4. There is a special format %{LA-U:variable} for look-aheads, which perform an internal (URL-based) subrequest to determine the final value of the variable. Use this when you need, for rewriting, a variable that is actually set later in some phase of the API and is therefore not yet available at the current stage. For example, if you want to rewrite based on the REMOTE_USER variable in the server context (httpd.conf file), you must use %{LA-U:REMOTE_USER}, because this variable is set in the authorization phases, which come after the URL translation phase in which mod_rewrite works. On the other hand, because mod_rewrite in the directory context (.htaccess file) is implemented via the fixup API phase, and the authorization phases come before that phase, you can simply use %{REMOTE_USER} there.
  5. There is a special format: %{LA-F:variable}, which performs an internal (filename-based) subrequest to determine the final value of the variable. This is essentially the same as the LA-U format above.
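
A short sketch of the %{HTTP:header} and %{ENV:variable} formats from notes 2 and 3 (the rules and paths here are illustrative assumptions, not from the article):

  # deny requests that carry a Proxy-Connection header
  RewriteCond %{HTTP:Proxy-Connection} !^$
  RewriteRule ^secret/ - [F]
  # test an environment variable set earlier with SetEnvIf
  SetEnvIf Referer "yandex\." from_yandex=1
  RewriteCond %{ENV:from_yandex} =1
  RewriteRule ^stats/ - [F]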

Home page without duplication

Usually the code of the main page physically resides in the file index.html or index.php, but the site should open at any of these requests: yoursite.ru, yoursite.ru/index.html, www.yoursite.ru and www.yoursite.ru/index.html. For search engines these are four different URLs! If .htaccess is not configured correctly, the search engine will add four identical pages to its index, which is a sign of a low-quality site. You can avoid this problem with the following code in .htaccess:

  Options +FollowSymLinks
 RewriteEngine on
 RewriteCond %{HTTP_HOST} ^yoursite.ru
 RewriteRule (.*) http://www.yoursite.ru/$1 [R=301,L]
 RewriteCond %{THE_REQUEST} ^[A-Z]{3,9}\ /index\.html\ HTTP/
 RewriteRule ^index\.html$ http://www.yoursite.ru/ [R=301,L]

All duplicate pages will be glued together by a 301 redirect to the main page - http://www.yoursite.ru/.

Duplicate pages without a slash at the end of the URL

To prevent the pages www.yoursite.ru/about and www.yoursite.ru/about/ from being indexed as different pages, add the following code:

Pages without a trailing slash will be redirected to the version with the slash.

  RewriteCond %{REQUEST_FILENAME} !-f
 RewriteCond %{REQUEST_URI} !(.*)/$
 RewriteRule ^(.*)$ /$1/ [R=301,L]

Saving files instead of opening

Many people have seen how, when trying to download an archive with the .rar extension, the browser opens it as plain text - a jumble of characters. This means the server is not configured to force the saving of file types that should not be opened in the browser.

  AddType application/octet-stream .rar .doc .mov .avi .pdf .xls .mp4