
The Apache server configuration file .htaccess


.htaccess (with a dot at the beginning of the name) is an Apache server configuration file that lets you configure the server for individual directories (folders) without granting access to the main configuration file. For example, you can set access permissions for files in a directory, change the names of index files, or handle Apache errors yourself by redirecting visitors to special error pages. .htaccess is a plain text file; the name, including the leading dot, is the complete filename rather than an extension. This file is usually located in the root of the site, but you can create additional .htaccess files for other directories of your site; each applies to its own directory and everything below it.
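For illustration, a minimal .htaccess combining these abilities might look like this (all paths here are hypothetical examples, not taken from the original page):

```apache
# Hypothetical .htaccess placed in the site root.
# Which files to serve when a bare directory is requested:
DirectoryIndex index.php index.html
# Custom "not found" page (the path is only an example):
ErrorDocument 404 /errors/404.html
# Never show raw directory listings:
Options -Indexes
```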

mod_rewrite is an Apache module used by web servers to rewrite (transform) URLs.

Mod_rewrite module directives

  • RewriteBase
  • RewriteCond
  • RewriteEngine
  • RewriteLock
  • RewriteLog
  • RewriteLogLevel
  • RewriteMap
  • RewriteOptions
  • RewriteRule
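A minimal sketch of how the main directives fit together (the front-controller script and parameter name below are hypothetical):

```apache
# Enable the rewrite engine for this directory:
RewriteEngine On
# Base path for per-directory rewrites (here: the site root):
RewriteBase /
# Condition: the requested file does not exist on disk...
RewriteCond %{REQUEST_FILENAME} !-f
# ...then route the request to a hypothetical front controller:
RewriteRule ^(.*)$ index.php?path=$1 [L,QSA]
```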

Ways to implement a redirect using the .htaccess file

  1. Simple redirect:
      Redirect 301 /
      Redirect /secret
    It is placed in the .htaccess file or in httpd.conf for Apache. The first "/" means that everything from the top level of the site, including all subdirectories, will be redirected (do not forget the trailing "/" on the target URL). If you want to redirect only a single page, preserving the PR of the old page, you can do this:

    Redirect 301 /old/old.htm
    where /old/old.htm is the path and name of the old page, followed by the new path and new name of the moved page.
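The target URLs were lost from the original page; the complete form of both directives, with hypothetical addresses, is:

```apache
# Redirect the entire site to a (hypothetical) new domain:
Redirect 301 / http://www.example.com/
# Redirect a single moved page, preserving its weight:
Redirect 301 /old/old.htm http://www.example.com/new/new.htm
```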

  2. Redirect to any page based on the user's IP, on the requested page, or on a name mask.
    If the user comes from the matching IP, he will be redirected to the page user.php:
      SetEnvIf REMOTE_ADDR REDIR="redir"
      RewriteCond %{REDIR} redir
      RewriteRule ^/$ /user.php
  3. Redirect when certain files are requested. If the requested file's extension is not among those listed in the .htaccess file (gif and jpg here), the visitor is redirected:
      RewriteEngine On
      RewriteRule !\.(gif|jpg)$ index.php

  4. Using mod_rewrite:
      Options +FollowSymLinks
      RewriteEngine on
      RewriteCond %{HTTP_HOST} ^yourdomain\.ru
      RewriteRule ^(.*)$ http://$1 [R=permanent,L]

  5. Redirect with a regular expression:
      RedirectMatch 301 (.*) http://$1
    It is placed in the .htaccess file.
    RedirectMatch matches its regular expression against the part of the URL after the domain name (the path beginning with "/"). For example, you can map pages with an .html extension to files of the same name with a .php extension:
      RedirectMatch 301 (.*)\.html$ http://$1.php
    If you need a different redirect for individual pages, you can use the following:
      RedirectMatch Permanent ^/html/resources.html$
      RedirectMatch Permanent ^/html/other_page.html$
      RedirectMatch Permanent ^/(.*)$ http://
    "RedirectMatch Permanent" is the equivalent of "RedirectMatch 301"; the line with the "(.*)" wildcard should be the last one in this list.

  6. Creating readable URLs
    To rewrite, for example, requests of the form /product/123 into product.php?id=123:
      RewriteEngine on
      RewriteRule ^product/([^/\.]+)/?$ product.php?id=$1 [L]
    In the following example, /cat/name/value/ is rewritten into /script.php?name=value:
      RewriteRule cat/(.*)/(.*)/$ /script.php?$1=$2

  7. Redirect in PHP:
      header("HTTP/1.1 301 Moved Permanently");
      header("Location:");
      exit();
    Naturally, you need to create a page that performs the redirect when requested, and upload it to the server. It is better to specify HTTP/1.1 (rather than HTTP/1.0 or HTTP/0.9, which do not support virtual hosting).

  8. Redirect all files in a folder to one file.
    For example, you no longer need the "Super Discount" section of the site and want to redirect all requests for the /superdiscount folder to the single file /hot-offers.php. To do this, add the following code to .htaccess:
      RewriteRule ^superdiscount(.*)$ /hot-offers.php [L,R=301]

  9. Redirect a whole folder except one file
    In the following example, all files from the /superdiscount folder will be redirected to /hot-offers.php, EXCEPT the file /superdiscount/my-ebook.html, which should be redirected to /hot-to-make-million.html:
      RewriteRule ^superdiscount/my-ebook.html /hot-to-make-million.html [L,R=301]
      RewriteRule ^superdiscount(.*)$ /hot-offers.php [L,R=301]

  10. Redirect a dynamic URL to a new file.
    This option is useful if you want to redirect a dynamic URL with parameters to a new static file. Note that RewriteRule itself never sees the query string, so the id parameter has to be matched with a RewriteCond:
      RewriteCond %{QUERY_STRING} ^id=(.*)$
      RewriteRule ^article\.jsp$ /latestnews.htm? [L,R=301]
    Now any request of the form article.jsp?id=... will be redirected to /latestnews.htm.

  11. Mass redirection to new files.
    Now let's move on to the hardest case: redirecting a large number of URLs, for example after changing your CMS. Several problems arise at once. First, entering all the changed addresses into the .htaccess file takes a very long time and is unpleasant work in itself. Second, too many entries in the .htaccess file slow down the Apache server. And third, with that amount of typing you are likely to make a mistake somewhere. So the best way out is to have a programmer write you a dynamic redirect.
    The following example is written in PHP, but the same can be done in any language. Suppose you switched to a new link scheme on your site and all files ending in the old id should be redirected to their new URLs. First, create a table in the database that contains the old id and the new URL for the redirect:
      old_id INT
      new_url VARCHAR(255)
    Next, write the code that maps your old ids to the new URLs.
    After that, add the following line to .htaccess:
      RewriteRule ^/product-(.*)_([0-9]+)\.php /redirectold.php?productid=$2
    then create the PHP file redirectold.php, which performs the 301 redirect:
      <?php
      function getRedirectUrl($productid) {
          // Connect to the database
          $dServer = "localhost";
          $dDb = "mydbname";
          $dUser = "mydb_user";
          $dPass = "password";
          $s = @mysql_connect($dServer, $dUser, $dPass)
              or die("Could not connect to database server");
          @mysql_select_db($dDb, $s)
              or die("Could not connect to database");
          $query = "SELECT new_url FROM redirects WHERE old_id = " . $productid;
          $result = mysql_query($query);
          $hasRecords = mysql_num_rows($result) == 0 ? false : true;
          if (!$hasRecords) {
              $ret = '';
          } else {
              while ($row = mysql_fetch_array($result)) {
                  $ret = '' . $row["new_url"];
              }
          }
          mysql_close($s);
          return $ret;
      }

      $productid = $_GET["productid"];
      $url = getRedirectUrl($productid);
      header("HTTP/1.1 301 Moved Permanently");
      header("Location: $url");
      exit();

    Now all requests for your old URLs will be handled by redirectold.php, which finds the new URL and returns a 301 response with your new link.

    Redirects based on time

    When content tricks that depend on the time of day are needed, many webmasters still use CGI scripts that generate redirects to special pages. How can this be done with mod_rewrite?

    There are many variables named TIME_xxx available in redirect conditions. Together with the special lexicographic comparison patterns <STRING, >STRING and =STRING, we can produce time-dependent redirects:
      RewriteEngine on
      RewriteCond %{TIME_HOUR}%{TIME_MIN} >0700
      RewriteCond %{TIME_HOUR}%{TIME_MIN} <1900
      RewriteRule ^foo\.html$ foo.day.html
      RewriteRule ^foo\.html$ foo.night.html

    This serves the contents of foo.day.html when the URL foo.html is requested between 07:00 and 19:00, and the contents of foo.night.html the rest of the time.

  12. Removing the leading "www." from all requests
      # announce that we want to use mod_rewrite
      RewriteEngine on
      RewriteCond %{HTTP_HOST} ^www\.(.*) [NC]
      RewriteRule ^/?(.*) http://%1/$1 [L,R=permanent]

  13. Changing the .html extension to .php
    Sometimes you have a static website and need a PHP script to work on it. To do this, tell the server to process such pages as PHP files:
      AddHandler application/x-httpd-php .html
    The same technique can be used for other file extensions:
      AddHandler application/x-httpd-php .xml
      AddHandler application/x-httpd-php .asp

Denying access to a specific directory

  1. for everyone, to all files in the directory:
      deny from all
  2. to a specific file, wrap the same directive in a <Files> container (the file name was lost from the original page; "private.html" below is a hypothetical example):
      <Files private.html>
       deny from all
      </Files>
  3. by user IP:
      order deny,allow
      deny from all
      allow from
    Access to this directory will be allowed only to the user with the listed IP.

    And if, on the contrary, you want to deny individual IPs access to your site, write the following lines:

      order allow,deny
      allow from all
      deny from
      deny from 123.456.177
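The concrete addresses were lost from the original page; a complete version with hypothetical IPs might look like this:

```apache
# Allow only one (hypothetical) address, deny everyone else:
order deny,allow
deny from all
allow from 192.0.2.10

# Or: allow everyone except two (hypothetical) entries;
# the second entry blocks the whole 192.0.2.* range by prefix:
order allow,deny
allow from all
deny from 192.0.2.77
deny from 192.0.2
```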
  4. The Options -Indexes directive forbids displaying the contents of a directory when there is no index file.
    Sometimes you need to make sure that, if the directory contains no file to show by default, the list of files in it is not displayed either. Then you can add this line to .htaccess:
      Options -Indexes
    In this case, instead of the list of files in the directory, the visitor will receive HTTP error 403 - access forbidden.
  5. Deny access to files with several extension types
      <Files ~ "\.(inc|conf|cfg)$">
       deny from all
      </Files>
    This forbids access to files with the extensions *.inc, *.conf and *.cfg. Although the <Files> directive does not work with regular expressions by default, you can turn them on by putting the tilde (~) symbol in the directive options. The syntax is: [tilde] [space] ["quoted regular expression"]. To block access to the .htaccess file itself, write the following:
      RewriteRule ^\.htaccess$ - [F]
    This rule reads as follows:
    if someone tries to access the .htaccess file, the system must return the error code 'HTTP response of 403' or '403 Forbidden - You do not have permission to access /.htaccess on this server'.

    The construct ^\.htaccess$ in this regular expression means:
    ^ - the start-of-string anchor
    $ - the end-of-string anchor
    \. - in regular expressions the dot '.' is a metacharacter, and it must be escaped with a backslash if you actually mean a literal dot.

    The file name must sit exactly between the start and end anchors. This guarantees that only this particular file name, and no other, will generate the error code.
    [F] is a special 'forbidding' flag.
    [NC] - case-insensitive match.
    [OR] - means 'or the next condition'.

Definition of encoding

Determine the encoding in which the server "returns" files

  AddDefaultCharset windows-1251
options: KOI8-R , UTF-8 , Windows-1251

Determining the encoding for uploaded files

  CharsetSourceEnc windows-1251

Setting a password for a directory using .htaccess

To set a password for the directory, you can use the basic authorization system provided in the Apache web server. Create a .htaccess file with the following directives in the directory to which we want to restrict access by password:
  AuthType Basic
 AuthName "Some Name"
 AuthUserFile /www/some_login/www/htdocs/some_dir/.htpasswd
 require valid-user
The path /www/some_login/www/htdocs/some_dir/.htpasswd is the full path to the password file on our server's disk. If, for example, you put the .htpasswd file (it will hold the passwords) in the directory you land in when connecting to the server via FTP, the path to this file will be /www/some_login/www/htdocs/some_dir/.htpasswd, where some_login is your login. In the AuthUserFile directive we specify the absolute path to the file with logins and passwords, which we will create a little later. If you create the .htaccess file on your own computer rather than directly on the server, pay special attention to the fact that .htaccess must be transferred via FTP strictly in text (ASCII) mode.

Create a password file. The password file must contain lines of the form login:password. The password must be encrypted using the MD5 algorithm. One way to create such a file is to use the program included in the Apache distribution - htpasswd (on our server it is in the /usr/local/apache/bin directory; the full path is /usr/local/apache/bin/htpasswd).

Consider how to create a password file in the unix shell directly on the server. We go into the shell and execute the following commands:

  htpasswd -mbc .htpasswd user1 7B1safkir
- creates a new .htpasswd file and adds an entry for user1 with the password given on the command line.
  htpasswd .htpasswd user2
- adds user2 to the existing .htpasswd file, with the password entered manually at the program's prompt.

After all the accounts have been created, the .htpasswd file with the logins must be uploaded to the server.

Define your own error pages

You can set your own error page as follows:
  ErrorDocument 404
Note that Internet Explorer ignores error pages smaller than 512 bytes.
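The page address after the status code was lost above; the full form of the directive, with hypothetical paths, is:

```apache
# Serve a custom page (hypothetical path) for "not found" errors:
ErrorDocument 404 /errors/404.html
# The same works for other statuses, e.g. forbidden and server error:
ErrorDocument 403 /errors/403.html
ErrorDocument 500 /errors/500.html
```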

Indexing directories and subdirectories

So that search engines and visitors do not see raw listings of directories and subdirectories, specify an index file, for example:
  DirectoryIndex index.php
This directive sets the file to be served when a directory is requested without a file name.

You can specify multiple index pages. When a directory is requested, they will be searched in the order listed in the DirectoryIndex directive. If the index.html file is not found, the index.php file will be searched, etc.

  DirectoryIndex index.html index.php index.shtml

Personally, I prefer to redirect from empty directories either to the main page of the site or to some other suitable page.
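Such a redirect for an empty directory can be sketched as follows (the directory name and target domain are hypothetical):

```apache
# Send requests for a (hypothetical) empty directory to the front page:
RedirectMatch 301 ^/images/$ http://www.example.com/
```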

Protect images from downloading

Very often it happens that webmasters impudently copy content from your site along with drawings, and drawings are downloaded from your own server. This creates unnecessary traffic, which often leads to a number of problems. How can you protect yourself from such webmasters and not prevent search engines from indexing images? It's simple:
  RewriteEngine on
  RewriteCond %{HTTP_REFERER} .
  RewriteCond %{HTTP_REFERER} !^http://([^.]+\.)?site\. [NC]
  RewriteCond %{HTTP_REFERER} !google\. [NC]
  RewriteCond %{HTTP_REFERER} !search\?q=cache [NC]
  RewriteCond %{HTTP_REFERER} !msn\. [NC]
  RewriteCond %{HTTP_REFERER} !yahoo\. [NC]
  RewriteCond %{REQUEST_URI} !^/hotlinker\.gif$
  RewriteRule \.(gif|jpg|png)$ /hotlinker.gif [NC,L]
hotlinker.gif is the image that will be displayed instead of the real images. I recommend putting your logo and a link to your site in this image.

Another variation of the prohibition of access to pictures from unauthorized sites:

  SetEnvIfNoCase Referer "^$" local_ref=1
  SetEnvIfNoCase Referer "^http://(www\.)?htmlweb\.ru" local_ref=1
  SetEnvIfNoCase Referer "^http://(www\.)?images\.yandex\.ru" local_ref=1
  SetEnvIfNoCase Referer "^http://(www\.)?hghltd\.yandex\.com" local_ref=1
  Order Allow,Deny
  Allow from env=local_ref

Search engines and all sorts of scanners create colossal traffic on your site. The following code block denies them access to the site.

  RewriteCond %{HTTP_USER_AGENT} (Googlebot|Slurp|spider|Twiceler|heritrix|Combine|appie|boitho|e-SocietyRobot|Exabot|Nutch|OmniExplorer|MJ12bot|ZyBorg/1|Ask\ Jeeves|AskJeeves|ActiveTouristBot|JemmaTheTourist|agadine3|BecomeBot|Clustered-Search-Bot|MSIECrawler|freefind|galaxy|genieknows|INGRID|grub-client|MojeekBot|NaverBot|NetNose-Crawler|OnetSzukaj|PrassoSunner|Asterias\ Crawler|THUNDERSTONE|GeorgeTheTouristBot|VoilaBot|Vagabondo|fantomBrowser|stealthBrowser|cloakBrowser|fantomCrew\ Browser|Girafabot|Indy\ Library|Intelliseek|Zealbot|Windows\ 95|^Mozilla/4\.05\ \[en\]$|^Mozilla/4\.0$) [NC]
  RewriteRule ^(.*)$ - [F]
  RewriteCond %{HTTP_USER_AGENT} ^Mozilla.* [NC,OR]
  RewriteCond %{HTTP_USER_AGENT} ^Opera.* [NC,OR]
  RewriteCond %{HTTP_USER_AGENT} ^Firefox.* [NC,OR]
  RewriteCond %{HTTP_USER_AGENT} ^Netscape.* [NC]
  RewriteRule ^(.*)$ - [L]
  RewriteRule ^(.*)$ - [F]
(the RewriteCond with the long list of user agents must be written on a single line)

Tracking calls to your robots.txt file

To have more information about visiting search engines, it is useful to have detailed information about accessing the robots.txt file. In order to formalize this, in '.htaccess' there should be the following entries:
  RewriteEngine on
  Options +FollowSymlinks
  RewriteBase /
  RewriteRule ^robots\.txt$ /robot.php?%{REQUEST_URI}
Now, when the 'robots.txt' file is requested, our RewriteRule redirects the visitor (robot) to the robot.php script. The REQUEST_URI variable, containing the name of the requested file ('robots.txt' in this example), is passed to the script, which can process it according to your needs: read the contents of 'robots.txt' and send it to the web browser or search engine robot. This way we can count visitor hits and write log files.


To disable the addition of PHPSESSID to the URL, put in the beginning index.php:

  ini_set("session.use_trans_sid", 0);

Either in .htaccess, write:

  php_flag session.use_trans_sid Off

If you find all this difficult, use a ready-made service that converts dynamic URLs to static ones using htaccess.

Cache directives

Caching for all file types by access time
  ExpiresActive on
  ExpiresDefault "access plus 600 seconds"
Caching for all file types by modification time
  ExpiresActive on
  ExpiresDefault "modification plus 600 seconds"
Caching for certain file types
  ExpiresByType text/css "modification plus 600 seconds"
  ExpiresByType image/jpeg "modification plus 600 seconds"
  ExpiresByType image/gif "modification plus 600 seconds"
  ExpiresByType image/x-ico "modification plus 600 seconds"
  ExpiresByType image/png "modification plus 600 seconds"

Disallowing caching with the Apache server

Open the Apache server configuration file httpd.conf and uncomment the following lines:

 LoadModule expires_module modules/mod_expires.so
 LoadModule headers_module modules/mod_headers.so
 AddModule mod_expires.c
 AddModule mod_headers.c

Write in .htaccess the following:

  # Disable caching in this folder
  # the mod_headers.c and mod_expires.c modules must be enabled
  # Cache-Control header:
  Header append Cache-Control "no-store, no-cache, must-revalidate"
  # Expires header:
  ExpiresActive On
  ExpiresDefault "now"

The necessary headers will be transferred automatically, and you do not need to write them specially in PHP - the cache is already off!

See also the description of the Cache-Control HTTP caching header.

Caching with .htaccess file

  # Allow caching in this folder
  # the mod_headers.c and mod_expires.c modules must be enabled
  # Cache-Control header:
  Header append Cache-Control "public"
  # Expires header:
  ExpiresActive On
  ExpiresDefault "access plus 1 hours"
  #ExpiresDefault "access plus 10 years"

Caching javascript files using the .htaccess file

 ExpiresDefault "access plus 3 days"

Be careful with caching: after a file changes, users may only receive the new version three days later!
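To limit the three-day lifetime to JavaScript files only, instead of making it the directory default, the directive can be scoped with a FilesMatch container (a sketch, assuming mod_expires is enabled as above):

```apache
<FilesMatch "\.js$">
 ExpiresActive On
 ExpiresDefault "access plus 3 days"
</FilesMatch>
```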

How do I get html pages to handle php code?

Write the following lines in your .htaccess file:
  RemoveHandler .php .htm .html
  AddHandler application/x-httpd-php .php .htm .html

How to place several sites on one virtual hosting?

To place two or more sites on one virtual hosting, in spite of the number of domains assigned to you, you need to add the following lines in the file ".htaccess":

  RewriteEngine On
  RewriteRule ^newdirectory/ - [L]
  RewriteCond %{HTTP_HOST} (www.)? [NC]
  RewriteRule (.*) newdirectory/$1 [L]

newdirectory/ is the folder where the second site will be located; the RewriteCond matches the domain for which we are redirecting.

Please note that you will have a single mail account. That is, after the second domain is connected, each existing mailbox receives a second name under the new domain, and any newly created mailbox (for example, info) is automatically assigned two names, one in each domain.

Search for pages in more than one directory

Sometimes it is necessary to let the web server search for pages in more than one directory.

  RewriteEngine on

  # first, try to find it in dir1/...
  # ...and stop the search if found:
  RewriteCond /your/docroot/dir1/%{REQUEST_FILENAME} -f
  RewriteRule ^(.+) /your/docroot/dir1/$1 [L]

  # second, try to find it in dir2/...
  # ...and stop the search if found:
  RewriteCond /your/docroot/dir2/%{REQUEST_FILENAME} -f
  RewriteRule ^(.+) /your/docroot/dir2/$1 [L]

  # otherwise continue with other directives
  RewriteRule ^(.+) - [PT]

Virtual user hosts

If you want to provide addresses of the form www.subdomain.ru for user pages, you can use the following set of rules to convert them to the internal path /home/subdomain/path:

  RewriteEngine on
  RewriteCond %{HTTP_HOST} ^www\.[^.]+\.ru$
  RewriteRule ^(.+) %{HTTP_HOST}$1 [C]
  RewriteRule ^www\.([^.]+)\.ru(.*) /home/$1$2

Files are damaged when they are uploaded to the server

If binary data gets corrupted when transferring files through forms (with enctype="multipart/form-data" specified), write this directive in /cgi-bin/.htaccess:
  CharsetRecodeMultipartForms Off

Errors when uploading SWF files.
Errors when accessing pages containing keywords
such as $_REQUEST

This can happen because of the mod_security module installed in Apache. By default, it blocks request strings containing SQL arguments and other potentially dangerous commands.

Possible error messages:


You do not have permission to access /adm/index.php on this server. Additionally, a 404 Not Found error was encountered while trying to use an ErrorDocument to handle the request.
The request is unsafe and was rejected.
Add to .htaccess
 SecFilterEngine Off
 SecFilterScanPOST Off
For the message:
  "POST /wp-admin/async-upload.php HTTP/1.1" 406 354 "-" "Shockwave Flash"
you can remove the protection only for uploading files to the server:
 SecFilterEngine Off
 SecFilterScanPOST Off

It is best to remove the protection only from the folder where it is necessary, without removing it from the entire site.
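One way to scope the exception is to place those directives in the .htaccess of the affected folder only; for a single script they can also be wrapped in a <Files> container (the file name is taken from the log line above):

```apache
# In .htaccess: relax mod_security for one upload script only
<Files async-upload.php>
 SecFilterEngine Off
 SecFilterScanPOST Off
</Files>
```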

Server variables

These are variables of the form %{NAME_OF_VARIABLE},

where NAME_OF_VARIABLE can be a string from the following list:

HTTP headers: HTTP_USER_AGENT, HTTP_REFERER, HTTP_COOKIE, HTTP_FORWARDED, HTTP_HOST, HTTP_PROXY_CONNECTION, HTTP_ACCEPT
connection & request: REMOTE_ADDR, REMOTE_HOST, REMOTE_USER, REMOTE_IDENT, REQUEST_METHOD, SCRIPT_FILENAME, PATH_INFO, QUERY_STRING, AUTH_TYPE
server internals: DOCUMENT_ROOT, SERVER_ADMIN, SERVER_NAME, SERVER_ADDR, SERVER_PORT, SERVER_PROTOCOL, SERVER_SOFTWARE
system: TIME_YEAR, TIME_MON, TIME_DAY, TIME_HOUR, TIME_MIN, TIME_SEC, TIME_WDAY, TIME
special: IS_SUBREQ, API_VERSION, THE_REQUEST, REQUEST_URI, REQUEST_FILENAME

These variables correspond exactly to the similarly named HTTP MIME headers, to Apache server variables, or to the struct tm fields of Unix systems. Those special to mod_rewrite include:

IS_SUBREQ - contains the text "true" if the request is currently being processed as a sub-request, "false" otherwise. Sub-requests may be generated by modules that need to deal with additional files or URIs in order to complete their own tasks.

API_VERSION - the version of the Apache API (the internal interface between the server and its modules) in the current server build, as defined in include/ap_mmn.h. The API version corresponds to the version of Apache in use (for Apache 1.3.14, for example, it is 19990320:10), but it is mainly of interest to module authors.

THE_REQUEST - the full HTTP request line sent by the browser to the server (e.g. "GET /index.html HTTP/1.1"). It does not include any additional headers sent by the browser.

REQUEST_URI - The resource requested in the HTTP request string.

REQUEST_FILENAME - Full path in the server's file system to a file or script corresponding to this query.


  1. The variables SCRIPT_FILENAME and REQUEST_FILENAME contain the same value, namely the value of the filename field of the internal request_rec structure of the Apache server. The first name is simply the widely known CGI variable name, while the second is the counterpart of REQUEST_URI (which contains the value of the uri field of request_rec).
  2. There is a special format: %{ENV:variable}, where variable can be any environment variable. It is looked up in the internal Apache structures and (if not found there) via getenv() from the Apache server process.
  3. There is a special format: %{HTTP:header}, where header can be any HTTP MIME header name. It is looked up in the HTTP request. Example: %{HTTP:Proxy-Connection} is the value of the HTTP header "Proxy-Connection:".
  4. There is a special format %{LA-U:variable} for look-aheads which perform an internal (URL-based) sub-request to determine the final value of the variable. Use this when you need, for the rewriting, a variable that is actually set later in some phase of the API and is therefore not available at the current stage. For example, to rewrite according to the REMOTE_USER variable from within the per-server context (httpd.conf file), you must use %{LA-U:REMOTE_USER}, because this variable is set by the authorization phases, which come after the URL translation phase in which mod_rewrite operates. On the other hand, because mod_rewrite implements its per-directory context (.htaccess file) via the Fixup phase of the API, and because the authorization phases come before this phase, you can just use %{REMOTE_USER} there.
  5. There is a special format: %{LA-F:variable}, which performs an internal (filename-based) sub-request to determine the final value of the variable. Most of the time this is the same as LA-U above.
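A short sketch of these variables in use (the bot name and the Sunday page below are arbitrary examples, not from the original):

```apache
RewriteEngine on
# %{HTTP:header} form: read an arbitrary request header
RewriteCond %{HTTP:User-Agent} BadBot [NC]
RewriteRule ^(.*)$ - [F]
# TIME_* variables: serve a different page on Sundays (wday 0)
RewriteCond %{TIME_WDAY} =0
RewriteRule ^index\.html$ /sunday.html [L]
```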

Home page without duplication

Usually the main page's code physically lives in the index.html or index.php file, but the site should open both with and without "www." and both with and without "/index.html" at the end of the request. For search engines, however, these are four different URLs! If you do not configure .htaccess correctly, the search engine will add four identical pages to its index, which is a sign of a substandard website. You can avoid this problem with code like this in .htaccess:

  Options +FollowSymLinks
  RewriteEngine on
  RewriteCond %{HTTP_HOST} ^
  RewriteRule (.*) http://$1 [R=301,L]
  RewriteCond %{THE_REQUEST} ^[A-Z]{3,9}\ /index\.html\ HTTP/
  RewriteRule ^index\.html$ [R=301,L]

All duplicate pages will be glued to the main page with a 301 redirect.

Duplicate pages without a slash at the end of the URL

To prevent a page with and without a trailing slash from being indexed as two different pages, add the following code:

From pages without a slash, a redirect is set up to the "slash" version.

  RewriteCond %{REQUEST_FILENAME} !-f
  RewriteCond %{REQUEST_URI} !(.*)/$
  RewriteRule ^(.*)$ /$1/ [R=301,L]

Saving files instead of opening

Many have seen how, when trying to download an archive with a .rar extension, the browser opens it as a simple text from a hash of characters. This means that the server site is not configured to forcefully save file types that should not be opened in the browser.

  AddType application/octet-stream .rar .doc .mov .avi .pdf .xls .mp4