
The .htaccess Apache server configuration file


.htaccess (with a dot at the beginning of the name) is an Apache server configuration file that lets you configure the server for individual directories (folders) without giving access to the main configuration file: for example, set access permissions on files in a directory, change the names of index files, or handle Apache errors yourself by redirecting visitors to custom error pages. .htaccess is a plain text file; the dot-prefixed name is the whole file name, not an extension. It is usually located in the root of the site, but you can also create additional .htaccess files for individual directories of your site.

mod_rewrite is an Apache module that rewrites requested URLs using regular-expression rules.
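The core idea can be sketched outside Apache: a rewrite rule is essentially a regular-expression substitution applied to the requested URL. A minimal Python sketch (the pattern and target names here are illustrative, not taken from the original):

```python
import re

def rewrite(url: str, pattern: str, replacement: str) -> str:
    """Apply one mod_rewrite-style substitution to a URL path."""
    return re.sub(pattern, replacement, url)

# Map a pretty URL onto the script that actually serves it
print(rewrite("/article/42", r"^/article/(\d+)$", r"/article.php?id=\1"))
# A URL that does not match the pattern passes through unchanged
print(rewrite("/about.html", r"^/article/(\d+)$", r"/article.php?id=\1"))
```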

Mod_rewrite module directives

  • RewriteBase
  • RewriteCond
  • RewriteEngine
  • RewriteLock
  • RewriteLog
  • RewriteLogLevel
  • RewriteMap
  • RewriteOptions
  • RewriteRule

Ways to implement a redirect using the .htaccess file

  1. Simple redirect:
      Redirect 301 /
      Redirect /secret
    Put this in .htaccess or in httpd.conf for Apache. The first "/" means that everything from the top level of the site, including all subdirectories, will be redirected to the new address given after it (do not forget the trailing "/"). If you want to redirect only a single page, keeping the PR of the old page, you can do this:
      Redirect 301 /old/old.htm
    where /old/old.htm is the path and name of the old page, followed by the new path and new name of the moved page.

  2. Redirect to any page by the user's IP, or when a specific page is requested (a name mask also works).
    If the user comes from the matching IP address, he will be redirected to the user.php page:
      SetEnvIf REMOTE_ADDR REDIR="redir"
      RewriteCond %{REDIR} redir
      RewriteRule ^/$ /user.php
  3. Redirect when certain files are requested. If a file whose extension is not listed in the .htaccess file (here gif and jpg) is requested, a redirect is performed:
      RewriteEngine On
      RewriteRule !\.(gif|jpg)$ index.php

  4. Using mod_rewrite:
      Options +FollowSymLinks
      RewriteEngine on
      RewriteCond %{HTTP_HOST} ^yourdomain\.ru
      RewriteRule ^(.*)$ http://$1 [R=permanent,L]

  5. Redirect using a regular expression:
      RedirectMatch 301 (.*) http://$1
    This goes in the .htaccess file. RedirectMatch matches the regular-expression pattern against everything after the domain name, so you cannot anchor the pattern on ^/. However, you can map pages with the .html extension to files of the same name with the .php extension:
      RedirectMatch 301 (.*)\.html$ http://$1.php
    If you need a different redirect for individual pages, you can use the following:
      RedirectMatch Permanent ^/html/resources.html$
      RedirectMatch Permanent ^/html/other_page.html$
      RedirectMatch Permanent ^/(.*)$ http://
    "RedirectMatch Permanent" is equivalent to "RedirectMatch 301"; the line with the wildcard "(.*)" must be the last one in this list.

  6. Creating a readable URL
    To convert, for example, /product/123/ into /product.php?id=123, use:
      RewriteEngine on
      RewriteRule ^product/([0-9]+)/? /product.php?id=$1 [L]
    The following example converts /cat/param/value/ into /script.php?param=value:
      RewriteRule cat/(.*)/(.*)/$ /script.php?$1=$2
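These readable-URL rules can be sanity-checked with plain regular expressions; a Python sketch (the example paths and the `$`-anchored variant are assumptions for the demo):

```python
import re

def product_rewrite(path):
    # Mirrors a rule of the form: ^product/([0-9]+)/?$ -> /product.php?id=$1
    m = re.match(r"^product/([0-9]+)/?$", path)
    return f"/product.php?id={m.group(1)}" if m else path

def cat_rewrite(path):
    # Mirrors: cat/(.*)/(.*)/$ -> /script.php?$1=$2
    m = re.search(r"cat/(.*)/(.*)/$", path)
    return f"/script.php?{m.group(1)}={m.group(2)}" if m else path

print(product_rewrite("product/123/"))  # /product.php?id=123
print(cat_rewrite("cat/size/10/"))      # /script.php?size=10
```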

  7. Redirect in PHP:
      header("HTTP/1.1 301 Moved Permanently");
      header("Location: ");
      exit();
    Naturally, you need to create the page that performs the redirect and place it on the server. Use HTTP/1.1 (and not HTTP/1.0 or HTTP/0.9, which do not support virtual hosting).

  8. Redirect all files in a folder to one file.
    For example, you no longer need the Super discount section of the site and want to redirect all requests to the /superdiscount folder to the single file /hot-offers.php. To do this, add the following code to .htaccess:
      RewriteRule ^superdiscount(.*)$ /hot-offers.php [L,R=301]

  9. Redirect the entire folder except one file
    In the following example, all files from the /superdiscount folder will be redirected to the /hot-offers.php file, EXCEPT /superdiscount/my-ebook.html, which should be redirected to /hot-to-make-million.html:
      RewriteRule ^superdiscount/my-ebook.html /hot-to-make-million.html [L,R=301]
      RewriteRule ^superdiscount(.*)$ /hot-offers.php [L,R=301]

  10. Redirect a dynamic URL to a new file.
    This option is useful if you want to redirect a dynamic URL with parameters to a new static file.
      RewriteRule ^article.jsp?id=(.*)$ /latestnews.htm [L,R=301]
    That is, a request for the old dynamic article URL will now be sent to the /latestnews.htm file. (Strictly speaking, RewriteRule does not see the query string; matching on id= requires a RewriteCond on %{QUERY_STRING}.)

  11. Mass redirect of new files.
    Now let's move on to the hardest case: when you need to redirect a great many URLs, for example after changing your CMS. Several problems arise at once. First, adding all the changed addresses to the .htaccess file will take a lot of time and is unpleasant in itself. Second, too many entries in the .htaccess file will slow down the Apache server. And third, when entering that much information by hand, you are very likely to make a mistake somewhere. Therefore, the best way out is to write (or hire a programmer to write) a dynamic redirect.
    The following example is written in PHP, but the same can be done in any language. Suppose you have switched to a new link scheme on your site and all files ending in the old id should be redirected. First, create a table in the database that holds the old id and the new URL for the redirect: old_id INT, new_url VARCHAR(255). Next, write the code that maps your old ids to the new URLs.
    After that, add the following line to .htaccess:
      RewriteRule ^/product-(.*)_([0-9]+)\.php /redirectold.php?productid=$2
    Then create the PHP file redirectold.php, which will perform the 301 redirect:
      <?php
      function getRedirectUrl($productid) {
          // Connect to the database
          $dServer = "localhost";
          $dDb = "mydbname";
          $dUser = "mydb_user";
          $dPass = "password";

          $s = @mysql_connect($dServer, $dUser, $dPass)
              or die("Couldn't connect to database server");
          @mysql_select_db($dDb, $s)
              or die("Couldn't connect to database");

          $query = "SELECT new_url FROM redirects WHERE old_id = " . $productid;
          $result = mysql_query($query);
          $hasRecords = mysql_num_rows($result) == 0 ? false : true;
          if (!$hasRecords) {
              $ret = '';
          } else {
              while ($row = mysql_fetch_array($result)) {
                  $ret = '' . $row["new_url"];
              }
          }
          mysql_close($s);
          return $ret;
      }

      $productid = $_GET["productid"];
      $url = getRedirectUrl($productid);
      header("HTTP/1.1 301 Moved Permanently");
      header("Location: $url");
      exit();

    Now all requests to your old URLs will invoke redirectold.php, which will look up the new URL and return a 301 response with the new link.
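Stripped of the database plumbing, the lookup logic of redirectold.php is just a mapping from old id to new URL. A Python sketch (the table contents below are invented for illustration):

```python
# In-memory stand-in for the `redirects` table (old_id -> new_url);
# the ids and URLs are invented example data
redirects = {101: "/products/red-widget/", 102: "/products/blue-widget/"}

def get_redirect_url(product_id):
    """Return the new URL for an old product id, or '' if none is known."""
    return redirects.get(product_id, "")

print(get_redirect_url(101))  # /products/red-widget/
print(get_redirect_url(999))  # empty string: no redirect recorded
```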

    Redirects depending on time

    When tricks such as time-dependent content are needed, many webmasters still use CGI scripts that redirect to special pages. How can this be done via mod_rewrite?

    There are many variables named TIME_xxx available in rewrite conditions. In conjunction with the special lexicographic comparison patterns <STRING, >STRING and =STRING we can produce time-dependent redirects:
      RewriteEngine on
      RewriteCond %{TIME_HOUR}%{TIME_MIN} >0700
      RewriteCond %{TIME_HOUR}%{TIME_MIN} <1900
      RewriteRule ^foo\.html$ foo.day.html
      RewriteRule ^foo\.html$ foo.night.html

    This serves the contents of foo.day.html for requests to the URL foo.html from 07:00 to 19:00, and the contents of foo.night.html the rest of the time.
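The comparison works because %{TIME_HOUR}%{TIME_MIN} is zero-padded, so lexicographic string order coincides with chronological order. A Python sketch of the same test (foo.day.html is the assumed name for the daytime page; the original text names only foo.night.html):

```python
from datetime import datetime

def pick_page(now):
    """Mirror of the >0700 and <1900 lexicographic comparisons on HHMM."""
    hhmm = f"{now.hour:02d}{now.minute:02d}"  # zero-padded like %{TIME_HOUR}%{TIME_MIN}
    return "foo.day.html" if "0700" < hhmm < "1900" else "foo.night.html"

print(pick_page(datetime(2024, 1, 1, 12, 30)))  # foo.day.html
print(pick_page(datetime(2024, 1, 1, 23, 5)))   # foo.night.html
```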

  12. Removing the leading "www." from all requests
      # announce that we want to use mod_rewrite
      RewriteEngine on
      RewriteCond %{HTTP_HOST} ^www\.(.*) [NC]
      RewriteRule ^/?(.*) http://%1/$1 [L,R=permanent]

  13. Change the extension .html to .php
    Sometimes it happens that you have a static website and you need to run some PHP code on it. To do this, you need to tell the server to process such pages as PHP files:
      AddHandler application/x-httpd-php .html
    The same technique can be used for other file extensions:
      AddHandler application/x-httpd-php .xml
      AddHandler application/x-httpd-php .asp

Deny access to a specific directory

  1. to all files in the directory, for everyone:
      deny from all
  2. to a specific file (the deny must be wrapped in a Files container; substitute the file name):
      <Files "myfile.html">
      deny from all
      </Files>
  3. by user IP:
      order deny,allow
      deny from all
      allow from 
    Access to this directory will be allowed only from the IP address given in the allow from line.

    And if you want the opposite, to block individual IP addresses from accessing your site, write the following lines:

      order allow,deny
      allow from all
      deny from 
      deny from 123.456.177
  4. The Options -Indexes directive: forbid showing the contents of a directory when there is no index file. Sometimes you need to make sure that, if the file shown by default is missing from a directory, no list of the directory's files is displayed. Then add this line to .htaccess:
      Options -Indexes
    In this case, instead of a list of files, the visitor will receive HTTP error 403: access forbidden.
  5. Deny access to files with several types of extensions:
      <Files ~ "\.(inc|conf|cfg)$">
      deny from all
      </Files>
    Access to files with the extensions *.inc, *.conf and *.cfg is denied. Although the Files directive does not work with regular expressions by default, you can enable them by putting the tilde (~) character in the directive's options. The syntax is: [tilde][space][quoted regular expression]. To block access to the .htaccess file itself, write the following:
      RewriteRule ^\.htaccess$ - [F]
    This rule translates as: if someone tries to access the .htaccess file, the system returns the error code 'HTTP response of 403' or '403 Forbidden - you do not have permission to access /.htaccess on this server'.

    The construction ^\.htaccess$ in this regular expression means:
    ^ - start-of-string anchor
    $ - end-of-string anchor
    \. - in regular expressions the dot '.' is a metacharacter, so it must be escaped with a backslash if you actually mean a literal dot.

    The file name must sit exactly between the two anchors. This guarantees that only this exact file name, and no other, will generate the error code.
    [F] - the special 'forbidden' flag.
    [NC] - ignore letter case.
    [OR] - means 'or the next condition'.
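The effect of the anchors and the escaped dot can be verified directly in Python (the alternative file names are just examples):

```python
import re

blocked = re.compile(r"^\.htaccess$")  # exact-name match: anchored on both sides

for name in (".htaccess", "my.htaccess", ".htaccess.bak"):
    verdict = "403 Forbidden" if blocked.match(name) else "allowed"
    print(name, "->", verdict)
```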

Encoding definition

Determining the encoding in which the server serves files:

  AddDefaultCharset windows-1251
Options: KOI8-R , UTF-8 , Windows-1251

Defining the encoding for uploaded files

  CharsetSourceEnc windows-1251

Setting a password for a directory using .htaccess

To set a password for a directory, you can use the basic authorization system provided in the Apache web server. Create in the directory to which we want to restrict access by password, an .htaccess file with the following directives:
  AuthType Basic
 AuthName "Some Name"
 AuthUserFile /www/some_login/www/htdocs/some_dir/.htpasswd
 require valid-user
The path /www/some_login/www/htdocs/some_dir/.htpasswd is the full path to the password file on the server's disk. If, for example, you place the .htpasswd file (the passwords will be stored in it) in the home directory you land in when logging into the server via FTP, then the path to this file will look like /www/some_login/www/htdocs/some_dir/.htpasswd, where some_login is your login. The AuthUserFile directive takes the absolute path to the file with logins and passwords, which we will create a little later. If you create the .htaccess file on your own computer with a text editor, rather than directly on the server, note that .htaccess must be transferred via FTP strictly in text (ASCII) mode.

Create the password file. It must contain lines of the form login:password. The password must be encrypted with the MD5 algorithm. One way to create such a file is with the htpasswd program included in the Apache distribution (on our server it is located in the /usr/local/apache/bin directory; the full path is /usr/local/apache/bin/htpasswd).

Consider how to create a password file in unix shell directly on the server. Go to the shell and execute the following commands:

  htpasswd -mbc .htpasswd user1 7B1safkir
- creates a new .htpasswd file and adds an entry for user1 with the password given on the command line.
  htpasswd .htpasswd user2
- adds the user user2 to the existing .htpasswd file; the password is entered manually at the program's prompt.

After all the logins have been entered, the file must be uploaded to the server. Other ways of setting passwords are described on a separate page.

Set your own error pages

You can set your own error page as follows:
  ErrorDocument 404
(the directive takes the URL of your error page as its second argument). Note that IE ignores error pages smaller than 512 bytes.

Indexing directories and subdirectories

To keep search engines from indexing the contents of directories and subdirectories, add a line such as:
  DirectoryIndex index.php
This directive sets the file that will be called when accessing the directory without specifying the file name.

You can specify multiple index pages. When a directory is requested, they will be searched in the order listed in the DirectoryIndex directive. If the index.html file is not found, the index.php file, etc. will be searched.

  DirectoryIndex index.html index.php index.shtml

Personally, I prefer to redirect from empty directories either to the main page of the site or to some other suitable page.

Protecting images from downloading

It often happens that other webmasters brazenly copy content from your site together with the images, and the images are served from your own server. This creates extra traffic, which often leads to a number of problems. How do you protect yourself from such webmasters without preventing search robots from indexing the images? It's simple:
  RewriteEngine on
 RewriteCond %{HTTP_REFERER} .
 RewriteCond %{HTTP_REFERER} !^http://([^.]+\.)?site\. [NC]
 RewriteCond %{HTTP_REFERER} !google\. [NC]
 RewriteCond %{HTTP_REFERER} !search\?q=cache [NC]
 RewriteCond %{HTTP_REFERER} !msn\. [NC]
 RewriteCond %{HTTP_REFERER} !yahoo\. [NC]
 RewriteCond %{REQUEST_URI} !^/hotlinker\.gif$
 RewriteRule \.(gif|jpg|png)$ /hotlinker.gif [NC,L]
hotlinker.gif is the image that will be shown instead of the real images. I recommend putting your logo and your site's address in this image.
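The logic of those conditions, expressed in Python (the allowed hosts and search-engine substrings are placeholders standing in for your own domain list):

```python
from urllib.parse import urlparse

OWN_HOSTS = ("example.com", "www.example.com")   # placeholder for your site\. condition
SEARCH = ("google.", "msn.", "yahoo.", "search?q=cache")

def serve_image(referer, path):
    """Return the real image, or /hotlinker.gif for hotlinkers."""
    if not referer or path == "/hotlinker.gif":
        return path                    # empty Referer, or the placeholder itself: pass
    r = referer.lower()
    host = urlparse(r).netloc
    if host in OWN_HOSTS or any(s in r for s in SEARCH):
        return path
    return "/hotlinker.gif"

print(serve_image("http://thief.example.org/page", "/photo.jpg"))  # /hotlinker.gif
print(serve_image("http://example.com/gallery", "/photo.jpg"))     # /photo.jpg
```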

Another way to block access to images from unauthorized sites:

  SetEnvIfNoCase Referer "^$" local_ref=1
 SetEnvIfNoCase Referer "^http://(www\.)?htmlweb\.ru" local_ref=1
 SetEnvIfNoCase Referer "^http://(www\.)?images\.yandex\.ru" local_ref=1
 SetEnvIfNoCase Referer "^http://(www\.)?hghltd\.yandex\.com" local_ref=1
 Order Allow,Deny
  Allow from env=local_ref

Search engines and all sorts of crawlers generate huge traffic on your site. The following block forbids the listed bots access to the site, while the conditions after it let ordinary browsers through:

  RewriteCond %{HTTP_USER_AGENT} (Googlebot|Slurp|spider|Twiceler|heritrix|
	Combine|appie|boitho|e-SocietyRobot|Exabot|Nutch|OmniExplorer|
	MJ12bot|ZyBorg/1|Ask\ Jeeves|AskJeeves|ActiveTouristBot|
	JemmaTheTourist|agadine3|BecomeBot|Clustered-Search-Bot|
	MSIECrawler|freefind|galaxy|genieknows|INGRID|grub-client|
	MojeekBot|NaverBot|NetNose-Crawler|OnetSzukaj|PrassoSunner|
	Asterias\ Crawler|THUNDERSTONE|GeorgeTheTouristBot|
	VoilaBot|Vagabondo|fantomBrowser|stealthBrowser|cloakBrowser|
	fantomCrew\ Browser|Girafabot|Indy\ Library|Intelliseek|Zealbot|
	Windows\ 95|^Mozilla/4\.05\ \[en\]$|^Mozilla/4\.0$) [NC]
 RewriteRule ^(.*)$ - [F]
 RewriteCond %{HTTP_USER_AGENT} ^Mozilla.* [NC,OR]
 RewriteCond %{HTTP_USER_AGENT} ^Opera.* [NC,OR]
 RewriteCond %{HTTP_USER_AGENT} ^Firefox.* [NC,OR]
 RewriteCond %{HTTP_USER_AGENT} ^Netscape.* [NC]
 RewriteRule ^(.*)$ - [L]
 RewriteRule ^(.*)$ - [F]

Tracking robots.txt file access

In order to know more about search-engine visits, it is useful to have detailed information about accesses to the robots.txt file. To organize this, put the following entries in '.htaccess':
  RewriteEngine on
  Options +FollowSymlinks
  RewriteBase /
  RewriteRule ^robots\.txt$ /robot.php?%{REQUEST_URI}
Now, when the 'robots.txt' file is requested, our RewriteRule redirects the visitor (robot) to the robot.php script that processes the request. The 'REQUEST_URI' variable, which holds the name of the requested file, is passed to the script and can be processed according to your needs; in this example it is 'robots.txt'. The script reads the contents of 'robots.txt' and sends it to the web browser or search-engine bot. Thus we can count visitor hits and keep log files.


To disable adding PHPSESSID to URLs, put this at the beginning of index.php:

  ini_set("session.use_trans_sid", 0);

Or add to .htaccess:

  php_flag session.use_trans_sid Off

If all this seems too complicated, use a ready-made dynamic-to-static URL service built on .htaccess.

Caching directives

Caching all file types by access time
  ExpiresActive on
 ExpiresDefault "access plus 600 seconds"
Caching all file types by modification time
  ExpiresActive on
 ExpiresDefault "modification plus 600 seconds"
Caching certain file types
  ExpiresByType text/css "modification plus 600 seconds"
 ExpiresByType image/jpeg "modification plus 600 seconds"
 ExpiresByType image/gif "modification plus 600 seconds"
 ExpiresByType image/x-ico "modification plus 600 seconds"
 ExpiresByType image/png "modification plus 600 seconds"

Disable Apache Caching

Open the Apache server configuration file httpd.conf and uncomment the following lines:

 LoadModule expires_module modules/mod_expires.so
 LoadModule headers_module modules/mod_headers.so
 AddModule mod_expires.c
 AddModule mod_headers.c

Enter the following in .htaccess:

  # Disable caching in this folder
 # requires the modules
 # mod_headers.c and mod_expires.c to be enabled
 # Cache-Control header
 Header append Cache-Control "no-store, no-cache, must-revalidate"
 # Expires header
 ExpiresActive On
 ExpiresDefault "now"

The necessary headers will be transmitted automatically, and you no longer need to write them specifically in PHP - the cache is already turned off!

A description of the Cache-Control HTTP caching header is given on a separate page.

Caching with the .htaccess file

  # Enable caching in this folder
 # requires the modules
 # mod_headers.c and mod_expires.c to be enabled
 # Cache-Control header
 Header append Cache-Control "public"
 # Expires header
 ExpiresActive On
  ExpiresDefault "access plus 1 hours"
  #ExpiresDefault "access plus 10 years"

Caching javascript files using an .htaccess file

 ExpiresDefault "access plus 3 days"

Be careful with caching: after you change a file, users may only get the new version three days later!

How to force html-pages to process php-code?

Write the following lines in your .htaccess file:
  RemoveHandler .php .htm .html
 AddHandler application/x-httpd-php .php .htm .html

How to host multiple sites on a single shared hosting?

To host two or more sites on the same virtual hosting account, regardless of the number of domains your tariff plan provides, write the following lines in the ".htaccess" file:

  RewriteEngine On
 RewriteRule ^newdirectory/ - [L]
 RewriteCond %{HTTP_HOST} (www.)? [NC]
 RewriteRule (.*) newdirectory/$1 [L]

newdirectory/ is the folder that will hold the second site; the RewriteCond names the domain for which the rewrite is done.

Please note that in this case you will have a single mail account. That is, after connecting the second domain, your mailbox gets a second name, and any newly created mailbox (for example, info) is automatically assigned both names.

Search pages in more than one directory

Sometimes it is necessary to allow the web server to search pages in more than one directory.

  RewriteEngine on

 # first, try to find the page in dir1/...
 # ...and if found, stop the search:
 RewriteCond /your/docroot/dir1/%{REQUEST_FILENAME} -f
 RewriteRule ^(.+) /your/docroot/dir1/$1 [L]

 # second, try to find it in dir2/...
 # ...and if found, stop the search:
 RewriteCond /your/docroot/dir2/%{REQUEST_FILENAME} -f
 RewriteRule ^(.+) /your/docroot/dir2/$1 [L]

 # otherwise, continue with the other directives
 RewriteRule ^(.+) - [PT]

User virtual hosts

If you want to provide addresses of the form www.subdomain.ru for user pages, you can use the following rule set to convert them to the internal path /home/subdomain/path:

  RewriteEngine on
 RewriteCond %{HTTP_HOST} ^www\.[^.]+\.ru$
 RewriteRule ^(.+) %{HTTP_HOST}$1 [C]
 RewriteRule ^www\.([^.]+)\.ru(.*) /home/$1$2

Files damaged when uploaded to server

If binary data gets corrupted when transferring files through forms (with enctype="multipart/form-data" specified), add this directive to /cgi-bin/.htaccess:
  CharsetRecodeMultipartForms Off

Errors loading SWF files.
Errors when accessing pages whose content includes keywords such as $_REQUEST.

This may be caused by the mod_security module installed in Apache. By default it blocks requests containing SQL fragments and other potentially dangerous commands.

Possible error messages:

You do not have permission to access /adm/index.php on this server. Additionally, a 404 Not Found error was encountered while trying to use an ErrorDocument to handle the request.
The request is not safe and was rejected.
Add to .htaccess
 SecFilterEngine Off
 SecFilterScanPOST Off
For the message:
"POST /wp-admin/async-upload.php HTTP / 1.1" 406 354 "-" "Shockwave Flash"
You can remove protection only for uploading files to the server:
 SecFilterEngine Off
 SecFilterScanPOST Off

It is best to disable the protection only in the folder where it is needed, without disabling it for the entire site.

Server variables

These are variables of the form %{NAME_OF_VARIABLE}

where NAME_OF_VARIABLE is a string from one of several groups: HTTP headers, connection & request values, server internals, system values, and specials.

These variables correspond exactly to the similarly named HTTP MIME headers, Apache server variables, or Unix struct tm fields. Those special to mod_rewrite include:

IS_SUBREQ - Contains the text "true" if the request currently being processed is a subrequest, "false" otherwise. Subrequests may be generated by modules that need to resolve additional files or URIs in order to complete their own work.

API_VERSION - The version of the Apache module API (the internal interface between server and module) in the current server build, as defined in include/ap_mmn.h. The module API version corresponds to the version of Apache in use (for Apache 1.3.14, for example, it is 19990320:10), but it is mainly of interest to module authors.

THE_REQUEST - The full HTTP request line sent by the browser to the server (e.g., "GET /index.html HTTP/1.1"). It does not include any additional headers sent by the browser.

REQUEST_URI - The resource requested in the HTTP request line.

REQUEST_FILENAME - Full path in the server file system to the file or script corresponding to this request.


  1. The variables SCRIPT_FILENAME and REQUEST_FILENAME contain the same value, i.e., the value of the filename field of Apache's internal request_rec structure. The first name is simply the commonly known CGI variable name, while the second is the counterpart of REQUEST_URI (which contains the value of the uri field of request_rec).
  2. There is a special format: %{ENV:variable}, where the variable can be any environment variable. It is looked up in the internal Apache structures and, failing that, via getenv() from the Apache server process.
  3. There is a special format: %{HTTP:header}, where the header can be any HTTP MIME header name. It is looked up in the HTTP request. Example: %{HTTP:Proxy-Connection} is the value of the "Proxy-Connection:" HTTP header.
  4. There is a special format %{LA-U:variable} for look-aheads, which perform an internal (URL-based) subrequest to determine the final value of the variable. Use this when you want to rewrite with a variable that is actually set later, in a subsequent API phase, and is thus not yet available. For example, to use the REMOTE_USER variable from within the server context (the httpd.conf file) you must use %{LA-U:REMOTE_USER}, because this variable is set by the authorization phases, which come after the URL-translation phase in which mod_rewrite works. On the other hand, because mod_rewrite in the directory context (.htaccess file) is implemented via the Fixup API phase, and the authorization phases come before this phase, there you can simply use %{REMOTE_USER}.
  5. There is a special format: %{LA-F:variable}, which performs an internal (filename-based) subrequest to determine the final value of the variable. For the most part this is the same as the LA-U format above.

Home page without duplication

Usually the home-page code physically resides in the index.html or index.php file, but the site should open on any of four requests: with or without the www prefix, and with or without /index.html at the end. For search engines these are four different URLs! If .htaccess is not configured correctly, the search engine will add four identical pages to its index, which is a sign of a poor-quality site. To avoid this, use this code in .htaccess:

  Options +FollowSymLinks
 RewriteEngine on
 RewriteCond %{HTTP_HOST} ^
 RewriteRule (.*) http://$1 [R=301,L]
 RewriteCond %{THE_REQUEST} ^[A-Z]{3,9}\ /index\.html\ HTTP/
 RewriteRule ^index\.html$ [R=301,L]

All duplicate pages will be glued to the main page with a 301 redirect.

Duplicate pages without a slash at the end of the URL

To keep pages with and without a trailing slash from being indexed as two different pages, add the following code:

Pages without a trailing slash will be redirected to the version with a slash.

  RewriteCond %{REQUEST_FILENAME} !-f
 RewriteCond %{REQUEST_URI} !(.*)/$
 RewriteRule ^(.*)$ /$1/ [R=301,L]
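The rule pair can be sanity-checked in Python: redirect only when the request is not an existing file and the URI lacks a trailing slash (the is_file flag stands in for Apache's -f check; the paths are examples):

```python
def slash_redirect(uri, is_file=False):
    """Return the 301 target, or None when no redirect should fire."""
    if is_file or uri.endswith("/"):
        return None          # mirrors the two RewriteCond exclusions: !-f and !(.*)/$
    return uri + "/"

print(slash_redirect("/news"))                      # /news/
print(slash_redirect("/style.css", is_file=True))   # None
print(slash_redirect("/news/"))                     # None
```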

Saving files instead of opening

Many have seen a browser, when asked to download an archive with the .rar extension, open it as plain text: a jumble of characters. This means the server is not configured to force saving of file types that should not be opened in the browser.

  AddType application/octet-stream .rar .doc .mov .avi .pdf .xls .mp4