As we all know, a computer is not a TV set: the urge to improve it appears on the second day after the purchase. The first thing that comes to mind is to optimize Windows. Fortunately, there is plenty of material on the subject, both on the Web and in the pages of computer publications. But is it worth attacking the system right away, and does doing so always produce a positive result? To save the nerve cells, and in some cases the money, of ordinary PC users, I propose to examine a few Windows questions in more detail. So, let's begin.

Never, under any circumstances, make changes to the registry or to Windows system files without first having a backup copy of them. There are many ways to back up the system, both with standard Windows tools and with the help of special programs. The subject has been covered repeatedly in the computer press, and I hope every reader already has the necessary experience in backing up the registry. For those who wish to create their own backup program without learning a programming language, I suggest visiting the page on this site that describes in detail how to create command files for backing up and restoring the system. We will assume that you already have a backup, so we can continue our investigation.
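For illustration, here is a minimal sketch of such a command file for Windows 9x, where the registry lives in the hidden files SYSTEM.DAT and USER.DAT (the backup folder name is my own example, and Windows is assumed to be installed in C:\WINDOWS; the page mentioned above goes into far more detail):

    @echo off
    rem regback.bat - copy the Windows 9x registry files to a backup folder.
    rem SYSTEM.DAT and USER.DAT carry hidden/system/read-only attributes,
    rem so those must be cleared before copying and restored afterwards.
    if not exist C:\REGBACK\nul md C:\REGBACK
    attrib -h -s -r C:\WINDOWS\SYSTEM.DAT
    attrib -h -s -r C:\WINDOWS\USER.DAT
    copy C:\WINDOWS\SYSTEM.DAT C:\REGBACK
    copy C:\WINDOWS\USER.DAT C:\REGBACK
    attrib +h +s +r C:\WINDOWS\SYSTEM.DAT
    attrib +h +s +r C:\WINDOWS\USER.DAT

A restore file is the same copy in the opposite direction; alternatively, regedit /e C:\REGBACK\REGISTRY.REG exports the entire registry to a text file.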

Optimizing the hard disk cache.

According to many self-styled Windows experts, the size of the disk cache should be fixed and should depend entirely on the amount of RAM installed in the computer. The claim is that Windows does not know how to manage the cache and wastes too much memory on it. There are even standard tables for deriving the cache size from the amount of memory. All of this is written into the [vcache] section of the system.ini file as MinFileCache=<size in KB> and MaxFileCache=<size in KB>. Such an approach really will save some memory, but it can lead to a general slowdown of the system, especially on home computers, where dozens of different programs, both multimedia and office, may run in the course of a day, each of them needing a different amount of cache and RAM. You are unlikely to enjoy editing system.ini and rebooting the computer before launching each new program.

A universal disk cache size is, in my opinion, a purely theoretical possibility: in some cases there will still be an excess of cache and a shortage of RAM, in others the reverse. And what if tomorrow your little son brings home some game with non-standard memory demands, and system.ini has to be edited all over again? Modern versions of Windows can execute program code directly from the cache; that is, the cache has ceased to be an intermediate link between the hard drive and memory, and is nothing more than a part of RAM. So what exactly are we trying to limit? Conclusion: limiting the size of the hard disk cache in most cases reduces system performance. The only exceptions are computers that perform the same kind of tasks day after day, mainly involving the transfer of large amounts of data. There, tuning the cache size really can help squeeze maximum performance out of the system.
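For reference, here is what a typical scheme of this kind looks like in system.ini. The figures are illustrative, of the sort published in those tables for a machine with 128 MB of RAM, and are exactly what I advise against above:

    [vcache]
    ; values are in KB: 4 MB minimum, 32 MB maximum
    MinFileCache=4096
    MaxFileCache=32768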

Virtual memory.

Most of the advice on this subject was born during the triumphant march of Windows 95, when the new graphical interface demanded additional, and at the time quite expensive, RAM. Why are these ancient tips dragged, gray beard and all, onto new platforms? After all, modern versions of Windows work with virtual memory in a completely different way. So what are we urged to do with the swap file? The same as for the ancient Windows, namely: make its size fixed and equal to three or four times the amount of RAM. The need for this is usually explained by claiming that the OS spends nearly an eternity resizing the swap file, that the data becomes excessively fragmented, and that the system is therefore slowed down. I foresee general indignation, but I maintain that such explanations, if not complete nonsense, have at the very least long since lost their relevance.

If you deprive Windows of the ability to determine the required size of the paging file itself, you risk getting a message that certain programs cannot be run. The risk grows if you actually use Windows multitasking. For example, you may easily find yourself working simultaneously with Photoshop, a word processor, an HTML editor, and some animation package. Where is the guarantee that at that moment you will not need to open a graphics file of several tens of megabytes for further editing? Now imagine that at this point the system crashes because Windows cannot enlarge virtual memory, and for some reason you have not saved your work. That is no longer funny.

I also want to draw the readers' attention to the following fact. Windows changes the size of the paging file dynamically, and mostly when system resources are relatively free and disk access causes no inconvenience. After the current task is finished, the size of the paging file remains unchanged for some time. If you go off to make yourself another cup of coffee, you will not even notice any activity from the OS; and if you prefer to keep working, you are hardly going to sit idle for those two and a half minutes rather than launch the next program. In addition, modern programs need several times less virtual memory thanks to the so-called linear executable principle: such programs are not loaded into memory in their entirety, but map their code onto memory pages and pull in the necessary libraries on demand. This gives the fullest and most efficient use of both physical and virtual memory.

Equally doubtful is the assumption that fixing the size of the paging file avoids unnecessary fragmentation. Inside the file itself the data will still be fragmented, perhaps even more than in normal use. As for moving the swap file to the beginning of the disk or to a separate physical disk, such methods do have a right to exist, but you will hardly notice the effect. To carry them out you will need special, expensive utilities like Mr. Norton's famous package, which will inevitably write its own programs into autoload, compensating for the improved virtual memory by reducing the physical. To avoid that, you will have to optimize the utilities themselves. Moreover, working with such utilities requires at least an elementary understanding of what happens when you run them; the automated functions are especially dangerous. I have repeatedly had to restore systems after a run of Norton Utilities. The reason is ridiculously simple: wrong regional settings. The Norton Disk Doctor program included in the package reads the country code, and if a Russian version of Windows is installed on your machine while the regional settings say, for example, US, the program will count every Russian-language file and folder name as an error. The result, I think, is clear, and this is only a small part of the possible problems. Conclusion: modern versions of Windows do not need their virtual memory optimized. And if you still decide to move the swap file to the beginning of the disk or to a separate disk, do not forget to check the prices of the licensed utilities plus an additional hard disk. In my opinion, a stick of RAM will come out much cheaper.
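For completeness: the fixed swap file that this advice prescribes is set either through System Properties (Performance > Virtual Memory) or directly in the [386Enh] section of system.ini. A sketch with illustrative figures, assuming 64 MB of RAM and the "three times RAM" rule; again, this is the very tweak argued against above:

    [386Enh]
    ; sizes are in KB; min = max fixes the file at 192 MB
    PagingDrive=C:
    MinPagingFileSize=196608
    MaxPagingFileSize=196608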

Internet and modem.

Advanced Windows users advise adding certain parameters to the registry key HKEY_LOCAL_MACHINE\System\CurrentControlSet\Services\Class\NetTrans\0000 (possibly 0001, and so on) that supposedly directly affect the speed of the modem. The main parameter is MaxMTU. Let me remind you that MTU (Maximum Transmission Unit) is the maximum size of a data packet that can be transmitted over the network in one physical frame. The optimal value, we are told, is MaxMTU=576. But excuse me: optimal for what? To answer this question, I propose a small experiment. Let's use the guest Internet connection provided by one of the most popular metropolitan providers, Svit Online. To connect to the remote computer we will use the half-forgotten HyperTerminal program: dial-up number 490-0-490, login svit, password online. And what do we see after entering the password? The remote computer reports the assigned IP address and... MTU=1500! Now do you understand what MaxMTU=576 is optimal for? For nothing, it seems, except slowing down the data transfer. For optimal transmission, it turns out, one should start from the value 1500. I will not dwell on calculating the other parameters, such as MSS and TTL, because I consider all these measures far from harmless, given the cost of providers' services and per-minute billing for the city phone line. Windows copes perfectly well with determining the MTU automatically, without our intervention. It is better to concentrate on improving the quality of the line, at least within your own apartment: more often the cause of poor connections is all sorts of twisted wire joints, bad contacts, and strings of parallel telephones, rather than hardware or software. But that is a topic for a separate conversation.
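For the curious, here is what the tweak in question looks like as a .reg file. This is only a sketch: the 0000 subkey number varies from machine to machine, on Windows 9x MaxMTU is a string value, and 576 is the figure the advice recommends, not the one I do:

    REGEDIT4

    [HKEY_LOCAL_MACHINE\System\CurrentControlSet\Services\Class\NetTrans\0000]
    "MaxMTU"="576"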

Windows Me has suffered the same fate of being "optimized". What isn't suggested: remove PC Health, disable System Restore, get rid of Media Player 7 and Movie Maker, replace IE 5.5 with an older version, even bring back a real DOS mode. And all of this just to install the new OS on a PC with a Pentium 133 MHz and 32 MB of RAM. But what will be left of it then? Which Windows Me platform are we talking about at that point? After all, even if you manage to fit a Mercedes engine into a Zaporozhets, it will not become a Mercedes. The result is extra failures, inconvenience in everyday work, and, more often than not, format C:. And then the user starts believing the stories about the legendarily buggy Windows. Or rather, he does not merely believe them: he verifies them through his own bitter experience.

In conclusion, I want to add that I am by no means a supporter of keeping the operating system "clean" and untouched. But optimizing Windows requires thoughtful, even creative, action. Otherwise "optimization" will end in the collapse of the system. And who will be to blame? Surely not you? Of course not. It's all Windows, the buggiest system in the world. :)