Why does restarting a computer fix things?

Solution 1

Basically, because anything that's gotten itself into a mess gets the chance to start over. Imagine you're making toast and you burn it. Throwing it away and starting again is one way to fix that problem, and it will always work out better than scraping the burnt bits off the toast.

Solution 2

One of the major reasons your computer slows down is that its Random Access Memory (RAM) is being used up. The operating system, as well as the programs you're running, all use RAM. However, there's only so much of it, and it can only be accessed so fast. If your computer is trying to use a lot of RAM (often more than is available), it slows down: it needs to create extra swap files on the hard drive to act as additional, but much slower, "RAM". This, among other things, makes your computer slow down.

Closing some programs should free up RAM space, but memory leaks may have occurred. That means a program may have accidentally claimed RAM that it didn't or couldn't free up when it closed. "Ahhh," you say, "it's going to eat up all my RAM!" Nope. If you restart the computer, all the RAM is cleared out. You've got more available RAM, so your computer can run faster.

There are other problems that could be fixed by a restart, too. For example, a program may somehow begin to use a huge number of processor cycles (each cycle performs a calculation, and all of these calculations are what make your computer "compute", aka work). When the computer is restarted, control of the processor is unconditionally given to the bootloader, and then handed off to the OS, which can start from scratch. It's no longer being dominated by the greedy program.
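The "greedy program" above is usually just code stuck in a tight loop. A minimal sketch of such a CPU hog (purely illustrative, capped at a fraction of a second so it terminates):

```python
import time

def busy_wait(seconds):
    """Spin doing no useful work, burning CPU cycles the whole time."""
    start = time.process_time()  # CPU time consumed by this process
    while time.process_time() - start < seconds:
        pass  # every iteration of this loop consumes processor cycles
    return time.process_time() - start

# Burn roughly 0.05 seconds of CPU time:
elapsed = busy_wait(0.05)
```

A real runaway program does this indefinitely; killing it (or rebooting) is what hands the processor back to everything else.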

Yet another possibility is that the computer was overheating. Overheating, simply put, isn't good for the computer. Turning the machine off and leaving it to cool for a few minutes couldn't hurt. In fact, some (if not all) computers are set to shut down if they reach a certain internal temperature.

In summary, a restart puts the computer into a state where the right software is controlling the right (possibly cooler) hardware, a state that is already known to work right.

Solution 3

Two reasons:

  • The OS and software get to start with a clean slate
  • Any OS / driver updates or installs that have occurred since the latest reboot may need a chance to be part of the boot sequence

Solution 4

Good question! The short answer is "it depends".

The longer answer is that Windows has limited resources for applications to use (memory, window handles, file handles, etc.). If a badly written application doesn't give these resources back to Windows when it's finished, Windows runs out of resources, which causes problems for other applications. Obviously the same applies to all other operating systems too.
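The file-handle case is easy to demonstrate. This sketch (illustrative only) opens a file repeatedly without ever closing it; each open handle is an OS resource that stays charged to the process until it exits, which is exactly what a reboot sweeps away:

```python
import tempfile

# Create a small file to leak handles against.
tmp = tempfile.NamedTemporaryFile(delete=False)
tmp.write(b"x")
tmp.close()

# Leak 50 file handles: opened, held, never closed.
# Each one consumes a kernel resource until the process exits.
leaked = [open(tmp.name) for _ in range(50)]
```

Scale that up (or leak handles that are scarcer than file descriptors) and other applications start failing, until the process dies or the machine restarts.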

Solution 5

I know this is an ancient thread, but I feel like this post by a Microsoft developer explains why:

  1. Restarts are often necessary after software upgrades/changes.
  2. This is by design.
  3. This is the way it should be.
  4. This is better than the alternative (and how the alternative works).

Gradual slowness and other restart-needing issues can often be chalked up to memory leaks. Contrary to @user2630's comments, this is still a very real problem in modern Windows. Whether from services/system components that stay running, preventing their memory from being reclaimed, or just from the many applications a user has started, leaks occur all the time--sometimes severely. In the latter case, it's often just simpler for an IT guy to say "just restart it" instead of "close all of your apps, check the task tray to make sure they're really gone, make sure they're not running any background processes or services..." you get the idea.

As was mentioned elsewhere here, a lot of other restart-needing problems come from plain old bad/broken software (hung services, infinite waiting on shared resources, etc.). I think that leaks and pending library changes explain the majority of the boilerplate restart troubleshooting out there, though.

Author: Atmocreations

Updated on September 17, 2022

Comments

  • Atmocreations
    Atmocreations almost 2 years

The title says it all, but why does restarting a computer tend to fix things? It seems like IT folks always ask, "Have you restarted your PC?" But why?

    • Atmocreations
      Atmocreations about 14 years
      And yes.... This was a lame ploy to get some rep on SU when the site launched...
  • John Fouhy
    John Fouhy almost 15 years
I once had a Dell Inspiron with a Pentium 4 inside. In summer, it would occasionally switch off without warning. It turned out dust had built up inside, causing it to heat up until it hit 75 degrees Celsius, which is the temperature at which P4s automatically switch off.
  • geocoin
    geocoin almost 15 years
    and tasty too! like the pizza you dropped before it went in the oven... far better to not pick up all the grated cheese and tomato sauce.. oh wait i think i went too far..
  • geocoin
    geocoin almost 15 years
    my wife worked in a place where 'have you tried turning it off and on' was the official first response. she had a problem that caused her desktop to blue screen causing loss of work on a regular basis, however she could never get a fix as 'turning it off and on' always 'fixed' the bluescreen!
  • Tom Robinson
    Tom Robinson almost 15 years
    Can anyone come up with a better but similar analogy? I'm not 100% happy with this one.
  • Jeow Li Huan
    Jeow Li Huan almost 15 years
The memory leak issue isn't really that relevant with any NT-based (Windows 2000 and onwards) or Linux OS. Sure, it used to be the case for DOS, but modern OSes will recover all the memory a program was allocated, leaked or not, when it closes*. It's theoretically an issue for services and the like, but these are generally pretty solid in the first place. * Because the memory allocation algorithms these OSes use are not the simple mem allocs you might expect.
  • DisgruntledGoat
    DisgruntledGoat almost 15 years
    Yeah, restarting your computer is like scraping the burnt bits off the toast and putting it back in the toaster. What you described was reinstalling the OS ;-)
  • David Hayes
    David Hayes almost 15 years
Ok, imagine you have a whiteboard with space to write 5 things you need to do. Every so often you scrub out a task you've completed and replace it with a new one. Now say you accidentally pick up a permanent marker rather than a water-soluble one to write your new task. When you come to scrub out this task you can't, until you wipe the whole board clean with some alcohol. Restarting your computer is "the same" as wiping the board clean: it removes all the "stuck" code.
  • Iain Samuel McLean Elder
    Iain Samuel McLean Elder almost 11 years
    Thanks for the link to Raymond Chen's article. I don't think your summary accurately reflects the author's views. He doesn't say it should be this way. He concludes: "So it's not that Windows has to restart after replacing a file that is in use. It's just that it would rather not deal with the complexity that results if it doesn't. Engineering is a set of trade-offs." It makes me wonder: What trade-offs did the Linux developers choose? (Linux is noted for requiring a restart less frequently.) Do they deal with the complexity, or do they just break things?
  • Zac B
    Zac B almost 11 years
    This is opinion, but a few things come to mind: Linux systems that upgrade libraries in-place can often leave other programs running that are linked to old versions of those libraries. There are a lot of systems that try to prevent this, but the complexity discussed in the Microsoft post is still present and isn't always abstracted away, so library-versioning bloat is something that occurs often, for better or worse.
  • Zac B
    Zac B almost 11 years
    Linux also tends towards a more strict regime of dependence modularity, rather than proliferating "used by everything ever" libraries. Those still exist (as do problems caused by in-place upgrades leading to reload-related problems), but are less prevalent than on Windows. IMO, a lot of that reduced prevalence has to do with Windows being developed in a much more agglomerated way (with a persistent goal of backwards compatibility) than Linux, which has an architecture that is, if not more consistent, usually interacted with in a more consistent way.
  • Zac B
    Zac B almost 11 years
    TL;DR: Linux often makes the tradeoff in favor of the rigor and development time necessary to engage with the complexity you mentioned. Having a modular, consistent architecture helps as well.
  • emallove
    emallove over 10 years
    Continuing the analogy contest, you can try to gather the spilled milk back into the cup or you can pour yourself another glass of milk.