Why not turn on “Enable Adaptive Hypervisor” in Parallels 11?

It's in their own docs:

If you use a lot of applications in the primary and guest OSs at the same time, you may lack CPU resources. The current version of Parallels Desktop provides the Adaptive Hypervisor technology, which helps you distribute the CPU resources between the primary and guest OS in the most efficient way.

The Adaptive Hypervisor technology automatically allocates the host computer's CPU resources between the virtual machine and the primary OS applications depending on which application you are working with at the moment. If your virtual machine window is in focus, the priority of that virtual machine's processes is set higher than the priority of the primary OS's processes, and as a result more CPU resources are allocated to the virtual machine. If you switch to a primary OS window, the priority of its running applications is set higher and the CPU resources are reallocated to the primary OS.

Normally, on a quad-core machine, you could assign 1 core to VM1, 1 core to VM2, and 2 cores to the host OS. This setting changes that balance on the fly as needed. It's a different way of doing things; neither is better.
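
To make the contrast concrete, here is a minimal sketch of the two ideas, purely as an illustration of the concept and not of anything Parallels actually implements (the share numbers are arbitrary):

```python
#!/usr/bin/env python3
"""Toy model: static core assignment vs. focus-based rebalancing."""

TOTAL_CORES = 4  # the quad-core example from above


def static_allocation():
    # Fixed split: one core each for two VMs, the rest for the host OS.
    return {"vm1": 1, "vm2": 1, "host": TOTAL_CORES - 2}


def adaptive_allocation(focused):
    # Everything keeps a small baseline share; whatever has focus gets
    # the remaining capacity. The 0.5 baseline is purely illustrative.
    baseline = 0.5
    shares = {"vm1": baseline, "vm2": baseline, "host": baseline}
    shares[focused] = TOTAL_CORES - baseline * (len(shares) - 1)
    return shares


if __name__ == "__main__":
    print("static:                 ", static_allocation())
    print("adaptive (vm1 focused): ", adaptive_allocation("vm1"))
    print("adaptive (host focused):", adaptive_allocation("host"))
```

With the static split, the host keeps its two cores even while it sits idle; with the adaptive approach, whichever side you are actually working in gets the bulk of the capacity at that moment.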

Comments

  • Basil Bourque over 1 year

    In recent versions of Parallels Desktop for Mac, the configuration has included a checkbox for “Enable Adaptive Hypervisor”.

    Configure > Options > Optimization > Performance > Enable Adaptive Hypervisor

    Why is this not turned on by default? What is the downside?

    Does this make use of hardware features, such as those provided by Intel in the CPU?

    The online help is vague:

    To set Parallels Desktop to automatically optimize performance for Mac OS X or Windows depending on which application or program you're working with at the moment:

    Select Enable Adaptive Hypervisor. When you're using a Windows program, more resources are given to Windows, and when you're using a Mac OS X application, more resources are given to Mac OS X.

    So does this feature mean anything more than just “relinquish the CPU while in the background”? If that is all it means, then why not enable this by default? The product seems aimed at regular desktop users who just need to use a Windows app, so it would seem like this feature should be on by default. The fact that it is not the default makes me wonder if there is a downside or risk.
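
    One rough way to check whether the option does more than adjust background priority (a sketch using standard macOS tools; it assumes the VM's worker process shows up under a name containing "prl", such as prl_vm_app, which is worth verifying in Activity Monitor) is to sample the scheduling priority of the Parallels processes while switching focus between the VM window and a Mac application:

    ```python
    #!/usr/bin/env python3
    """Sample nice/priority values of Parallels processes over time."""
    import subprocess
    import time

    PATTERN = "prl"  # assumed substring of the Parallels process names


    def snapshot():
        # ps keywords: pid, ni (nice value), pri (scheduling priority), comm (command)
        out = subprocess.run(
            ["ps", "-Axo", "pid,ni,pri,comm"],
            capture_output=True, text=True, check=True,
        ).stdout
        return [line for line in out.splitlines() if PATTERN in line.lower()]


    if __name__ == "__main__":
        for _ in range(10):  # ten samples, three seconds apart
            for line in snapshot():
                print(line)
            print("---")
            time.sleep(3)
    ```

    Watching whether those columns shift as focus changes gives a rough, external view of how the host scheduler treats the VM with the option on and off.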