Dynamic Memory is one of Hyper-V’s most misunderstood and underutilized technologies. Many people believe it isn’t working when it’s doing exactly what it’s supposed to do. Too many won’t use it at all based on incorrect assumptions, and most don’t understand the conditions in which it will operate. Unfortunately, there’s no simple guide to using it properly, or you’d find articles on it everywhere. If you want to squeeze the most out of your virtual environment, you’re going to need to get your hands dirty with some of the grease down in the guts of your systems.
Dynamic Memory is the Key to Density
Out of all the resources controlled by Hyper-V, memory is the most precious. CPU resources can be shared at very high ratios. On its own, disk space can’t be shared, but it’s become cheap enough that it’s not much of an issue. Technologies such as dynamically expanding disks and deduplication mean that you can squeeze more out of it in a virtual environment than in a traditional environment. When you can’t get more out of your disk space, it’s the least expensive resource to scale up and out.
Hyper-V memory, though, is much harder to handle. Basic amounts of memory in a physical server, such as 16 and 32 GB configurations, are fairly cheap, and more than enough for the majority of server applications. Assigning 16 GB or more to each virtual machine, however, can quickly become prohibitively expensive, because the cost of scaling up memory within a single host climbs steeply. Scaling out means buying more server hardware, which means buying more licenses.
Dynamic Memory can be used to reduce your costs by allowing for greater virtual machine densities. The trick is in figuring out how to properly utilize it. While proper Dynamic Memory configuration could be a scientific endeavor, the difficulty of acquiring the necessary information forces you to employ some art.
Software Vendors Usually Aren’t Much Help
The biggest barrier I find when trying to figure out how to configure Dynamic Memory for a server application is the maker of that application. It’s possible to get an idea of how a program will use memory, but precious few vendors put forth the effort. Many will test a few configurations and then publish those as their requirements. The ones I commonly deal with do their testing in a physical environment and then simply tell their customers to replicate those settings in a virtual environment. Many will demand that you disable any memory management techniques. Unfortunately, few of those who take that angle ever perform any serious degree of testing. The real problem to watch for is that, even if you can demonstrate that their application works fine under Dynamic Memory, they’ll blame it for any problem that occurs… no matter how obviously unrelated. It’s usually not worth risking a complete loss of support, so I always counsel that you follow their requirements, no matter how absurd.
But sometimes they will help you. Sometimes they’ll just let you lead, which is better than being refused outright. Other times, you can piece together a plan from the clues they give you. Some software vendors give you a flat amount of RAM that their product needs and nothing else. Others will give you scaling information. For example, a traditional client/server application may indicate that it needs 500 MB to run plus 30 MB per user. Such an application has had its memory usage properly profiled by its developers, and while it may or may not actually work with Dynamic Memory, you’ll at least be able to properly design the memory that its virtual machine uses.
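When a vendor does publish scaling numbers, you can turn them directly into candidate Dynamic Memory values. Here’s a minimal sketch using the hypothetical 500 MB + 30 MB-per-user figures above; the 20% headroom and the half-load Minimum are my own assumptions for illustration, not anything Hyper-V or a vendor prescribes:

```python
# Hypothetical sizing helper for a vendor that publishes scaling numbers.
# Assumed figures from the text: 500 MB base + 30 MB per concurrent user.
def required_memory_mb(base_mb, per_user_mb, users, headroom=0.2):
    """Return suggested Dynamic Memory values (in MB) for a profiled app."""
    working_set = base_mb + per_user_mb * users
    return {
        "Startup": working_set,                           # covers expected load at boot
        "Minimum": base_mb + per_user_mb * (users // 2),  # tolerate quiet periods
        "Maximum": int(working_set * (1 + headroom)),     # cap spike/leak exposure
    }

print(required_memory_mb(500, 30, 100))
# → {'Startup': 3500, 'Minimum': 2000, 'Maximum': 4200}
```

Translate the resulting numbers into the VM’s memory settings and then validate them against observed demand; the formula is only as good as the vendor’s profiling.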
How Software Works with Memory
To properly plan a deployment, it helps to understand the way that software uses memory. Unfortunately, having a really good understanding of that does require some experience with programming. So, you’re about to get a crash course.
Static Memory Allocation
Some applications use a fixed amount of memory in all situations. For very small applications, like monitoring agents, that’s expected and not a concern. Dynamic Memory will make no difference for those types of applications, although I’d assume that you have something besides an application like that on any given server.
I’d like to say that larger-scale applications that use memory this way aren’t found on servers very often, but I’ve learned through the years that far too many line-of-business applications become the top choice in their industry through lack of competition, not any particular competence in software development. However, applications of this kind probably won’t be harmed by Dynamic Memory, provided that there is sufficient memory within the virtual machine when they start up. They also won’t be helped by Dynamic Memory. Using it in their guests will have the most impact on squeezing extra out of the slack between the application and the operating system.
Static Memory Allocation Based on Available Memory
High-performance applications that require a lot of memory for normal operation will often take the approach of consuming as much memory as they can. Notable members of this group are Microsoft’s SQL Server and Exchange Server. These are extremely poor candidates for Dynamic Memory. For one thing, they are memory-intensive server applications, so it doesn’t make any sense to assign any less memory to their virtual machines than is necessary for them to perform their roles. A quality application of this nature will have published documentation and/or assistance from the manufacturer to help you architect memory allocation.
If, for any reason, you choose to use Dynamic Memory with an application such as this, it’s very important that you set the Startup memory to a number that ensures the application will have enough when it checks. Setting the Maximum is insufficient, as the application will not know about that memory.
Generic Memory Allocation
Most applications, server applications included, do not have a strict memory management policy. How they get their memory depends on a few factors, and you will not always know enough about an application’s architecture to determine which of those factors are in play. But you can usually figure out some of it, and knowing the options can help you make sense of the behavior you witness. If you’re really interested, I would suggest spending some time delving into topics like stack memory versus heap memory and the differences between managed and unmanaged code. If you understand those, you’ve got 90% of the tools you need to understand how Dynamic Memory will operate in any given situation.
The quick summary is that most modern application developers do not become overly concerned with memory management. It’s not because they’re lazy, but because modern operating systems have very powerful memory management techniques built in, and there’s rarely any value in trying to outdo them. These applications usually ask for memory when they need it and release it when they’re done. You can spend some time monitoring an application’s behavior and build a Dynamic Memory configuration that fits.
Combining this Knowledge into a Strategy
The first and most important thing to do is work with the application developer. In theory, they know their application better than anyone else (in practice… well…). If you’re really lucky, they’ll have put meaningful testing into their application in a Dynamic Memory environment and will be able to provide useful guidance. If you’re really unlucky, their application will work fine with Dynamic Memory, but to cover up their laziness, they won’t test it and will demand that you avoid Dynamic Memory or lose support. Most of the time, you’ll be somewhere in the middle. You’ll be given some vague recommendations and set loose.
I’d start by trying to determine if you’re working with an application that dynamically allocates and deallocates memory. Use Performance Monitor to track the application’s memory usage. I would test with a few different values of Startup memory first. If you find that it always uses a particular percentage of memory, then it’s probably performing static allocation based on the amount of memory that it finds, and therefore won’t be the best candidate for Dynamic Memory. Otherwise, it’s just a matter of locating the “sweet spot” for Dynamic Memory. For this, I generally find an acceptable Startup memory that allows the guest to start up quickly while still leaving sufficient memory for all other virtual machines to start from a power-off state (remember that the total of all Startup values must be available when the host boots, or there will be guests that will not start). Once you have the Startup value where you want it, the next thing is to fiddle with the Minimum and Maximum values. I like to start with both of these somewhere near the point that I expect. You can lower the Minimum and raise the Maximum while the guest is on, but anything else requires you to shut it down.
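The parenthetical above is worth checking with arithmetic: before committing to Startup values, verify that all of them can be satisfied at once when the host boots. A quick sketch, with hypothetical VM names and a made-up 2 GB reserve for the management OS:

```python
# Sketch: can every guest start from a power-off state?
# VM names, values, and the management-OS reserve are illustrative assumptions.
def can_all_start(startup_mb_by_vm, host_mb, host_reserve_mb=2048):
    """True if the total of all Startup values fits in host memory,
    leaving a reserve for the management operating system."""
    total = sum(startup_mb_by_vm.values())
    return total + host_reserve_mb <= host_mb

vms = {"dc01": 512, "sql01": 8192, "app01": 1024, "web01": 512}
print(can_all_start(vms, host_mb=16384))  # → True (10240 + 2048 <= 16384)
```

Running a check like this whenever you add a guest or raise a Startup value saves you from discovering the shortfall during an unplanned host reboot.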
You likely want to stay away from drastically different minimums and maximums in a single virtual machine. If an application displays especially erratic behavior, target its higher end, not its lower end. By that, I mean you should be willing to raise the minimum up toward where you see the application peaking. This is because, when the VM’s usage dips very low, that makes a lot of RAM available for other guests. If they choose to use it right when the application wants to have a high peak, you could see significant performance degradation from that application because its demand is significantly higher than what is available. If you simply ensure it has a reasonable floor at all times, these radical changes won’t occur. Of course, if these peaks and valleys occur at predictable times, you could certainly develop a strategy that takes advantage of that knowledge.
Ordinarily, the Dynamic Memory buffer setting for a virtual machine isn’t of much concern. One exception is when you’ve got an application that rarely exercises Dynamic Memory and you’re trying to tweak for maximum density. The buffer isn’t guaranteed, but Hyper-V will provide it if it can. By setting a low buffer on a VM that isn’t likely to increase its demand, more memory is available for rapid allocation by other guests. The second exception is for VMs with very large or very small normative memory values. The buffer is set as a percentage, and the actual amount reserved is that percentage of the memory the VM is currently using. So, if a VM currently has 10 GB in use and a 10% buffer, that’s an entire gigabyte of memory doing nothing, waiting to be used. If you consider that to be a lot, lower the buffer. Don’t overthink this; remember that Hyper-V doesn’t guarantee the buffer. It’s just that if you have a great deal of memory sitting in buffers while other VMs could use a few extra MB, you don’t have an optimal configuration. For extra tweaking, you can also use the memory weight setting to give Hyper-V hints on how to resolve contention between all these varying buffer settings and demand conditions.
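Because the buffer is a percentage of current usage rather than of the Maximum, its real cost moves with the workload. The 10 GB / 10% case from the paragraph above works out like this (simple arithmetic, not a Hyper-V API):

```python
# The buffer is computed against current usage, not the Maximum setting.
def buffer_mb(current_use_mb, buffer_percent):
    """Approximate memory Hyper-V tries to keep assigned beyond demand."""
    return current_use_mb * buffer_percent // 100

print(buffer_mb(10240, 10))  # → 1024 MB idle buffer for a VM using 10 GB
print(buffer_mb(512, 10))    # → 51 MB for a small VM at the same percentage
```

The same 10% that is trivial on a 512 MB guest ties up a full gigabyte on a 10 GB one, which is why the very large and very small VMs are the ones worth revisiting.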
Even if you have a great deal of memory in your host, stay away from using high maximums. This is because far too many applications exhibit memory leaks. When an application requests memory in a certain fashion, the operating system grants it and allows the application to manage it from that point onward. Sometimes, an application might create an object and then lose track of its memory (it’s an easier mistake to make than you might think). There is no way for that memory to be freed until the application is closed. Each time the function that creates that object is called, that same amount of memory is leaked, and the total memory in use by the application just continually climbs. Inside a guest using Dynamic Memory, this would cause the memory demand to continue to increase. If the upper bound is too high, that could cause other virtual machines to be subjected to memory starvation. Even if an application doesn’t leak today, any patch could introduce a problem, so you definitely want to spend the time to determine reasonable maximums for all your guests.
Dynamic Memory and System Buffers
Occasionally, I run across people talking about the various system buffers in use by Windows (as a guest). They are concerned that Dynamic Memory causes Windows to allocate too little memory for these buffers, which negatively impacts performance.
The first thing I want to say is that I’ve never personally encountered this problem, so I can’t even definitively say it exists. It’s not like there’s any shortage of people out there managing their systems in all sorts of odd ways due to equally odd superstitions. However, I’ll freely accept that it could be a problem and I’ve simply been able to avoid it. If that’s the case, the reason is that I spend the time monitoring my systems and finding numbers where they perform adequately. In truth, in the absence of any external guidance, I start with 512 MB Startup, 512 MB Minimum, and 2 GB Maximum. If no one howls, I leave it be and keep routine monitoring in place. If a machine shows consistently high demand or noticeably poor performance, that’s when I start the tweaking process. My point is that if you turn on Dynamic Memory and something isn’t quite right, your first answer shouldn’t be to jump to a “never again” attitude. Even if you can’t make everything work with Dynamic Memory, you can certainly make some things work with it.
For these and other situations where Dynamic Memory seems to prevent proper allocation of memory, your best bet is to raise the Startup value. The Startup value is all that the guest knows about until Dynamic Memory assigns it more. So, if you set 512 MB for Startup and 2 GB for Maximum, the guest boots believing that it has only 512 MB of memory installed. If Windows allocates buffers based on how much memory it has at startup and never recalculates, then that is the number the calculation is based on. If that isn’t working for you, just raise the Startup. You can keep a lower Minimum; this sort of thing is exactly why Startup and Minimum are separate settings in the first place. You could, for example, leave the Minimum at 512 MB and raise the Startup to 1 GB. Then, Windows and any startup applications will begin life calculating against that 1 GB figure. If they don’t use that memory, Hyper-V will use its balloon driver to reclaim some of it for other guests. Problem solved.
Share What You Learn
I’ve written an intentionally generic guide here. As you discover working settings for specific applications, other users would gladly benefit from your knowledge. Share it here, share it on user and community forums, share it in user group meetings. Something that really helps people adopt technology is successful usage by others. The end result will be a better usage of the physical memory resources we have available to us.