How do I calculate cooling cost per server?


Solution 1

  1. You need to convert BTU to watts.
  2. Convert watts to kilowatts.
  3. Then multiply the kilowatts by whatever you currently pay the electric company per kWh.
  4. $$$ Profit $$$

You should have two sets of numbers: the electricity used by the systems (880W in your case) and the electricity used to cool them (convert Tons to BTU to Watts to kWh), then add them up.
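Here's a minimal sketch of those steps, assuming the 880 W load from the question and a placeholder electric rate of $0.17/kWh (both just example figures):

    # A minimal sketch of the steps above, assuming an 880 W server load and a
    # placeholder electric rate of $0.17/kWh.

    BTU_PER_WATT_HOUR = 3.413    # 1 Wh of electricity ~= 3.413 BTU of heat

    server_watts = 880.0                                     # electricity used by the systems
    cooling_btu_per_hr = server_watts * BTU_PER_WATT_HOUR    # heat to be removed

    cooling_watts = cooling_btu_per_hr / BTU_PER_WATT_HOUR   # step 1: BTU/hr -> W
    total_kw = (server_watts + cooling_watts) / 1000.0       # step 2: W -> kW

    rate_per_kwh = 0.17                                      # step 3: your utility rate
    hourly_cost = total_kw * rate_per_kwh
    annual_cost = hourly_cost * 24 * 365                     # step 4: $$$ Profit $$$
    print(f"${hourly_cost:.3f}/hr  ->  ${annual_cost:,.0f}/yr")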

You not only have to account for the electricity you use to power the systems, but also the electricity used to cool them.

You computed the cooling you need, not necessarily the cooling you provided. These can be two separate numbers, but if your ambient temperature was around 70°F they are close enough.

Solution 2

Here's the easy way. You can do a lot of fancy math, but in the final analysis, the amount of electricity it takes to cool the equipment is equal to the amount needed to power it. There is a solid scientific rationale behind this in the bowels of APC's web site, if you're curious.

So, your 880 W load with cooling would be 1,760 W total. From here it's clear sailing. Let's say your kilowatt-hour (kWh) rate from the utility company is $0.17/kWh.

Your annual load will be 1,760 watts * 24 hrs/day * 365 days/yr / 1,000 watts/kW = 15,418 kWh per year.

15,418 kWh * $0.17/kWh = $2,621/yr
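The same rule of thumb as a quick sketch (the $0.17/kWh rate is just an example):

    # Solution 2 in code: cooling draw ~= IT load, so double the wattage
    # and price it out at your utility rate.
    it_load_w = 880
    rate = 0.17                                    # $/kWh, example rate
    annual_kwh = 2 * it_load_w * 24 * 365 / 1000   # ~15,418 kWh/yr
    print(f"{annual_kwh:,.0f} kWh/yr -> ${annual_kwh * rate:,.0f}/yr")   # ~$2,621/yr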

Solution 3

You need to know how much your AC unit draws, in watts per ton or watts per hour, to calculate your cooling costs.

You may not have easy access to that data. If you're shopping around, you can usually find the draw of an AC unit on its spec sheet. Then you just multiply it out to get the cost over an hour, a week, or whatever period you like.

Ambient air temp in your data center is almost irrelevant unless you're talking about a tiny number of servers. When our AC fails, our room goes from 68°F to 90°F in about 20 minutes. If your ambient air temp is a frosty -10°F then sure, it can be helpful (that's how we replaced our AC units last winter, by piping in Minnesota January air).

I am guessing you're trying to put an ROI on your virtualization, to prove it is cost effective. So calculate the cooling load of your virtual server, which I presume you added in order to retire the physical ones. Subtract that from the cooling load you calculated above for the servers you removed. Now you know your net gain in cooling load.

Then calculate the draw of your AC unit per ton or per hour, and use that to figure out your cost in dollars per ton or hour.
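A rough sketch of that ROI math follows; the virtual host's extra draw and the AC's kW-per-ton figure are made-up examples here, so substitute the numbers from your own logs and spec sheet:

    # Rough sketch of the net-gain calculation above. The virtual host draw
    # and AC efficiency are hypothetical example values, not measured data.
    BTU_PER_WATT_HOUR = 3.413
    BTU_PER_TON_HOUR = 12000.0

    removed_servers_w = 880.0     # load of the physical boxes you retired
    virtual_host_added_w = 250.0  # hypothetical extra draw of the new virtual host
    ac_kw_per_ton = 1.2           # hypothetical AC draw per ton (check the spec sheet)
    rate_per_kwh = 0.17           # $/kWh, example rate

    net_load_w = removed_servers_w - virtual_host_added_w
    net_tons = net_load_w * BTU_PER_WATT_HOUR / BTU_PER_TON_HOUR  # net cooling load
    cooling_kw = net_tons * ac_kw_per_ton                         # AC draw avoided

    annual_savings = cooling_kw * rate_per_kwh * 24 * 365
    print(f"net cooling load {net_tons:.2f} tons, ~${annual_savings:,.0f}/yr saved on cooling")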

Solution 4

0.880 kW * rate ($/kWh) = $/hr

So at $0.10/kWh, you'd be spending 8.8 cents per hour.
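As a quick sketch of the same arithmetic:

    # Solution 4: kilowatts times your rate gives dollars per hour.
    kw = 0.880
    rate = 0.10                      # $/kWh, example rate
    print(f"${kw * rate:.3f}/hr")    # $0.088/hr, i.e. 8.8 cents per hour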



Comments

  • Kara Marfia
    Kara Marfia almost 2 years

    I'm attempting to figure out the power/cooling draw for some old servers that I'd virtualized.

    Starting with info in this question, I've taken the 880W the old boxes appeared to be drawing (according to the APC logs), and gotten 2765 BTUs/hr, or 1 ton of cooling every 4.34 hours.

    At this point I'm scratching my head as to how to figure the cost for cooling. I'm sure that depends on what I'm using to cool. I'd imagine that ambient temperature counts for something at this point, but I'm not sure if it's significant.

    [edit] What I'm missing - I believe - is any idea of what 1 ton of cooling costs. Is that a sensible thing to shoot for, or am I barking up the wrong tree? [edit]

    In any case, any pointers on what info to gather next, what to do with it, or what's wrong with the above figuring (if applicable) is most welcome.

    • Admin
      Admin about 15 years
      For the curious, my 4-server consolidation cut power usage by 79%, using the methods below. Replacing 10 year old hardware makes for some pretty results.
  • Kara Marfia
    Kara Marfia about 15 years
    Hrm, I started with 880W, and took Watts * 3.143 to get BTU, then 1 ton of cooling per 12,000 BTU to find out (I thought) how much cooling was required to offset the heat generated. Did I get turned around?
  • Joseph Kern
    Joseph Kern about 15 years
    Ah. I see. Updated my post, a little more clear. I hope.
  • Kara Marfia
    Kara Marfia about 15 years
    Lack of experience on my part, not clarity on yours, I trust. ;) The conversion from watt -> BTU -> back to watt threw me for a loop, but I'm converting from Watt-hours to Watts. Thanks for the guidance!
  • Laura Thomas
    Laura Thomas about 15 years
    Well that would calculate the amount of power saved by virtualizing the servers, but wouldn't account for the cost in reduced cooling needs for the room.
  • Lance Roberts
    Lance Roberts about 15 years
Actually, it would, as per Scott's post. Joseph's post puts the total cost together well (and your post put it well also).
  • AaronLS
    AaronLS over 14 years
    How do you come up with the BTU you use in step 1? I am trying to calculate the cooling cost of a single HDD.
  • Joseph Kern
    Joseph Kern over 14 years
    You need to take the Watts needed for the operation of the device. 1 watt-hour is equal to 3.413 BTU ... click the link in the first step (scroll down a bit).
  • AaronLS
    AaronLS over 14 years
That calculator is just converting the units, which is like saying 100% of the wattage is converted to heat; and converting back from BTU to watts in step one is like saying you have a perfectly efficient A/C unit. Since you are just converting the units, why not use the original wattage of the device in step 2? We could consider using a typical SEER rating to get a better idea of the cooling cost.
  • Hanno Fietz
    Hanno Fietz about 14 years
    And actually, the scientific rationale behind it is just as easy: energy is never destroyed, or lost, so all you do is bring in energy as electricity and take it out as heat, after it has done its work.
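Following up on the SEER idea raised in the comments, here is a hedged sketch that divides the heat load by an assumed efficiency rating instead of assuming cooling power equals IT power; the SEER of 13 and the $0.17/kWh rate are example values only:

    # SEER = BTU of heat removed per Wh of electricity the AC consumes.
    BTU_PER_WATT_HOUR = 3.413

    it_load_w = 880.0
    heat_btu_per_hr = it_load_w * BTU_PER_WATT_HOUR  # all IT power ends up as heat
    seer = 13.0                                      # assumed efficiency rating
    cooling_w = heat_btu_per_hr / seer               # ~231 W of AC draw

    rate = 0.17                                      # $/kWh, example rate
    annual_cost = (it_load_w + cooling_w) / 1000 * rate * 24 * 365
    print(f"cooling draw ~{cooling_w:.0f} W, total ~${annual_cost:,.0f}/yr")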