Metrics & Measurements

White Paper


Comment Policy

The comment option is provided for feedback on this particular deliverable. Any advertisement of products will be removed.


WP#06-The Green Grid Data Center Power Efficiency Metrics: PUE and DCiE

Posted In:

Metrics & Measurements





_______________________________________________

Please note: PUE: A Comprehensive Examination of the Metric supersedes all prior white papers and consolidates all of the new and previously published content that The Green Grid has developed related to PUE.

Editors:
Christian Belady, Microsoft

Contributors:
Andy Rawson, AMD
John Pflueger, Dell
Tahir Cader, Spraycool

The Green Grid is an association of IT professionals seeking to dramatically raise the energy efficiency of data centers through a series of short-term and long-term proposals. This paper updates the very first white paper published by The Green Grid in February 2007, "Green Grid Metrics: Describing Data Center Power Efficiency," to refine the nomenclature and intent of that paper. In that paper, The Green Grid proposed the use of Power Usage Effectiveness (PUE) and its reciprocal, the Data Center Efficiency (DCE) metric, which enable data center operators to quickly estimate the energy efficiency of their data centers, compare the results against other data centers, and determine whether any energy efficiency improvements need to be made. Since then, PUE has received broad adoption in the industry, but DCE has had limited success due to misconceptions about what data center efficiency really means. As a result, this paper reaffirms the use of PUE but redefines its reciprocal as data center infrastructure efficiency (DCiE). This refinement avoids much of the confusion around DCE, and the metric will now be called DCiE.
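
For reference, the two metrics discussed in this paper are defined as:

PUE = Total Facility Power / IT Equipment Power
DCiE = 1 / PUE = IT Equipment Power / Total Facility Power x 100%

As a worked example with hypothetical figures, a facility drawing 2 MW in total to support 1.25 MW of IT equipment would have PUE = 2 / 1.25 = 1.6 and DCiE = 1.25 / 2 = 62.5%.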

 

PUE clarification

Is PUE also applicable to telecom equipment? Can I estimate what my new PUE would be if I upgrade my CRACs?
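
One rough way to estimate the effect of a CRAC upgrade, sketched below with entirely hypothetical figures, is to subtract the measured energy of the existing CRACs from the facility total, add the projected energy of the replacements, and recompute the ratio:

    # Hypothetical sketch: projecting a new PUE after a CRAC upgrade.
    # All figures are made-up annual energies (kWh), not measured data.
    def estimate_new_pue(total_kwh, it_kwh, old_crac_kwh, new_crac_kwh):
        # Swap the old CRAC energy for the projected new CRAC energy,
        # then recompute PUE = total facility energy / IT equipment energy.
        return (total_kwh - old_crac_kwh + new_crac_kwh) / it_kwh

    print(estimate_new_pue(1800000, 1000000, 500000, 350000))  # 1.65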

Posted at 12:22 AM on May 23, 2013 by roger amper

RE: EUE instead of PUE

Hello DK,

Thank you for your post. We understand that the air conditioning load can be a large portion of a data center's energy use. Please see the subsequent PUE-related white papers, specifically White Paper #22, "Usage and Public Reporting Guidelines for The Green Grid's Infrastructure Metrics (PUE/DCiE)," for further direction; there The Green Grid recommends that PUE be measured in energy. Although the metric has been named Power Usage Effectiveness since the original publication, the measurement can be performed in energy or power depending on one's need. Provided the proper nomenclature is used as outlined in White Paper #22, everyone can distinguish between a PUE measured in energy and one measured in power.

Thank you,
TGG
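
As a minimal illustration of measuring PUE in energy rather than power (the hourly samples here are hypothetical, and equal-interval metering is assumed):

    # Energy-based PUE: accumulate power samples over time for both loads,
    # then take the ratio of the two energies.
    facility_kw = [95, 102, 110, 98]  # total facility power each hour (kW)
    it_kw = [50, 55, 60, 52]          # IT equipment power, same hours (kW)

    # With one-hour samples, summing the kW readings yields kWh directly.
    facility_kwh = sum(facility_kw)   # 405 kWh
    it_kwh = sum(it_kw)               # 217 kWh

    print(round(facility_kwh / it_kwh, 2))  # energy-based PUE: 1.87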

Posted at 09:11 AM on September 20, 2012 by TGG Admin

EUE instead of PUE

It is worth noting that a major component of data center load is the air conditioning load, which depends on the climatic zone (i.e., ambient temperature) and varies from season to season and over the course of a day. Considering this, we might adopt Energy Utilization Effectiveness: EUE = Total facility energy consumption / Exchange equipment energy consumption. Considering energy consumption instead of power takes the time factor into account, since a power measurement is a single-point efficiency measurement.

Posted at 03:58 AM on September 19, 2012 by D K SATHE ...... dksathe@gmail.com

IT versus accountants, who will win?

So when you get down to the nub of it, it's about what temperature to operate IT equipment at. Getting the most bang for the buck from HVAC equipment, including plate-and-frame heat exchangers, heat recovery chillers, etc., is very old news to engineers. What's new is operating IT equipment at 95 degrees F and above, in fact as high as 115 degrees F. IT people used to tell engineers and plant operations personnel that operation above the usual ambient temperature, say 68 degrees F, caused unacceptable increases in data processing errors and shorter MTBF. Design engineers had a tough time designing to those temperatures when loading per rack started creeping well above 3 or 4 kW. As power density increased, even with 55 to 60 degree F inlet temperatures at the bottom of the rack, air discharge temperature at the top of the rack could reach 100 F.

Beyond conventional hot aisle-cold aisle HVAC installations, in-row and in-rack cooling, including circulating liquid systems, have become the drastic measures necessary to meet traditional target expectations. This can compromise reliability due to failure of compartmentalization, among other factors, but that's the price that's often paid. Are facilities owners aware of the tradeoff, or are they mesmerized by operational cost savings? Now, with IT equipment operating at much higher temperatures, what's acceptable to data hardware personnel is OK for facilities design engineers too, so long as we don't have to be the ones to work in those rooms once they're turned on.

The term efficiency used as the reciprocal of PUE is a most unfortunate perversion of the term. Engineers think in terms of efficiency of HVAC systems compared to a Carnot cycle, the most efficient system possible. Here the term refers to power expended on non-IT uses compared to power required for IT equipment. Without much more detailed information to explain it, such as the acceptable maximum in-rack operating temperature and detailed climatic data, it is impossible to legitimately compare one data center's PUE with another's. Correction factors normalizing measured PUE to a fixed reference by negating these variables would make the measurement much more useful than it is so far.

Posted at 01:33 PM on April 05, 2012 by Mark, Electrical Engineer


PUE levels not reported

In the data center literature addressing energy efficiency that I have reviewed thus far, the PUE level is generally not indicated. So unless the level can be determined from the text, it is not known what the level is. As such, PUE can only be taken as a relative measurement within the context of the reporting organization. Comparing data centers to each other becomes impossible, which defeats one of the purposes of developing a standard.

Posted at 08:26 AM on December 01, 2011 by Robert Lundquist, MnTAP


Games with numbers

You can redefine terms to play games all you like, and governments and industry can fool themselves and many others, but until you can repeal the laws of thermodynamics or change the methods accountants use to determine whether or not an investment makes good business sense, you haven't really done anything of value. The facts remain the facts, and engineers and accountants who know how things really work won't be fooled for one second when they care to take the trouble to examine bogus claims. There are no free lunches. Compromising the acceptable criteria for maintaining the data center envelope does not constitute increased energy efficiency; it just makes the operation cheaper and the data center less reliable.

Posted at 06:20 AM on November 22, 2011 by Mark

Re: PUE=1.0

Hello Mark, Thank you for your concerns. PUE has been widely adopted globally and continues to be valuable in the industry. PUE is one of many metrics but it is the only one consistently used by companies, industry groups, and government agencies to measure the IT to Infrastructure ratio of data center energy consumption. The Green Grid is dedicated to finding the best methods to drive the industry toward energy efficient IT and to be the global authority on resource efficient data centers and business computing ecosystems. Thank you, The Green Grid Administration

Posted at 09:36 AM on October 26, 2011 by kavi\admin

PUE=1.0

You can get PUEs of 1.0 IF... you build your data center out in the open in Antarctica. You'll get 100% free cooling all year round and sufficient ambient light for half a year. For the other half, when it's night, you can issue miner's helmets with battery-operated lamps. Just pray it doesn't snow and the ice doesn't melt. Clearly, PUEs are strongly related to the extent of free cooling used and the local climate. The only other significant factor is heat gain or loss at the envelope perimeter. Improvements in the real efficiency of mechanical cooling systems, as scientifically defined, only offer thermodynamically marginal gains.

Posted at 02:29 PM on October 12, 2011 by Mark, Electrical Engineer

Calculation of PUE

The solar energy would not reduce the PUE as described. The solar energy would be part of the total energy and would be listed in the numerator. As an example, if a DC procures electricity from the grid and solar energy from an on-site implementation, the total energy for the PUE numerator would be total electricity from the grid + total electricity from solar. The reason total energy includes all energy inputs is the intent of the PUE metric: it is not intended to evaluate how efficiently one brings energy to the data center; it evaluates how efficiently the energy is used within the data center.
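
As a hypothetical numerical illustration of this rule: if a data center draws 800 MWh from the grid and 200 MWh from on-site solar in a year, and its IT equipment consumes 600 MWh, then PUE = (800 + 200) / 600 = 1.67; subtracting the solar energy instead would understate the infrastructure overhead.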

Posted at 12:54 PM on June 13, 2011 by Dan Azevedo

calculation of PUE

If a DC has a solar panel that can provide power to the DC, will the energy generated from the solar panel reduce the PUE number? In other words, is this formula valid? PUE = (total energy used - solar energy generated) / (IT energy used)

Posted at 11:41 AM on June 13, 2011 by vincent vliu@telamon.com

" Is PUE of a Data Center affected by redundancy "

Will the PUE of a Data Center change or affected if instead of One source of Power , we are using a redundant source of Power, like Feed A and Feed B ?

Posted at 12:13 AM on April 20, 2011 by DD

Re-using Sound Energy

The best thing I know of for converting sound (vibration) energy into electricity is piezoceramic. It is used in the wing tips of F-16 fighters and the ski tips of K2 ModX skis to dampen vibrational energy and convert it to electricity or heat. However, even as noisy as data centers sometimes are, very little of the total energy used in a data center is lost to noise. Think of it this way: it doesn't take much electricity to generate a lot of noise with a stereo. So even though it could be made to work, it probably wouldn't be the most effective use of the investment.

Posted at 12:15 PM on February 15, 2011 by Lance

Additional info on power efficiency

Thank you for your question relating to cooling efficiency in the data center. The white paper refers to a metric called PUE (or DCiE). The metric is defined as: PUE = Total Facility Power / IT Power. Many data centers are coming in with PUEs around 2.04 (the latest EPA data). This means that for every 1 W consumed at the IT equipment, e.g., a server, an additional 1.04 W is consumed to deliver power to and cool that piece of IT equipment. It gets a little more complicated to separate the power distribution losses from the cooling system power consumption, but this example gives the general idea of what is involved. Of the additional 1.04 W, most of this power will go to running the cooling system.

Posted at 11:59 AM on August 21, 2009 by Tahir Cader, HP, member of The Green Grid

Becoming a Member

Thank you for your interest in joining The Green Grid. This is an exciting time to get involved in The Green Grid and support our efforts to improve energy efficiency in data centers. Please visit the Become a Member tab to learn more on how to join. We have also sent you a follow-up email for more information. Again, thank you for your interest in engaging with The Green Grid. We look forward to working with you and are happy to answer any questions you may have about The Green Grid’s initiatives and membership in the organization.

Posted at 02:09 PM on July 27, 2009 by The Green Grid Administration

PUE calculation with CHP

Hello, I would like to know how to perform the PUE determination for a data center with CHP (combined heat and power) associated. Thanks

Posted at 12:22 PM on February 19, 2010 by Alvaro Santos

PUEs for Enterprise Data Centers

Thank you for your question relating to PUE values for enterprise data centers. Assuming that the sites reporting these values are following the reporting and usage guidelines recommended by The Green Grid (see the associated white paper on The Green Grid website), values as low as 1.2 are achieved in a number of ways:

1. Free air cooling, also implemented as air-side economization.
2. Water-side economization.
3. A highly efficient power distribution path, e.g., highly efficient UPSs.

In general, being able to cool without the benefit of a chiller seems to have one of the largest impacts on driving PUE down. Of course, there are a number of other practices that can be implemented, including water cooling. The latest EPA data for PUE shows an average PUE of 2.04. Unfortunately, I have not seen a breakdown of the data center Tier levels for the data reported by the EPA.

Posted at 04:04 PM on August 28, 2009 by Tahir Cader, HP, member of The Green Grid.

information

Hello, I am Hamed Salehi. I live in Asia. I need to know how I can join as a Grid member, and how much do I earn? www.hamedsalehii206@gmail.com Please answer me as soon as possible.

Posted at 08:04 AM on July 14, 2009 by hamed salehi

what is best efficiency

I was reading another site which indicates that some of the top enterprise-class data centers may have a PUE as low as 1.2. Is that a realistic number? If so, are these data centers using some new type of water cooling technology that is more energy efficient? The EPA predicts that data center power usage may double between 2006 and 2011. That EPA report is a couple of years old. Does it still hold, or are we more efficient than that now? Thanks,

Posted at 03:02 PM on August 28, 2009 by Joyce Johnson

additional info on power efficiency

Just for my benefit, it would be interesting to see how much energy is expended to cool a data center. For instance, if the actual work from a server requires 1 W of power, how many additional watts are needed to remove the heat dissipated from that particular server? Thank you

Posted at 09:49 AM on August 20, 2009 by Joyce Johnson

Independent Consultant

Raising the data center dry-bulb set point theoretically improves the PUE in two ways. It reduces the energy used by the cooling system to meet the less stringent requirement, and it increases the IT power draw by making the server fans operate at higher speed to keep internal temperatures within acceptable limits. Both improve PUE, but one reduces energy consumption and the other increases it.
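
A hypothetical worked example of this effect: a facility with 100 kW of IT load and 60 kW of cooling and distribution load has PUE = 160 / 100 = 1.6. After the set point is raised, faster server fans push the IT load to 110 kW while the cooling load falls to 55 kW, giving PUE = 165 / 110 = 1.5; the PUE improves even though total consumption rises from 160 kW to 165 kW.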

Posted at 03:55 PM on March 25, 2010 by Glen Goss

effectiveness vs. efficiency

What is the difference between effectiveness and efficiency?

Posted at 10:20 AM on February 25, 2010 by

vision

I'd like to know the difference between a mission and a vision.

Posted at 10:25 AM on February 25, 2010 by

Cambridge Elean Data Campus a Vision of the Future

At Cambridge we are building 700,000 sq ft (65,000 sq m), and by utilising existing technology, Elean reduces the energy footprint from 65 MW to 34 MW (52%) and actual utility bills by 27%, guaranteed for 25 years. The project will be exempt from CRC taxation, which further increases the energy cost savings. This is achieved by on-site combined heat and power, with all base chilled water generated by absorption cooling. Through this unique model the site PUE will be below 1. Energy cost savings across the site will exceed £1.4B over 25 years, which makes no allowance for the proposed CRC levy. See eleandata.com for technical information.

Posted at 01:54 AM on October 21, 2009 by ALEX J Arthur CEO BNB Developments Ltd

Reusing sound energy in a data center

I would like to know if there are ways in which we can reuse the sound energy in data centers. Maybe use a transducer or nanotechnology to convert the sound energy into electrical energy?

Posted at 06:43 AM on September 09, 2009 by Jyothi
