Power is an essential part of data center operations: data centers must make sure enough power is always available. This has always been a challenge, but it has grown as data center operators must now also contend with delivering far more power per square foot of data center space -- and delivering it more efficiently. As a manufacturer of integrated data center power monitoring systems, we find ourselves fielding many questions about electrical power distribution. In this three-part series we will cover everything you need to become a well-informed consumer of electrical power.
- In Part 1 we’ll cover the basics of electricity, Volts, Amps, Watts, Watt-hours and of course dollars.
- In Part 2 we’ll cover AC vs. DC and the mysteries of the power factor.
- In Part 3 we’ll talk about one-phase vs. three-phase circuits.
Know your Power, Part 1 - The Basics
Electricity is the common name for electrical energy. What it "really" is, is the flow of electrons through a conductor, usually a copper wire. Whenever electricity flows to a device, the same amount has to flow back - it is a "closed loop" system. Electrons in a wire actually move quite slowly (NOT at the speed of light); signals DO travel at (close to) the speed of light. Imagine a 100ft pipe filled with water: when you open a valve on one end, water almost immediately flows out the other end, even though no single drop of water has traveled 100ft - the pressure wave has.
Voltage
Measured in Volts (V), after Alessandro Volta. This is the "pressure" of electricity. Data centers typically draw power from the utility grid at high voltage, typically 480V, which must then be transformed to a lower voltage for use by the IT equipment. In North America, most IT systems found in a data center use 110V, 208V or 220V, while in much of the rest of the world 220V to 240V is more common. Voltages within about 10% are used interchangeably, so you may hear the same installation described as 110V, 115V or 120V. Voltage alone, just like water pressure, does not really tell you how much "work" (power) a system can deliver. Imagine a tiny tube: it could deliver water at tremendous pressure, but you couldn’t use it to power a waterwheel.
Current
Measured in Amps (A; Amperes, after André-Marie Ampère). This is the "flow rate" of electricity - how many electrons per second flow through a given conductor. Just like with water, this tells a story of volume, but not pressure, so on its own it doesn’t tell the full story of power. Imagine a big pipe: a lot of water could flow through it, but the energy it carries depends on its pressure. Higher currents require thicker, more expensive cables. The main power feed to a data center can be thousands of Amps; it is then distributed and reaches the individual cabinet at 20A to 63A.
Power
Measured in Watts (W), after James Watt. This is the useful work being done by electricity (just like horsepower). Watts reflect work being done at a given moment, NOT the energy consumed over time. Power in Watts is calculated by multiplying voltage in Volts times current in Amps: 10 Amps of current at 240 Volts generates 2,400 Watts of power. Notice that this means that the same current can deliver twice as much power if the voltage is doubled: that’s one of the reasons higher-voltage data center power is becoming more popular.
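The power calculation above can be sketched in a couple of lines of Python (the function name is our own, for illustration):

```python
def power_watts(volts, amps):
    """Power (Watts) = Voltage (Volts) x Current (Amps)."""
    return volts * amps

print(power_watts(240, 10))  # 2400 W
print(power_watts(480, 10))  # 4800 W: double the voltage, double the power at the same current
```

This is the simple DC formula; as we’ll see in Part 2, AC power also involves the power factor.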
Power consumed (i.e. Energy)
Measured in Watt-hours (Wh). A Watt-hour is the amount of work done (i.e. energy released) by applying a power of 1 Watt for 1 hour. A 100 Watt light bulb left on for 10 hours will consume 1,000 Wh (or 1 kWh) of energy.
You generally pay for power by the kilo-Watt-hour (kWh), or 1,000 Wh. The cost in the U.S. ranges from $0.03 to $0.20 and above per kWh, and is much higher than that in many parts of the world. A server that uses 500W running for a year will consume 500W * 8,760 hours (in a year) = 4,380,000 Wh = 4,380 kWh. If you are paying $0.20 per kWh, that will cost you 4,380 kWh * $0.20/kWh = $876 per year. This of course doesn’t include the cost of cooling this server.
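The annual-cost arithmetic above can be written out as a short Python sketch (function and variable names are our own):

```python
HOURS_PER_YEAR = 24 * 365  # 8,760 hours

def annual_energy_kwh(watts):
    """Energy consumed in a year, converted from Wh to kWh."""
    return watts * HOURS_PER_YEAR / 1000

def annual_cost(watts, dollars_per_kwh):
    """Yearly electricity cost for a device drawing a constant load."""
    return annual_energy_kwh(watts) * dollars_per_kwh

print(annual_energy_kwh(500))      # 4380.0 kWh
print(annual_cost(500, 0.20))      # 876.0 dollars per year
```

A real server’s draw varies with load, so this constant-wattage figure is an upper-bound estimate rather than an exact bill.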