We live in a world of electrical power. It runs our lights, heating, cooling, computers, and machinery. Consider a data center or any other large, power-hungry facility: it cannot function without power, and it must ensure enough power is always available. But power isn't free. Data center managers track power closely because the cost of the energy a server consumes over its useful life routinely exceeds its purchase price -- and most data centers spend twice that amount again to cool the servers and remove heat from the facility.
Here's a quick review of the basics of electricity -- Volts, Amps, Watts, and Watt-hours -- along with the cost figures at the bottom that turn these basics into an essential part of any critical facility manager's job.
Electricity is the common name for electrical energy. Technically, electricity is the flow of electrons through a conductor, usually a copper wire. Whenever electricity flows to a device, the same amount must flow back: it is a "closed loop" system. Electrons in a wire actually move quite slowly, nowhere near the speed of light, although electrical signals do travel at close to the speed of light.
Water pipe analogy for understanding electricity
Imagine a 100-foot pipe, filled with water: when you open a valve on one end, water almost immediately flows out the other end, even though no drop of water has traveled the full 100 feet. The pressure wave, however, has traveled 100 feet.
Voltage
Measured in Volts (V), after Alessandro Volta. Voltage is the "pressure" of electricity. Data centers typically take power from the utility at a high voltage, commonly 480V, which must then be transformed down to a lower voltage for use by the IT equipment. In North America, most IT systems in a data center run at 110V, 208V, or 220V; in much of the rest of the world, 220V to 240V is more common. Voltages within about 10% of each other are used interchangeably, so you may hear the same installation described as 110V, 115V, or 120V.
Electrical voltage, just like water pressure, does not really tell you how much "work" (power) a system can deliver. Imagine a tiny tube: it could deliver water at tremendous pressure, but you couldn’t use it to power a water wheel.
Current
Measured in Amps or Amperes (A), after André-Marie Ampère. Current is the "flow rate" of electricity: how many electrons per second flow through a given conductor. Current describes volume but not pressure, so on its own it doesn't tell the full story of power.
Imagine a big water pipe: a lot of water could be flowing through it, but the energy it carries depends on its pressure. Higher currents require thicker, more expensive cables. The main power feed to a large industrial facility can be thousands of Amps; in a data center, that gets distributed so that by the time it reaches a rack of servers it is typically 20A to 63A.
Power
Measured in Watts (W), after James Watt. Power is the useful work being done by electricity at a given moment, NOT the energy consumed over time. Power in Watts is calculated by multiplying voltage in Volts times current in Amps: 10 Amps of current at 240 Volts delivers 2,400 Watts of power. This means the same current can deliver twice as much power if the voltage is doubled -- one reason for the growing demand for higher-voltage transmission lines, which help make renewable sources like solar and wind more viable, and why data centers are also moving toward higher-voltage configurations. Power can also be measured as "real" and "apparent" power, related by a "power factor" that converts one to the other.
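The Watts = Volts x Amps relationship is easy to check with a few lines of Python, using the figures from this article (this assumes a purely resistive load, i.e. a power factor of 1):

```python
def power_watts(volts, amps):
    """Power (W) = voltage (V) x current (A), assuming a power factor of 1."""
    return volts * amps

print(power_watts(240, 10))   # 2400 W, the example above
print(power_watts(480, 10))   # 4800 W: doubling the voltage doubles the power
```

Note that the current stayed at 10 Amps in both cases; the higher-voltage feed delivers twice the power over the same size cable, which is exactly why higher voltages are attractive.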
Power consumed (i.e. Energy)
Measured in Watt-hours (Wh). A Watt-hour is the amount of work done (i.e. energy released) by applying a power of 1 Watt for 1 hour. A 100 Watt light bulb left on for 10 hours consumes 1,000 Wh (or 1 kWh) of energy.
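The light bulb example works out like this (a minimal sketch; the function name is just for illustration):

```python
def energy_wh(watts, hours):
    """Energy (Wh) = power (W) x time (h)."""
    return watts * hours

bulb = energy_wh(100, 10)          # 100 W bulb left on for 10 hours
print(bulb, "Wh =", bulb / 1000, "kWh")   # 1000 Wh = 1.0 kWh
```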
You generally pay for power by the kilowatt-hour (kWh), or 1,000 Wh. Rates in the U.S. range from $0.09 to $0.20 or more per kWh, and are much higher in many other parts of the world. You can do the math on what your facility is spending. Here are a few examples.
First, a computer server that draws 500W running for a full year will consume 500W x 8,760 hours = 4,380,000 Wh = 4,380 kWh. If you are paying $0.10 per kWh, the cost to run the server is 4,380 kWh x $0.10/kWh = $438 per year. This doesn't include the cost of cooling the server, which may double or even triple your overall annual cost.
Second, consider a cannabis grow facility. An electricity trade organization in Washington State estimated it takes 2,000 to 3,000 kWh to power the lights needed to produce one pound of product. At $0.10 per kWh, that works out to $200 to $300 in electricity per pound.
Finally, let's look at crypto mining. It takes progressively more energy to mine each Bitcoin. As of August 2021, one estimate put the electricity usage at 143,000 kWh per coin. At a rate of $0.10 per kWh, that works out to $14,300 in electricity for each Bitcoin mined. The figure varies depending on the type of machine doing the computations.
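All three examples above are the same calculation: energy in kWh times the rate per kWh. A short Python sketch, using the illustrative $0.10/kWh rate from this article:

```python
RATE = 0.10  # $ per kWh -- the illustrative rate used in the examples above

def cost(kwh, rate=RATE):
    """Dollar cost of consuming a given number of kWh."""
    return kwh * rate

# Server: 500 W running around the clock for a year
server_kwh = 500 * 8760 / 1000                     # 4,380 kWh
print(f"Server: ${cost(server_kwh):,.0f} per year")        # $438

# Grow lights: 2,000 to 3,000 kWh per pound of product
print(f"Lights: ${cost(2000):,.0f} to ${cost(3000):,.0f} per pound")

# Bitcoin: ~143,000 kWh per coin (August 2021 estimate)
print(f"Bitcoin: ${cost(143000):,.0f} per coin")           # $14,300
```

Swap in your own utility rate and loads to estimate what your facility is spending.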
There's no quiz at the end to test your knowledge. But paying attention to the basics can help avoid unpleasant surprises. Packet Power makes it easier and more affordable for critical facilities managers to track and analyze energy usage. Send us an email to learn how we can help you with your needs.