Every time we use our cell phone or personal computer to query the internet, stream a movie or retrieve photos from the cloud, we’re asking a worldwide network of high-speed computers to locate and deliver data instantly to our device.
Unfortunately, the digital infrastructure that performs these tasks — thousands of computers housed in data centers (aka server “farms”) — also consumes enormous amounts of electricity, much of it from non-renewable sources. To shrink this carbon footprint and create a more sustainable future, large data center owners including Amazon, Facebook and Google are creating green data centers.
How Data Centers Work
A modern data center typically houses thousands of powerful, compact computer servers networked together. This network collects, stores, processes, distributes and allows access to large amounts of data. If you have personal data “in the cloud,” that data is actually stored on a physical disk drive in one of these data centers.
A metric commonly used to measure the energy efficiency of a data center is its power usage effectiveness (PUE): the ratio of the facility's total power consumption to the power consumed by its computing infrastructure alone. PUE was developed by the Green Grid, a consortium of policymakers, technology providers, facility architects and utility companies dedicated to improving IT and data center efficiency.
In a perfect world, PUE would be 1.0, meaning that all of the power consumed by a data center is being used to power its servers. In reality, however, a data center must also power the systems that cool its computers and maintain other aspects of its infrastructure, such as facility lighting. According to the Uptime Institute, the average PUE for data centers in 2019 was 1.67. In recent years, Facebook and other large data center operators have reported PUEs as low as 1.1.
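The PUE ratio is simple arithmetic. The sketch below uses hypothetical meter readings to show how a facility matching the Uptime Institute's 2019 average would break down:

```python
# Hypothetical instantaneous meter readings (kW); real facilities typically
# compute PUE from annualized energy totals (kWh) instead.
total_facility_power = 1_670.0   # servers + cooling + lighting + losses
it_equipment_power = 1_000.0     # servers, storage and network gear only

pue = total_facility_power / it_equipment_power
print(f"PUE = {pue:.2f}")

# Overhead fraction: the share of power that never reaches the IT equipment.
overhead = 1 - it_equipment_power / total_facility_power
print(f"Overhead = {overhead:.0%}")
```

With these figures, PUE comes out to 1.67, meaning roughly 40% of the facility's power goes to cooling and other overhead rather than to computing.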
Reducing Energy Consumption
Companies intent on lowering their carbon footprint usually have two major energy goals for their green data centers: (1) reduce the total energy usage of the centers, and (2) increase the amount of renewable energy used by those facilities.
Total energy usage for a data center includes energy consumed by servers and by facility overhead functions such as heating and cooling.
Google has turned to machine learning to help reduce the total amount of energy used to power its data centers. In 2016, a company efficiency expert named Jim Gao teamed up with Google’s artificial intelligence (AI) research group, DeepMind, to develop an algorithm that could optimize the use of energy at various data centers. The algorithm used historical data from data center sensors — temperatures, pump speeds, power levels, setpoints, etc. — to train a collection of deep neural networks to predict the appropriate future temperature and pressure conditions for each facility.
Google claims that its AI algorithm has produced a “40 percent reduction in the amount of energy used for cooling and 15 percent reduction in overall energy overhead.”
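The core idea can be illustrated in miniature. DeepMind's system trained deep neural networks on many sensor streams; the sketch below substitutes a one-variable linear fit over hypothetical readings to show the same pattern: learn from historical telemetry, then predict future cooling load so the facility can plan its energy use.

```python
# Hypothetical sensor history: (outside temperature in deg C, cooling load in kW).
# A stand-in for the deep-neural-network approach described above.
history = [(10.0, 210.0), (15.0, 260.0), (20.0, 305.0), (25.0, 355.0)]

# Ordinary least-squares fit of cooling load against temperature.
n = len(history)
mean_t = sum(t for t, _ in history) / n
mean_kw = sum(kw for _, kw in history) / n
slope = (sum((t - mean_t) * (kw - mean_kw) for t, kw in history)
         / sum((t - mean_t) ** 2 for t, _ in history))
intercept = mean_kw - slope * mean_t

def predict_cooling_kw(temp_c: float) -> float:
    """Predict cooling load for a forecast outside temperature."""
    return intercept + slope * temp_c

print(f"Forecast for 18 C: {predict_cooling_kw(18.0):.1f} kW")
```

The production system replaces this single linear model with neural networks over dozens of inputs, but the workflow is the same: fit on history, predict ahead, and adjust cooling setpoints before conditions change.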
The Department of Energy also recommends several steps data center operators can take to reduce the overall amount of energy consumed by their facilities. These suggestions include:
- Server virtualization (using one physical server to conduct the workload of multiple virtual servers)
- Decommissioning unused servers
- Consolidation of lightly used servers
- Better management of data storage (which reduces the number of servers needed)
- Purchasing more energy-efficient servers and other power equipment
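The payoff from consolidating lightly used servers is easy to estimate. The sketch below uses hypothetical figures and the common rule of thumb that an idle server still draws a large share of its peak power:

```python
# Assumed per-server power draw (hypothetical figures).
IDLE_KW, PEAK_KW = 0.25, 0.50

def server_kw(utilization: float) -> float:
    """Linear power model between idle and peak draw (a common approximation)."""
    return IDLE_KW + (PEAK_KW - IDLE_KW) * utilization

# Ten servers at 10% utilization vs. the same workload on two at 50%.
before = 10 * server_kw(0.10)
after = 2 * server_kw(0.50)
savings = 1 - after / before
print(f"{before:.2f} kW -> {after:.2f} kW ({savings:.0%} saved)")
```

Because idle machines waste so much of their draw, packing the same work onto fewer, busier servers cuts total power by well over half in this example.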
Shrinking Water Use
A sustainable future for green data centers also relies on optimizing the use of water involved in creating proper temperature- and humidity-controlled environments for servers.
Historically, data centers have used air- or water-cooled chillers to create the cool air that circulates among racks of servers. This approach, however, is very power- and water-intensive. Chillers, for example, rely on refrigerant and electricity-hungry compressors to create a supply of cool water. A computer room air handling system blows warm air over coils filled with this water, cooling the air to the desired temperature. The heat transferred by the air to the chilled water is ultimately released to the external environment via one or more cooling towers.
By contrast, Facebook uses a more water- and power-efficient approach called evaporative cooling to cool many of its data centers. Evaporative cooling blows fresh, outside air through a water-soaked membrane. The incoming air warms the water in the membrane, turning it into vapor. This transfer of energy cools and humidifies the incoming air, which is then directed to the data center’s air handling system. Facebook claims that evaporative cooling techniques allow its data centers to use about 50% less water than a typical data center.
In more severe environments — locations where the incoming air could introduce high levels of dust, extreme humidity or high levels of salinity, all of which could contaminate the server environment — Facebook uses a more water-intensive air-cooling process called StatePoint Liquid Cooling to maintain its data centers at the optimum temperature.
Sticking With Recycling
Not all data center operators are as advanced as Facebook in their efforts to drive down PUE. Companies such as Evoque Data Center Solutions, which acquired AT&T’s portfolio of data centers in 2019, are nonetheless conscientious about the water they use to cool their facilities.
“Many of our data centers, which rely on water-cooled chillers, are using gray water as part of the mix of water used in their cooling towers,” said John Diamond, Evoque’s vice president of engineering. “We’ve also engaged a national chemical treatment contractor to help us maximize the number of times we can cycle water through the cooling towers before flushing its chemicals out of the system.”
Water used in cooling towers, he explained, must be periodically treated with chemicals to keep it free of biological and chemical contaminants that could foul cooling tower equipment.
Expanding Access to Renewables
Of course, a PUE of 1.0 can still be damaging to the planet if the power being consumed by server farms is not coming from renewable sources.
Unfortunately, the large energy demands of a typical data center usually make it impractical for its operators to maintain on-site renewable energy farms (wind or solar). And historically, the federal regulatory environment has made it difficult for companies to purchase renewable energy directly from their local utility company.
One way to overcome this regulatory problem is for operators of green data centers to purchase renewable energy at the wholesale level in the same electric grid where a data center resides. The power is then sold back into the local grid in exchange for renewable energy credits. The use of these credits adds renewable energy to the local grid while reducing the carbon footprint of data centers.
Another way to bring more renewable energy to green data centers is through an emerging concept called green energy tariffs. Offered by local utility companies, green tariffs allow a company such as Facebook or Google to effectively invest in a new renewable energy project by agreeing to purchase energy at a specified rate for a specified amount of time. In return, the local utility can direct electricity output from that new wind or solar facility to one or more of its local customers.
Matching Energy Demands
It’s important to point out that green data centers will likely never be powered exclusively by renewable energy. Their high energy consumption and 24/7 workload demand that they stay connected to local, reliable sources of power, whether that power comes from renewable energy sources or not.
In this context, Google claims to have matched 100% of its annual electricity consumption in 2017 and 2018 with purchases of renewable energy. Similarly, Facebook claims that 75% of its power purchases came from renewable sources in 2018 and that it expects that number to reach 100% in 2020, largely by continuing to invest in renewable energy projects located on the same electric grid as its data centers.
Amazon Web Services also claims that more than 50% of its operations were powered by renewable sources of energy in 2018. The company expects to reach 100% renewable energy sources within several years.
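The 100% figures above refer to annual matching: renewable purchases equal total consumption over the year, even though in any given month (or hour) the facility may still draw fossil power from the grid. A sketch with hypothetical monthly totals makes the distinction concrete:

```python
# Hypothetical monthly figures (GWh): a data center's consumption and the
# operator's renewable energy purchases on the same grid.
consumption = [80, 75, 78, 82, 90, 95, 100, 98, 88, 84, 79, 81]
renewables  = [70, 72, 85, 95, 105, 110, 100, 92, 80, 74, 70, 77]

annual_match = sum(renewables) / sum(consumption)

# Months where purchases fell short of demand despite a 100% annual match.
shortfall_months = sum(1 for c, r in zip(consumption, renewables) if r < c)

print(f"Annual match: {annual_match:.0%}, months short: {shortfall_months}")
```

In this example the operator could truthfully claim a 100% annual match even though purchases fell short of demand in seven of the twelve months, which is why data centers stay connected to the local grid regardless.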
Building in Sustainability
Another less discussed aspect of green data centers, but one that goes to the heart of creating a sustainable future, concerns the purchase, utilization and disposal of server hardware. Data centers can house as many as 80,000 servers, each with an average lifespan of three to five years.
Given this math, there are incentives to minimize the amount of non-renewable materials used to produce hardware and to maximize opportunities to recycle or refurbish that hardware when it reaches the end of its useful life within the data center environment.
Facebook takes a holistic approach to reducing the amount of non-renewable, petroleum-based plastics used to produce its computer servers. For example, the company identified a material called natural fiber-filled polypropylene (NFFPP) — a plastic composite manufactured in part from natural, renewable jute fibers — that could be used to produce many small server components such as cable clips and bus bar covers. The company is also conscientious about how it reuses, recycles and remarkets its servers and server components.
Facing Down Climate Change
As society’s demand for data and the forces of climate change continue to grow, so too will the demand for green data centers and their focus on a sustainable future. It may or may not be too late to avoid a showdown with climate change, but every owner of a digital device and every operator of a data center still has the opportunity to make a difference in the fight.
If you’d like to see how green your future could be working for one of the world’s great technology companies, check out Northrop Grumman career opportunities.