Thanks to the DE-CIX Internet exchange, Frankfurt am Main is home to so many data centres that between them they consume 25 percent of the city's total electricity. Overall, data centres account for about 0.6 percent of total energy consumption in Germany, and demand is rising. It is therefore important that cloud providers reduce energy consumption to a minimum, not only because of rising electricity costs, but also to achieve their own carbon-neutral targets.
There are several starting points to help them reach these goals: first, the power efficiency of the IT components used; second, and above all, advanced cooling systems and technologies. As a third component, sophisticated AI and data-analysis technologies can further optimise energy consumption by anticipating upcoming developments (extreme weather events, time-of-day patterns, predictable events). Despite operating far more efficiently in pursuit of those carbon-neutral goals, green data centres do not compromise on performance, reliability or security.
Cooling as a source of savings
When it comes to cooling, there are several approaches to maintaining an optimal temperature. Arguably, it is wasteful to install air conditioning that bluntly brings server rooms down to a preset temperature and simply blows the warm air out into the environment. A better approach, as adopted by Alibaba Cloud's third data centre in Frankfurt, is a free-cooling system that uses naturally cool ambient air instead of artificial cooling.
Other data centres take a different approach: using heat pumps, they capture the waste heat from the servers and use it to heat surrounding buildings or homes. However, there are technical hurdles to overcome, because the exhaust air may be at too low a temperature to be used sensibly.
Raising server room temperatures intelligently
The maximum room temperature also offers potential for optimisation: the higher it is, the less cooling is required. The limiting factor is the equipment itself, which dictates the lowest maximum temperature at which everything will still function properly. If several servers sit close together and generate a lot of heat locally, cooling may have to be ramped up just to keep the local temperature below that limit. By cleverly distributing heat-generating hardware across the data centre (or across multiple rooms), the room temperature can be raised moderately, lowering the required cooling power without affecting equipment functionality. This significantly reduces the overall cooling capacity and environmental load.
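To make the idea concrete, here is a minimal Python sketch of how heat-generating servers might be spread across rooms so that no single room becomes a hot spot. The greedy "hottest server to coolest room" heuristic and the wattage figures are illustrative assumptions, not the placement logic of any particular data centre.

```python
from heapq import heapify, heappush, heappop

def distribute_servers(heat_watts, num_rooms):
    """Place the next-hottest server in the currently coolest room
    (greedy 'largest first' heuristic)."""
    # Each heap entry: (total_heat_watts, room_id, assigned_servers)
    heap = [(0.0, room_id, []) for room_id in range(num_rooms)]
    heapify(heap)
    for heat in sorted(heat_watts, reverse=True):
        total, room_id, servers = heappop(heap)      # coolest room so far
        heappush(heap, (total + heat, room_id, servers + [heat]))
    return sorted(heap, key=lambda room: room[1])

if __name__ == "__main__":
    # Hypothetical per-server heat output in watts.
    servers = [800, 750, 600, 600, 450, 400, 300, 250]
    for total, room_id, assigned in distribute_servers(servers, num_rooms=3):
        print(f"Room {room_id}: {total:.0f} W from servers {assigned}")
```

With these made-up figures the hottest room ends up at roughly 1,500 W instead of the 2,750 W it would carry if the four hottest machines shared one room, which is the effect the paragraph above describes.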
Cooling with outside air is also possible and has already been used in many cases. This is one reason why many data centres are located in Scandinavia: for much of the year the climate there is significantly cooler than in central or southern Europe, and both air and water can be used for cooling.
Liquid cooling tops international rankings
Air is only moderately efficient as a coolant (with or without air conditioning), because conditioning and circulating large volumes of air requires a lot of energy. Liquid coolants work much better. Alibaba Cloud, for example, uses its ‘soaking server’ technology in its data centres, where servers and other IT hardware components are completely submerged in coolant. Liquid cooling is far more efficient than traditional air-conditioning and fan systems and can reduce energy consumption by 70 percent. With these cooling technologies, data centres can achieve a Power Usage Effectiveness (PUE) factor of 1.09. This means that out of 109 kilowatt hours, 100 are consumed by the IT hardware and only 9 by the rest of the technology in the data centre (such as cooling, lighting, heating and security systems). This marks an important technological advance: previously, data centres with a PUE of 1.2 were already considered very efficient.
In addition, companies can use Air Handling Units (AHUs) and air coolers to reduce the data centre's WUE (Water Usage Effectiveness) to as low as 0.45 l/kWh. Compared to conventional cooling-tower systems, this can save more than 80 percent of the water used.
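Both efficiency metrics quoted above are simple ratios, and the arithmetic can be checked in a few lines. The snippet below uses the figures given in the text; the conventional cooling-tower baseline for water usage is back-calculated from the "more than 80 percent" savings claim rather than taken from a published specification.

```python
# PUE: total facility energy divided by IT energy.
# WUE: litres of water consumed per kWh of IT energy.
def pue(total_kwh, it_kwh):
    return total_kwh / it_kwh

def water_savings(wue_new, wue_baseline):
    return 1 - wue_new / wue_baseline

print(pue(109, 100))                   # 1.09: 9 kWh of overhead per 100 kWh of IT load
print(f"{1 - 1 / 1.09:.1%} overhead")  # ~8.3% of total energy is non-IT
print(f"{1 - 1 / 1.2:.1%} overhead")   # ~16.7% for the older 1.2 benchmark

# Assumed cooling-tower baseline of ~2.3 l/kWh (back-calculated, not published).
print(f"{water_savings(0.45, 2.3):.0%} less water")  # roughly 80%
```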
AI and data analysis for sustainable energy optimisation
Innovative AI and data-analytics technologies can forecast, control and manage the CO₂ footprint of data centres with little effort. They make it easy to identify anomalies very quickly and respond in an automated way, either reporting them or, where possible, fixing them directly. With an advanced SaaS tool such as Alibaba Cloud's Energy Expert, which collects and processes real-time data, data centre operators and enterprises can plan CO₂ emissions, set targets and limit environmental impact. Energy Expert also directly helps achieve sustainability certifications recognised by testing institutes such as TÜV Rheinland.
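As an illustration of the kind of automated anomaly detection described here, the following generic sketch flags power readings that deviate sharply from their recent history. It is a simple rolling z-score check on hypothetical data, not the actual logic of Energy Expert or any other product.

```python
from statistics import mean, stdev

def find_anomalies(readings_kw, window=24, threshold=3.0):
    """Flag readings that deviate more than `threshold` standard deviations
    from the mean of the preceding `window` samples."""
    anomalies = []
    for i in range(window, len(readings_kw)):
        history = readings_kw[i - window:i]
        mu, sigma = mean(history), stdev(history)
        if sigma > 0 and abs(readings_kw[i] - mu) > threshold * sigma:
            anomalies.append((i, readings_kw[i]))
    return anomalies

# Hypothetical hourly facility load in kW, with one injected spike at the end.
load = [310, 305, 312, 308, 300, 298, 305, 311, 309, 307, 302, 306,
        304, 310, 308, 305, 303, 307, 309, 306, 304, 308, 305, 307, 480]
print(find_anomalies(load))  # -> [(24, 480)]
```

A real monitoring pipeline would combine such checks with automated responses, as the text describes, for example throttling a faulty cooling loop or opening a ticket.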
Artificial intelligence is also helpful in forecasting a data centre's future energy load, which can fluctuate widely depending on the time of day, the weather and special events (such as shopping and sporting events, festivals and holidays). In day-to-day terms, movies are typically streamed in the evening, at weekends or during holidays, while business applications are more likely to be under load during working hours. Since servers consume significantly more power at the load limit, clever forecasting and automated load balancing can reduce overall power consumption.
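A very simple way to exploit those time-of-day and weekday patterns is to build an average load profile per hour and day type and use it as a baseline forecast. The sketch below does exactly that on hypothetical readings; a production system would of course use far richer models and inputs (weather, events, trends).

```python
from collections import defaultdict
from datetime import datetime

def build_profile(samples):
    """samples: list of (timestamp, load_kw).
    Returns the average load keyed by (is_weekend, hour)."""
    buckets = defaultdict(list)
    for ts, load in samples:
        buckets[(ts.weekday() >= 5, ts.hour)].append(load)
    return {key: sum(vals) / len(vals) for key, vals in buckets.items()}

def forecast(profile, ts):
    """Baseline forecast: the historical average for this day type and hour."""
    return profile.get((ts.weekday() >= 5, ts.hour))

# Hypothetical history: evening streaming peaks vs. business-hour load.
history = [
    (datetime(2023, 5, 8, 20), 420),   # Monday evening
    (datetime(2023, 5, 9, 20), 435),   # Tuesday evening
    (datetime(2023, 5, 9, 10), 510),   # Tuesday business hours
    (datetime(2023, 5, 10, 10), 495),  # Wednesday business hours
]
profile = build_profile(history)
print(forecast(profile, datetime(2023, 5, 15, 20)))  # -> 427.5 kW expected
```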
Another factor is the fluctuating output of renewables such as solar panels and wind turbines, which supply power at a different cost than electricity drawn from the public grid. Here too, AI-based algorithms can optimise load balancing to maximise the use of renewable energy.
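One way such renewable-aware balancing can work is to shift deferrable batch jobs into the hours with the highest forecast share of green power. The following sketch shows a deliberately simple greedy version of that idea with made-up forecast values; real schedulers also have to respect deadlines, capacity limits and data locality.

```python
def schedule_jobs(renewable_share_by_hour, jobs, capacity_per_hour=2):
    """Greedily assign deferrable jobs to the greenest hours first."""
    hours = sorted(renewable_share_by_hour,
                   key=renewable_share_by_hour.get, reverse=True)
    return {job: hours[i // capacity_per_hour] for i, job in enumerate(jobs)}

# Hypothetical forecast share of renewable power per hour of the day.
renewable_share = {11: 0.72, 12: 0.80, 13: 0.78, 14: 0.65, 22: 0.30}
jobs = ["nightly-backup", "ml-training", "report-batch"]
print(schedule_jobs(renewable_share, jobs))
# -> backup and training at 12:00, report batch at 13:00
```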
One factor in overall power consumption (and not just the PUE value) that should not be underestimated is the efficiency of the servers themselves. Today's high-performance server chips can accommodate up to 60 billion transistors per chip. This allows them to outperform the industry standard by up to 20 percent while increasing energy efficiency by up to 50 percent.
Migration to the cloud reduces environmental impact
Thanks to intelligent algorithms, cloud operating systems are also achieving unprecedented levels of efficiency. Today, thousands of servers around the world can be integrated into a seamless supercomputer, delivering real-time peak processing capacity of 3.63 TB per second. This improves server resource utilisation by 10 to as much as 40 percent and brings significant cost reductions.
Simply migrating one's own IT infrastructure to the cloud is therefore a first and important step towards sustainable decarbonisation: smaller on-premise data centres often have a poor PUE of 1.5 or even 2. Modern cloud data centres, as described above, are significantly more energy-efficient and achieve PUE values of 1.2 or 1.1.
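The impact of that PUE gap is easy to estimate. The sketch below compares an assumed on-premise PUE of 1.8 (a mid-range value between the 1.5 and 2.0 mentioned above) with a cloud PUE of 1.1 for the same hypothetical IT load.

```python
# Back-of-the-envelope estimate of the energy saved by moving an unchanged
# IT load from an on-premise facility to a more efficient cloud data centre.
def facility_energy(it_kwh, pue):
    return it_kwh * pue

it_load_kwh = 1_000_000                          # hypothetical annual IT energy demand
on_prem = facility_energy(it_load_kwh, 1.8)      # assumed on-premise PUE
cloud = facility_energy(it_load_kwh, 1.1)        # cloud PUE quoted above

print(f"On-premise: {on_prem:,.0f} kWh, cloud: {cloud:,.0f} kWh")
print(f"Savings: {1 - cloud / on_prem:.1%}")     # roughly 39% less total energy
```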
Cloud-native infrastructures also facilitate automatic load balancing and optimisation and can automatically move AI applications to systems with the appropriate hardware. In other words, migration to the cloud not only leads to scalable, flexible infrastructure through which cost savings can be achieved; it also reduces the CO₂ footprint, because it specifically optimises energy efficiency and prioritises sustainable energy sources.
Conclusion
Professional data centres use a wide range of options to reduce power consumption, put the heat they generate to other uses and optimally integrate sustainable energy sources.
Liquid-cooled racks and servers, together with power-efficient server processors, deliver a significant leap in energy efficiency. Simply migrating a data centre to the cloud also plays a considerable role in reducing the CO₂ footprint and brings everyone a big step closer to decarbonisation.