Artificial intelligence has become a transformative force, changing the way we process, analyze, and use data. With the AI market expected to reach a staggering $407 billion by 2027, this technology continues to revolutionize many industries, with an annual revenue growth rate of 37.3% expected between 2023 and 2030.
By Marc Garner, Senior Vice President Secure Power Europe, Schneider Electric
The AI market has the potential to grow even further, thanks to the rise of generative AI. 97% of business owners believe ChatGPT will benefit their organizations, through uses such as streamlining communications, generating website copy, or translating information, but growing adoption will undoubtedly demand greater investment in the infrastructure that underpins AI-based solutions. So how can we meet the demands of this new AI-powered world?
Data centers are the critical infrastructure that supports the artificial intelligence ecosystem. Although AI requires large amounts of energy, AI-driven data analysis can help bring data centers closer to net zero and play a positive role in solving the sustainability challenge. Here we explore the four key AI attributes and trends that underpin data center physical infrastructure challenges: power, racking, cooling, and software management.
How to deal with the rise in power-hungry AI applications
Power, cooling, racks and physical infrastructure are essential to the success of a data center. Storing and processing data to train machine learning (ML) models and large language models (LLMs) leads to a steady increase in energy consumption. For example, researchers estimate that the creation of GPT-3 consumed 1,287 megawatt hours of electricity and generated 552 tonnes of CO2 — the equivalent of 123 gasoline-powered passenger vehicles driven for a year. Additionally, data centers are adopting high-density racks capable of accommodating a greater number of servers in a smaller space, further increasing power requirements.
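To put those quoted estimates in perspective, a quick back-of-envelope calculation shows what they imply. The megawatt-hour and CO2 figures below come directly from the estimate cited above; the derived per-kWh and per-car values are illustrative arithmetic, not additional published data.

```python
# Illustrative arithmetic only: energy and emissions figures are the
# researchers' estimates quoted in the text; the rest is derived.
energy_mwh = 1_287   # estimated electricity to train GPT-3, in MWh
co2_tonnes = 552     # estimated emissions, in tonnes of CO2

# Implied carbon intensity of the electricity used for training
grams_per_kwh = (co2_tonnes * 1_000_000) / (energy_mwh * 1_000)

# Annual emissions per passenger vehicle implied by the 123-car comparison
tonnes_per_car = co2_tonnes / 123

print(f"~{grams_per_kwh:.0f} g CO2 per kWh")        # roughly 429 g/kWh
print(f"~{tonnes_per_car:.1f} t CO2 per car/year")  # roughly 4.5 t
```

The implied ~4.5 tonnes of CO2 per vehicle per year is consistent with commonly used passenger-car emissions averages, which suggests the comparison in the text is internally coherent.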
So how can we meet these growing demands for AI power, while minimizing its impact on the planet? Data centers are continually evolving to meet the growing power demands of AI clusters. Improving power distribution systems and energy efficiency within data centers helps minimize losses and ensures that energy is delivered to servers in the most efficient manner possible. When operators design and manage data centers, they must focus on energy-efficient hardware and software, while diversifying energy sources to provide the secure and abundant energy AI needs to thrive.
Additions such as advanced power distribution units (PDUs), intelligent management and high-efficiency power systems, as well as renewable energy sources, enable data centers to reduce both energy costs and carbon emissions. However, the extreme power densities of AI training servers can create additional issues beyond power consumption: cooling, for example, can also create complex challenges for operators.
The transition from air cooling to liquid cooling is essential to increase sustainability
Today, designing sustainable and resilient data centers relies on efficient cooling. The demands that AI places on data centers mean that powering high-density servers requires new cooling methodologies for optimal performance and minimized downtime.
Although air cooling is common in the industry and will remain in use for years to come, the transition from air cooling to liquid cooling will become the preferred and necessary solution for data centers to manage AI clusters efficiently. This is because traditional air cooling systems become less efficient in high-density configurations.
Here, Direct-to-Chip liquid cooling, where a cooling fluid circulates through servers to absorb and dissipate heat, is quickly gaining popularity because it is more efficient at handling the concentrated heat generated by AI clusters.
Compared to air cooling, liquid cooling offers many advantages to data centers. Whether it’s improved processor reliability and performance, space savings through higher rack densities, or greater thermal inertia with water in the pipes, liquid cooling increases energy efficiency, improves energy utilization and reduces water consumption.
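The physics behind liquid cooling's advantage can be sketched with textbook values: water stores far more heat per unit volume than air, so far less coolant volume needs to move to carry away the same heat. The material properties below are standard room-temperature approximations, not figures from the article.

```python
# Back-of-envelope physics with typical room-temperature values,
# included for illustration; not vendor or facility data.
water_density = 1000.0  # kg/m^3
water_cp = 4186.0       # J/(kg*K), specific heat of water
air_density = 1.2       # kg/m^3
air_cp = 1005.0         # J/(kg*K), specific heat of air

# Volumetric heat capacity: how much heat 1 m^3 absorbs per degree of rise
water_vol_cp = water_density * water_cp
air_vol_cp = air_density * air_cp

ratio = water_vol_cp / air_vol_cp
print(f"Water absorbs ~{ratio:,.0f}x more heat per unit volume per degree")
```

The roughly 3,000-fold difference is why concentrated heat from dense AI racks is far easier to capture with a circulating liquid than with moving air.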
Turning technology on itself
Another way data center managers can cope with the growing demands of AI is to use the technology to their advantage. Data centers can benefit from AI-driven automation, data analytics, and machine learning to find opportunities for efficiency gains and decarbonization. By using insights from data more effectively, we can adopt new, more sustainable behaviors.
This process relies on physical infrastructure and software tools that support data center design and operation, including data center infrastructure management (DCIM), electrical power management systems (EPMS), building management systems (BMS), and digital twins. These applications reduce the risk of unexpected behavior in complex power networks and provide a digital replica of the data center that identifies constrained power and cooling resources to inform layout decisions.
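The kind of check such tools perform can be sketched in a few lines. Everything below — rack names, power limits, loads, and the headroom threshold — is invented for illustration; real DCIM and digital twin platforms model far more (thermal maps, redundancy, airflow), but the core idea of flagging constrained resources looks like this:

```python
# Hypothetical sketch of a digital-twin capacity check.
# All rack data and thresholds here are made-up illustration values.
racks = [
    {"name": "A01", "power_kw": 38.0, "power_limit_kw": 40.0},
    {"name": "A02", "power_kw": 44.0, "power_limit_kw": 40.0},
    {"name": "B01", "power_kw": 12.0, "power_limit_kw": 40.0},
]

def flag_constrained(racks, headroom=0.9):
    """Return racks drawing more than a given fraction of their power limit."""
    return [r["name"] for r in racks
            if r["power_kw"] > headroom * r["power_limit_kw"]]

# A01 and A02 exceed 90% of their limit and would be flagged
# before new high-density AI servers are placed there.
print(flag_constrained(racks))
```

Surfacing these constraints before deployment, rather than after an overload, is what lets operators place high-density AI clusters without stranding power or cooling capacity.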
For example, Equinix improved data center energy efficiency by 9% using AI-based cooling, which regulated its cooling systems more precisely and reduced their energy consumption.
Get more computing power with the same physical footprint
What is clear is that AI applications are increasing energy consumption in data centers at a time when they need to become more sustainable. Yet AI also provides the intelligence needed to design and operate data centers in smarter, more energy-efficient ways, and if deployed correctly, can help the planet achieve carbon neutrality.
By combining key attributes of physical data center infrastructure with the efficiencies of AI, owners, operators and end users can more effectively manage the power demands of high-density AI clusters while maintaining efficiency, reliability and sustainability.