In the course of the digital transformation of the economy, humanity has to build more and more data centers, and the data centers themselves must be transformed as well: fault tolerance and energy efficiency are now more important than ever. These facilities consume huge amounts of electricity, and failures of the critical IT infrastructure they host are costly for businesses. Artificial intelligence and machine learning are coming to the aid of engineers: in recent years they have been used more and more to build advanced data centers. This approach increases facility availability, reduces the number of failures, and cuts operating costs.
How does it work?
Artificial intelligence and machine learning technologies are used to automate operational decisions based on data collected from various sensors. As a rule, such tools are integrated with DCIM (Data Center Infrastructure Management) systems and make it possible to predict emergency situations as well as optimize the operation of IT equipment, engineering infrastructure, and even maintenance personnel. Manufacturers often offer data center owners cloud services that accumulate and process data from many customers. Such systems generalize the operating experience of different data centers and therefore work better than purely local products.
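The prediction side of this can be illustrated with a minimal sketch. The example below is not from any specific DCIM product: it is a hypothetical anomaly detector that flags a sensor reading when it deviates sharply from the recent trend, the kind of signal such systems use as an early warning. The sensor values and thresholds are illustrative assumptions.

```python
from statistics import mean, stdev

def detect_anomalies(readings, window=5, threshold=3.0):
    """Flag readings that deviate strongly from the recent trend.

    A reading at index i is anomalous if it lies more than `threshold`
    standard deviations from the mean of the preceding `window` readings.
    """
    anomalies = []
    for i in range(window, len(readings)):
        recent = readings[i - window:i]
        mu, sigma = mean(recent), stdev(recent)
        if sigma > 0 and abs(readings[i] - mu) / sigma > threshold:
            anomalies.append(i)
    return anomalies

# Hypothetical inlet-temperature telemetry (deg C) from one rack sensor.
temps = [22.1, 22.0, 22.2, 22.1, 22.0, 22.1, 22.2, 29.5, 22.1, 22.0]
print(detect_anomalies(temps))  # index of the sudden spike: [7]
```

Real products replace this rolling-statistics rule with trained models and correlate many sensors at once, but the principle is the same: learn what "normal" looks like and alert on deviations before they become failures.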
IT infrastructure management
HPE, for example, promotes a cloud-based predictive analytics service for managing IT infrastructure.
Power supply and cooling
Another area where AI is applied in data centers is the management of engineering infrastructure, above all cooling, whose share of a facility's total energy consumption can exceed 30%. Google was one of the first to take up smart cooling: in 2016, together with DeepMind, it developed a machine-learning system for managing data center cooling that, according to the company, reduced the energy spent on cooling by up to 40%.
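To see why cooling is such an attractive optimization target, it helps to look at the standard efficiency metric, PUE (Power Usage Effectiveness): total facility power divided by the power delivered to IT equipment. The numbers below are illustrative assumptions, not figures from the article.

```python
def pue(total_facility_kw, it_kw):
    """Power Usage Effectiveness: total facility power / IT equipment power.

    An ideal facility has PUE = 1.0 (all power reaches IT equipment);
    everything above that is overhead, dominated by cooling.
    """
    return total_facility_kw / it_kw

# Illustrative figures: a facility drawing 1500 kW in total, of which
# 1000 kW goes to IT load and 500 kW to cooling, lighting, and
# power-distribution losses.
total_kw, it_kw, cooling_kw = 1500.0, 1000.0, 500.0
print(f"PUE = {pue(total_kw, it_kw):.2f}")            # PUE = 1.50
print(f"Cooling share = {cooling_kw / total_kw:.0%}")  # Cooling share = 33%
```

In this sketch, cutting cooling energy by 40% would remove 200 kW of overhead and bring the PUE down to 1.30, which shows why even single-digit percentage gains from smarter cooling control translate into large absolute savings.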
Other examples
There are many innovative smart solutions for data centers on the market, and new ones appear constantly. Wave2Wave has created a robotic fiber-optic switching system for automated cross-connects in traffic exchange rooms (Meet-Me Rooms) inside the data center. A system developed by ROOT Data Center and LitBit uses AI to monitor standby generator sets, Romonet has built a self-learning software solution for infrastructure optimization, and Vigilent applies machine learning to predict failures and optimize data center temperatures. The introduction of artificial intelligence, machine learning, and other automation technologies in data centers began relatively recently, but it is already one of the most promising directions for the industry: today's data centers have become too large and complex to manage efficiently by hand.
Source: habr.com