Robots in the data center: how can artificial intelligence be useful?

As the economy goes through digital transformation, more and more data centers have to be built. The data centers themselves must also be transformed: fault tolerance and energy efficiency now matter more than ever. These facilities consume huge amounts of electricity, and failures of the critical IT infrastructure they host are costly for businesses. Artificial intelligence and machine learning are coming to the aid of engineers: in recent years these technologies have been used more and more to build advanced data centers. This approach increases facility availability, cuts the number of failures, and lowers operating costs.

How does it work?

Artificial intelligence and machine learning are used to automate operational decisions based on data collected from numerous sensors. As a rule, such tools are integrated with DCIM (Data Center Infrastructure Management) systems; they predict emergencies and optimize the operation of IT equipment, the engineering infrastructure, and even the maintenance staff. Manufacturers often offer data center owners cloud services that accumulate and process data from many customers. Because such services generalize the operating experience of many different data centers, they tend to work better than purely local products. A minimal sketch of this kind of logic is shown below.
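To make the idea concrete, here is a minimal sketch (in Python) of the kind of rule such a tool might apply to sensor telemetry pulled from a DCIM system: keep a short history per sensor and flag readings that deviate sharply from the recent baseline. The class names, window size, and threshold are illustrative assumptions, not any vendor's actual API.

```python
from collections import deque
from dataclasses import dataclass
from statistics import mean, stdev

@dataclass
class Reading:
    sensor_id: str
    value: float                      # e.g. inlet temperature in degrees Celsius

class AnomalyDetector:
    """Keep a sliding window per sensor and flag readings that deviate
    strongly from the recent baseline (a simple z-score rule)."""

    def __init__(self, window: int = 100, threshold: float = 3.0):
        self.window = window
        self.threshold = threshold
        self.history: dict[str, deque] = {}

    def update(self, reading: Reading) -> bool:
        """Add a reading and return True if it looks anomalous."""
        h = self.history.setdefault(reading.sensor_id, deque(maxlen=self.window))
        is_anomaly = False
        if len(h) >= 10:                              # wait for a baseline first
            mu, sigma = mean(h), stdev(h)
            if sigma > 0 and abs(reading.value - mu) / sigma > self.threshold:
                is_anomaly = True                     # candidate incident: alert operators
        h.append(reading.value)
        return is_anomaly

# Example: feed a reading and raise an alert if it stands out.
detector = AnomalyDetector()
if detector.update(Reading("rack42-inlet-temp", 38.5)):
    print("Possible cooling problem near rack 42")
```

Real DCIM-integrated products use far richer models, but the pattern is the same: continuous ingestion of sensor data, a learned baseline, and alerts raised before a deviation turns into an outage.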

IT infrastructure management

HPE offers the InfoSight predictive analytics cloud service for managing IT infrastructure built on Nimble Storage and HPE 3PAR StoreServ storage systems, HPE ProLiant DL/ML/BL servers, HPE Apollo rack systems, and the HPE Synergy platform. InfoSight analyzes the readings of sensors installed in the equipment, processing more than a million events per second and continuously learning. The service not only detects malfunctions but also predicts possible problems with the IT infrastructure (hardware failures, depletion of storage capacity, degraded virtual machine performance, and so on) before they occur. For predictive analytics, VoltDB software is deployed in the cloud, using autoregressive forecasting models and probabilistic methods. A similar solution is available for hybrid storage systems from Tegile Systems: the IntelliCare Cloud Analytics service monitors the health, performance, and resource usage of devices. Dell EMC likewise uses artificial intelligence and machine learning in its high-performance computing solutions. There are many similar examples, and almost all leading manufacturers of computing and storage equipment are now following this path.
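The paragraph above mentions autoregressive forecasting. The actual InfoSight models are not public, but the toy sketch below shows the general idea applied to capacity planning: fit a simple AR(1) model to daily used-capacity figures and roll it forward to estimate when an array is likely to run out of space. All figures and function names are hypothetical.

```python
import numpy as np

def fit_ar1(series: np.ndarray) -> tuple[float, float]:
    """Least-squares fit of the AR(1) model x[t] = c + phi * x[t-1]."""
    x_prev, x_next = series[:-1], series[1:]
    phi, c = np.polyfit(x_prev, x_next, 1)            # slope, intercept
    return c, phi

def days_until_full(used_tb: np.ndarray, capacity_tb: float, horizon: int = 365) -> int | None:
    """Roll the fitted model forward and return the first day the forecast
    exceeds the installed capacity, or None if it stays below it."""
    c, phi = fit_ar1(used_tb)
    level = used_tb[-1]
    for day in range(1, horizon + 1):
        level = c + phi * level
        if level >= capacity_tb:
            return day
    return None

# Synthetic history: 60 days of steady growth from 50 TB at 0.4 TB/day.
history = np.array([50.0 + 0.4 * day for day in range(60)])
print(days_until_full(history, capacity_tb=120.0))    # roughly 116 days
```

A production service would combine many such models with probabilistic estimates of failure risk, but even this simple extrapolation illustrates how "storage will be full in about N days" warnings can be generated long before the problem becomes visible to administrators.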

Power supply and cooling

Another area where AI is applied in data centers is the management of the engineering infrastructure, above all cooling, whose share of a facility's total energy consumption can exceed 30%. Google was one of the first to think about smart cooling: in 2016, together with DeepMind, it developed an artificial intelligence system for monitoring individual components of the data center, which cut the energy spent on air conditioning by 40%. Initially the system only gave hints to the staff, but it was subsequently improved and can now control the cooling of the machine rooms on its own. A neural network deployed in the cloud processes data from thousands of indoor and outdoor sensors and makes decisions based on server load, temperature, outside wind speed, and many other parameters. The instructions produced by the cloud system are sent to the data center, where local systems check them once more for safety, and the staff can always switch off the automatic mode and manage the cooling manually (see the sketch below). Nlyte Software collaborated with the IBM Watson team to create a solution that collects data on temperature, humidity, power consumption, and IT equipment utilization. It helps optimize the engineering subsystems and does not require a connection to the manufacturer's cloud infrastructure: if necessary, the solution can be deployed directly in the data center.
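The "cloud recommends, local systems verify" pattern described above can be illustrated with a short sketch. The limits and guard logic here are assumptions made up for illustration, not Google's or DeepMind's actual implementation: a recommendation from the cloud model is clamped to locally configured safe bounds, and ignored entirely when operators switch to manual control.

```python
from dataclasses import dataclass

@dataclass
class CoolingCommand:
    chilled_water_setpoint_c: float   # recommended by the cloud model
    fan_speed_pct: float

SAFE_SETPOINT_RANGE = (7.0, 18.0)     # degrees Celsius, assumed plant limits
SAFE_FAN_RANGE = (20.0, 100.0)        # percent

def clamp(value: float, low: float, high: float) -> float:
    return max(low, min(value, high))

def apply_command(cmd: CoolingCommand, manual_mode: bool) -> CoolingCommand | None:
    """Local safety layer: clamp the cloud recommendation to safe bounds,
    or discard it entirely while operators run the plant manually."""
    if manual_mode:
        return None                                   # staff control cooling by hand
    return CoolingCommand(
        chilled_water_setpoint_c=clamp(cmd.chilled_water_setpoint_c, *SAFE_SETPOINT_RANGE),
        fan_speed_pct=clamp(cmd.fan_speed_pct, *SAFE_FAN_RANGE),
    )

# Example: an overly aggressive recommendation is clamped before it reaches the plant.
print(apply_command(CoolingCommand(4.0, 110.0), manual_mode=False))
# CoolingCommand(chilled_water_setpoint_c=7.0, fan_speed_pct=100.0)
```

Keeping the final safety check on site is what allows the optimization model to live in the cloud without giving it unchecked authority over physical equipment.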

Other examples

There are many innovative smart solutions for data centers on the market, and new ones appear constantly. Wave2Wave has created a robotic fiber-optic switching system for automated cross-connection in the meet-me rooms inside data centers. A system developed by ROOT Data Center and LitBit uses AI to monitor standby generator sets, while Romonet offers a self-learning software solution for infrastructure optimization. Vigilent's products use machine learning to predict failures and optimize data center temperatures. The adoption of artificial intelligence, machine learning, and other automation technologies in data centers began relatively recently, but it is already one of the most promising directions for the industry: today's data centers have simply become too large and complex to manage efficiently by hand.

Source: habr.com