Since acquiring DeepMind Technologies back in 2014, Google has been putting AI to work on all kinds of tasks. They've recently partnered with Moorfields Eye Hospital to help scan for early signs of eye disease, and they've even taught the AI to play Go, and win. Machine learning has come a long way in a short period of time. Most recently, Google has used its DeepMind AI to hunt for inefficiencies and ways to improve the operation of their data centers. It's done quite well, resulting in a 40% reduction in the energy used for cooling.
If you’ve ever worked with a laptop sitting on your legs, you’ll know that after a while that laptop will start to get warm. Now imagine that heat multiplied many times over, and you’ll start to get an idea of what Google’s data centers generate. Google has already been pretty proactive in using innovative ways to cool their data centers and make them more efficient. Further efficiency will not only save the company a significant amount of money, but also allow them to use less energy overall, which will benefit the environment.
To get started, the team had to “teach” DeepMind what it should be paying attention to:
We accomplished this by taking the historical data that had already been collected by thousands of sensors within the data centre — data such as temperatures, power, pump speeds, setpoints, etc. — and using it to train an ensemble of deep neural networks. Since our objective was to improve data centre energy efficiency, we trained the neural networks on the average future PUE (Power Usage Effectiveness), which is defined as the ratio of the total building energy usage to the IT energy usage. We then trained two additional ensembles of deep neural networks to predict the future temperature and pressure of the data centre over the next hour. The purpose of these predictions is to simulate the recommended actions from the PUE model, to ensure that we do not go beyond any operating constraints.
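The PUE metric in that quote is simple to compute. Here's a minimal sketch; the function name and the sample figures are illustrative, not Google's actual data:

```python
def pue(total_building_kwh: float, it_kwh: float) -> float:
    """Power Usage Effectiveness: total facility energy / IT energy.

    An ideal data center would score 1.0, meaning every watt drawn by
    the building goes to IT equipment; anything above 1.0 is overhead
    (cooling, electrical losses, lighting, etc.).
    """
    if it_kwh <= 0:
        raise ValueError("IT energy must be positive")
    return total_building_kwh / it_kwh

# Illustrative numbers: 1200 kWh drawn by the building, 1000 kWh of
# which powered the IT load -> PUE of 1.2 (200 kWh of overhead).
print(pue(1200.0, 1000.0))  # 1.2
```

Note that lower is better, which is why DeepMind's neural networks were trained to predict (and then minimize) average future PUE rather than maximize it.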
After a learning period, the neural networks were applied to a live data center for testing. The results, as you can see in the graph below, are quite impressive.
Google explained the results:
Our machine learning system was able to consistently achieve a 40 percent reduction in the amount of energy used for cooling, which equates to a 15 percent reduction in overall PUE overhead after accounting for electrical losses and other non-cooling inefficiencies. It also produced the lowest PUE the site had ever seen.
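To see how a 40 percent cut in cooling energy can translate into roughly a 15 percent cut in PUE overhead (the energy a facility draws beyond its IT load), here's a back-of-envelope sketch. The baseline breakdown below is an assumption chosen for illustration, not a figure from the post:

```python
# Assumed baseline, normalized to 1.0 unit of IT energy:
it_energy = 1.0
cooling_overhead = 0.045   # assumed: cooling's share of the overhead
other_overhead = 0.075     # assumed: electrical losses, lighting, etc.

baseline_pue = it_energy + cooling_overhead + other_overhead   # 1.12

# Apply the reported 40 percent reduction in cooling energy:
cooling_after = cooling_overhead * (1 - 0.40)
new_pue = it_energy + cooling_after + other_overhead

# Reduction as a fraction of the overhead (everything above 1.0):
overhead_cut = (baseline_pue - new_pue) / (baseline_pue - it_energy)
print(f"{overhead_cut:.0%}")  # 15%
```

The takeaway: because cooling is only one slice of the non-IT overhead, a large cut in cooling energy shows up as a smaller, but still substantial, improvement in the facility-wide metric.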
They also plan to apply DeepMind's machine learning to other aspects of their data centers, as well as to other facilities, in the future. Possible applications include improving power plant conversion efficiency and helping manufacturing facilities increase throughput. Google also plans on sharing their findings, allowing other data center operators to reduce their costs as well as their impact on the environment.
What would you like to see DeepMind AI machine learning work on next? Tell us what you think in the comment section below, or on Google+, Facebook, or Twitter.

Source: DeepMind Blog