Here's how DeepMind-powered AI helped cut down Google's electricity bill

Google, along with DeepMind, has managed to cut the power used to cool its data centres by 40 percent.


Ever since the pace of smartphone hardware innovation slowed, there haven't been many wow moments at technology events. Today, companies are diving deep into evolving technologies such as bots, VR, AR and AI. Bots are relatively new and AR has proved its might with Pokemon Go, but what caught our attention is how Google, along with DeepMind, has managed to cut its data centre cooling power usage by 40 percent, showing us that AI is capable of more than just playing games.

Google has turned to its AI research firm DeepMind and applied machine learning at the data centres that power Google's services to improve their efficiency. Basically, the system is being taught to respond to demand and reduce the amount of electricity it needs whenever it is possible to do so. Mustafa Suleyman, co-founder of DeepMind, told Wired, "What we've been trying to do is build a better predictive model that essentially uses less energy to power the cooling system by more accurately predicting when the incoming compute load is likely to land."
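To make that idea concrete, here is a minimal, purely illustrative sketch (not DeepMind's actual model) of predicting the next hour's compute load from the last few hours of load history, so a cooling controller could ramp up just before the demand arrives. The synthetic data, the six-hour lag window and the simple least-squares fit are all assumptions made for illustration.

```python
import numpy as np

# Synthetic hourly compute-load history (purely illustrative numbers).
rng = np.random.default_rng(0)
hours = np.arange(24 * 30)                      # 30 days of hourly samples
load = 50 + 20 * np.sin(2 * np.pi * hours / 24) + rng.normal(0, 3, hours.size)

# Build lag features: predict the next hour's load from the previous 6 hours.
LAGS = 6
X = np.column_stack([load[i:len(load) - LAGS + i] for i in range(LAGS)])
y = load[LAGS:]

# Ordinary least-squares fit (a stand-in for the far richer model DeepMind used).
coef, *_ = np.linalg.lstsq(np.column_stack([np.ones(len(X)), X]), y, rcond=None)

# Forecast the next hour from the most recent 6 hours of load.
latest = np.concatenate([[1.0], load[-LAGS:]])
predicted_next_hour_load = latest @ coef
print(f"Predicted next-hour load: {predicted_next_hour_load:.1f} (arbitrary units)")
```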

In the past, we have seen AI being put to use by many leading companies. Remember the How Old Do I Look service from Microsoft that managed to grab eyeballs? There are numerous ways in which machine learning is being used. It isn't just tech giants, either: startups are using AI for everything from matching the best resume to a job vacancy and finding the best tutor for your kids, to intelligent assortment for e-commerce sites. But Google could, well, make a new business model out of this.

Google and DeepMind could put these algorithms and methods to use by transferring them to larger AC systems and even manufacturing plants, the report adds. This will help the company save power and curb wastage. The DeepMind team is known to have collected five years of data from the data centres to build its prediction model, which helped it understand how much energy a data centre would need based on server usage. The AI managed to achieve about a 40 percent reduction in the power used for cooling.

"We accomplished this by taking the historical data that had already been collected by thousands of sensors within the data center – data such as temperatures, power, pump speeds, setpoints, etc. – and using it to train an ensemble of deep neural networks. Since our objective was to improve data center energy efficiency, we trained the neural networks on the average future PUE (Power Usage Effectiveness), which is defined as the ratio of the total building energy usage to the IT energy usage. We then trained two additional ensembles of deep neural networks to predict the future temperature and pressure of the data center over the next hour. The purpose of these predictions is to simulate the recommended actions from the PUE model, to ensure that we do not go beyond any operating constraints," Google explained in a blogpost.

It led to a 40 percent reduction in the amount of energy used for cooling, which equated to a 15 percent reduction in overall PUE overhead after accounting for electrical losses and other non-cooling inefficiencies.
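To see how a 40 percent cooling saving can translate into roughly a 15 percent drop in PUE overhead, here is a worked example with assumed figures (not Google's actual numbers):

```python
# Assumed, illustrative figures: 1,000 kW of IT load, 45 kW of cooling,
# and 75 kW of other overhead (electrical losses, lighting, etc.).
it_load = 1000.0
cooling = 45.0
other_overhead = 75.0

pue_before = (it_load + cooling + other_overhead) / it_load          # 1.120
cooling_after = cooling * (1 - 0.40)                                 # 40% cooling cut -> 27 kW
pue_after = (it_load + cooling_after + other_overhead) / it_load     # 1.102

# Overhead is everything above a PUE of 1.0.
overhead_reduction = (pue_before - pue_after) / (pue_before - 1.0)
print(f"PUE: {pue_before:.3f} -> {pue_after:.3f}, "
      f"overhead reduced by {overhead_reduction:.0%}")               # ~15%
```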

Now, we don't really need to emphasise the importance of data centres in the modern day. Among other things, their networks are typically built to accommodate growth, heavier workloads and newer types of traffic, and to take advantage of cloud automation. Google plans to roll out the system more widely and share details in a publication so that it can be used for the benefit of the environment. Or maybe Google could build a new business here.
