Power management is the latest aspect of data center operations to be scrutinized as enterprises and data center operators grapple with how to elevate overall efficiency while lowering costs. As a result, machine learning (ML) and predictive analytics are gaining momentum as the keys to delivering the necessary insights to help optimize just about every facet of the data center, including the power infrastructure.
In a recent Data Center Knowledge article, Jennifer Cooke, research director of IDC's Cloud to Edge Datacenter Trends service, asserted that data-driven decisions, and the ability to leverage all that data to improve outcomes, are the only sustainable way to meet the need for IT services at scale. Adding a layer of intelligence is the only truly effective way to anticipate and alleviate the impact of sudden data center changes, outages or surges in resource demand.
But how do you effectively predict something as unpredictable as power? Is it feasible to make power management smart enough to anticipate and rectify spikes in seconds? Google certainly thinks so: the company relies heavily on artificial intelligence (AI) to cut power usage. Working with its DeepMind subsidiary, Google controls cooling and energy utilization in data centers that support critical services, including Google Search and YouTube.
Every five minutes, a snapshot of the data center cooling system is collected from thousands of sensors and fed into deep neural networks that predict how different choices will affect future energy consumption. In practice, however, data center operators need to know what will happen in the next five seconds, the next two weeks and over the next year in order to optimize power management and improve overall energy utilization.
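The idea of mapping sensor snapshots to future energy draw can be illustrated with a toy sketch. This is not Google's or DeepMind's actual pipeline; it simply fits a linear model (standing in for a neural network) that maps each snapshot's sensor readings to the energy consumed in the following interval, then predicts from the latest snapshot. The function name and data layout are illustrative assumptions.

```python
import numpy as np

def predict_energy(snapshots, consumption):
    """Predict next-interval energy draw from the latest sensor snapshot.

    snapshots:   (n, k) array, one row of k sensor readings per interval
    consumption: (n,) array, energy consumed during each interval

    Fits a linear least-squares model mapping snapshot t to the energy
    consumed in interval t+1, then applies it to the newest snapshot.
    """
    # Features: all snapshots except the last, plus a bias column.
    X = np.hstack([snapshots[:-1], np.ones((len(snapshots) - 1, 1))])
    # Targets: energy consumed in the interval after each snapshot.
    y = consumption[1:]
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    latest = np.append(snapshots[-1], 1.0)  # append bias term
    return float(latest @ coef)
```

A production system would use a far richer model and thousands of inputs, but the shape of the problem, lagged sensor features in and a consumption forecast out, is the same.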
Both ML and predictive analytics are core to proactive, Software Defined Power (SDP) solutions, which collect data about connected power systems and share real-time views of power usage patterns across individual racks and rows as well as entire data centers. SDP-enabled power components can react quickly and effectively to reduce the impact of shifts in power-usage patterns. Added intelligence and analytics make it possible to proactively identify which workloads should be re-routed automatically based on application performance needs along with power capacity and availability.
Each prediction examines workload patterns and behaviors at the application level, including what happens at certain times of day or even times of year. Consider a retail application that experiences an inordinate spike every Black Friday. A smart SDP solution anticipates that rapid increase by applying ML algorithms to historical patterns, automatically adjusting other workloads and rerouting unused power capacity to where it is needed most.
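The Black Friday example can be sketched in two steps: scale a workload's baseline draw by the uplift seen on comparable past event days, then shift unused headroom toward the workload forecast to spike. This is a minimal illustration, not a vendor's SDP algorithm; the function names and the proportional-scaling fallback are assumptions.

```python
def seasonal_forecast(baseline_kw, event_history, normal_history):
    """Scale a workload's baseline power draw (kW) by the average
    uplift observed on past event days (e.g. prior Black Fridays)
    relative to ordinary days."""
    uplift = (sum(event_history) / len(event_history)) / \
             (sum(normal_history) / len(normal_history))
    return baseline_kw * uplift

def reroute_capacity(capacity_kw, forecasts_kw):
    """Allocate a power budget across workloads by forecast draw.

    Each workload gets its forecast; any unused headroom is shifted to
    the workload expected to spike most. If forecasts exceed capacity,
    everyone is scaled down proportionally (a simplistic policy chosen
    for illustration)."""
    total = sum(forecasts_kw.values())
    if total > capacity_kw:
        scale = capacity_kw / total
        return {w: kw * scale for w, kw in forecasts_kw.items()}
    alloc = dict(forecasts_kw)
    hottest = max(forecasts_kw, key=forecasts_kw.get)
    alloc[hottest] += capacity_kw - total  # surplus to the biggest spike
    return alloc
```

For instance, a retail app that drew 120 kW and 130 kW on the last two Black Fridays against a 100 kW norm gets a 1.25x uplift applied to today's baseline before capacity is divided up.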
Recommendations to move workloads aren't confined to a single data center. They can span multiple locations based on the time of day, energy pricing and the availability of alternative power sources, such as solar or wind energy. A related SDP capability, called "peak shaving," controls a mix of utility power and local battery storage to create dynamic capacity that can be moved around a data center on demand to meet application needs.
ML and predictive analytics inform a series of what-if scenarios and simulations that are essential to planning data center consolidations, migrations and buildouts. For data center providers, the ability to assess present consumption and forecast future needs will prove pivotal in determining how many servers and what kind of capacity will be needed to address future growth rates.
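One of the simplest what-if inputs to that planning is a compound-growth projection of power demand and the rack count needed to serve it. The sketch below is a deliberately basic illustration; the growth-rate model and per-rack power figure are assumptions a real planning tool would refine.

```python
import math

def capacity_plan(current_kw, annual_growth, years, kw_per_rack):
    """Project facility power demand under a compound annual growth
    rate and report the racks needed each year.

    Returns a list of (year, projected_kw, racks_needed) tuples.
    """
    plan = []
    for year in range(1, years + 1):
        demand = current_kw * (1 + annual_growth) ** year
        racks = math.ceil(demand / kw_per_rack)  # round up: partial racks
        plan.append((year, round(demand, 1), racks))
    return plan
```

A facility drawing 500 kW today and growing 20% a year, for example, needs capacity planned for 600 kW next year and 720 kW the year after, with rack counts rounded up accordingly.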
Software Defined Power, fueled by ML and predictive analytics, is the most efficient and effective way to get answers to tough questions about power and optimize energy utilization. Put simply, SDP puts the smarts in power management.