Nuanced Anomalies: How Utility Companies Manage Risk with Data
In various parts of the United States, you’ll hear a familiar expression said a number of different ways: “If you don’t like our weather, just wait five minutes.” From sudden spring snowstorms to 50-degree temperature swings, utility companies across the country have been troubleshooting unpredictability since their inception. Modern times have brought more challenges – new consumer behavior patterns, increased regulations, and sophisticated cyberattacks – confirming that the only constant for the utilities sector is change.
Change can also be a good thing, though, and recent (long overdue) innovation has transformed and strengthened a historically risk-averse industry. Artificial intelligence (AI) and other digital tools have helped the sector elevate the capabilities of its existing workforce, better serve customers, and automate certain core functions like anomaly detection and infrastructure condition assessment. Anomalies in transmission and distribution lines play a major role in the reliability of power systems, and early detection is critical for minimizing stress, avoiding outages, and preventing surges that can lead to forest fires. However, predicting where, when, and how those anomalies will occur remains an imperfect process.
What You Don’t Know Can Hurt You
Other than keeping the power on, maintaining assets is one of the primary concerns for utility companies. Traditionally, field service technicians and engineers would share an outsized responsibility for these assets, regularly monitoring systems for potential anomalies and degradations. Recent digital transformation efforts have eased this burden considerably with remote asset inspection, smart grids, and smart meters able to analyze streams of data to help get ahead of possible (and probable) issues. A big challenge of late has been that word “possible” – because these days, anything seems possible.
Let’s break down what’s going on.
Utility companies are responsible for maintaining their transmission and distribution (T&D) system, the infrastructure that transports electricity from power plants to residential and commercial customers. This infrastructure sits outdoors and is thus subject to external factors like shifts in terrain, severe weather, and natural disasters. Regional utility leaders are keenly aware of the vulnerabilities of their own infrastructure, but the kinds of damage utilities face are universal: they are not mutually exclusive, and they are not confined to a single geography. In other words, when a coastal utility company trains its AI only on problems common in its region, like rust, the model fails to detect anomalies that are regular occurrences elsewhere, like high-wind damage in Indiana. Though infrequent and nuanced, these anomalies can cause detrimental breakdowns in the infrastructure.
Preparing for Nuanced Anomalies
Utility companies are spending millions on remote asset inspection tools and AI that can monitor, assess, and respond to the many unique variables a technician or engineer would encounter on-site. However, AI is only as good as the data it learns from, and it works best when it is constantly learning from multiple data sets. It’s very much nurture over nature: feed the AI a single data set and it can only scenario-plan within those limited inputs. Data in an echo chamber is of little value in mitigating disasters, and disasters are getting worse.
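To make that concrete, here is a minimal sketch of the “multiple data sets” idea, with every name in it (file names, feature columns, the choice of detector) an illustrative assumption rather than any utility’s actual pipeline: inspection records from several regions are pooled before fitting an off-the-shelf anomaly detector, so failure modes that are rare in one geography are still represented in training.

```python
# Hypothetical sketch: pool inspection data from multiple regions before
# training an anomaly detector, rather than fitting on a single region's data.
import pandas as pd
from sklearn.ensemble import IsolationForest

# Assumed: each file holds the same feature columns, collected by a different
# regional utility with different dominant failure modes.
REGIONAL_SOURCES = [
    "coastal_inspections.csv",   # anomalies dominated by corrosion and rust
    "plains_inspections.csv",    # anomalies dominated by high-wind damage
    "mountain_inspections.csv",  # anomalies dominated by ice loading
]
FEATURES = ["line_tension_kn", "conductor_temp_c", "wind_speed_mps"]

# Pool all regions into one training set so the model sees patterns that are
# rare locally but routine elsewhere.
training_data = pd.concat(
    [pd.read_csv(path) for path in REGIONAL_SOURCES], ignore_index=True
)

# Fit an unsupervised detector on the pooled data; `contamination` is the
# assumed fraction of anomalous records and would need tuning in practice.
detector = IsolationForest(contamination=0.01, random_state=42)
detector.fit(training_data[FEATURES])

# Score new sensor readings: -1 flags a suspected anomaly, 1 means normal.
new_readings = pd.read_csv("todays_readings.csv")
new_readings["flag"] = detector.predict(new_readings[FEATURES])
```

The particular model matters far less than the shape of the training set: a detector fit only on the coastal file would score high-wind signatures the way it scores everything it has never seen, which is to say unreliably.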
Take climate change: according to McKinsey, utilities are “more vulnerable to extreme weather events” than in the past. If the likelihood of a hurricane is extremely low at a Kansas power plant, for instance, its AI may never have been fed the data it would need to learn how to predict or respond to that type of event, which could prove disastrous as hurricanes push farther inland. And it’s not just the frequency and severity of these weather events that are increasing, but also the financial impact.
“A typical utility saw $1.4 billion in storm-damage costs and lost revenues due to outages caused by storms over a 20-year period,” based on analysis by McKinsey. “By 2050, the cost of damages and lost revenues would rise by 23 percent ($300 million), or approximately two to three additional years with major hurricane damage. Conservatively, estimates total $1.7 billion in economic damage for each utility by 2050.”
(For the record, Kansas did indeed suffer the impact of a freak “inland hurricane” in 1990 – resulting in more than $80 million in damages. So, if you think it won’t happen, just wait.)
3 Considerations When Building (or Investing In) an AI Solution
The utility industry’s major players have been experimenting with building their own AI-based anomaly detection for some time, but constraints on budget, skilled labor, and available data have hampered progress. For utility leaders deciding whether to build their own AI or invest in a third-party solution, here are three questions to ask:
1. What is your highest digital transformation priority?
Digital transformations are difficult to execute. According to BCG, despite 80% of companies planning to accelerate their digital transformations, only 30% of transformations succeed in achieving their objectives. “Delivering such fundamental change at scale in large, complex organizations is challenging, especially with short-term pressures,” the analysis found.
Utility leaders should triage their digital needs like any other business decision – do you focus on short-term priorities, or invest in long-term futureproofing? There are cases to be made for each, and management teams need to have a candid conversation about expectations before the process begins, not during.
2. Where are you getting your data?
Companies are obsessed with gathering data, but many don’t know how to use it. Moreover, is it the right data? Is it tied to quantifiable business metrics? Has it improved workflows? AI is a tool, not a product, and it can’t solve every problem. Utility companies need to analyze where their data comes from and invest in solutions built on multiple sources so the data doesn’t become siloed. The industry also lacks data acquisition standards and the varied, critical data required to train anomaly detection algorithms, both of which limit utilities in their quest to catch the next anomaly.
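One practical way to spot siloed data is a coverage audit. Below is a rough sketch (the file and column names are hypothetical): cross-tabulating anomaly categories against the sources that reported them reveals categories known to only one source, which is exactly where a detection model’s blind spots tend to hide.

```python
# Hypothetical sketch: audit labeled inspection records for coverage gaps,
# i.e., anomaly categories that only a single data source has ever observed.
import pandas as pd

# Assumed layout: one row per labeled finding, with at least these columns:
#   source        -- which system or partner produced the record
#   anomaly_type  -- the labeled anomaly category (rust, wind damage, ...)
records = pd.read_csv("labeled_findings.csv")

# Count how many findings of each category came from each source.
coverage = pd.crosstab(records["anomaly_type"], records["source"])

# Categories reported by exactly one source are siloed: the model's entire
# picture of them depends on that source's sensors, labels, and biases.
single_source = coverage[(coverage > 0).sum(axis=1) == 1]
print("Anomaly types with single-source coverage:")
print(single_source)
```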
3. Should you bring on an outside data partner, and how quickly will you see improvement?
After addressing digital transformation priorities and sources of data, the decision on how to assign the work is a big one: keep it in-house, or outsource? Given the nascent state of AI in the utilities sector, the technology often works best with an assist from a real person. Referred to as “human-in-the-loop,” this approach involves engineers and other technicians in the decision-making, helping correct the AI and match its learning to real-world conditions. Especially for anomaly detection, the human-in-the-loop approach strengthens AI insights with the institutional knowledge and on-the-job experience of workers (i.e., multiple sources of data).
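Here is one way such a loop might look, sketched under assumed names (the confidence threshold, the Detection fields, and the review callback are placeholders, not any vendor’s API): detections the model is unsure about get routed to an engineer, and the engineer’s corrections are kept as labeled data for the next retraining pass.

```python
# Hypothetical human-in-the-loop sketch: route low-confidence detections to an
# engineer for review, and keep the corrected labels for future retraining.
from dataclasses import dataclass
from typing import Callable, Iterable, Optional

REVIEW_THRESHOLD = 0.80  # assumed cutoff; would be tuned against field results

@dataclass
class Detection:
    asset_id: str
    anomaly_type: str
    confidence: float

def triage(
    detections: Iterable[Detection],
    ask_engineer: Callable[[Detection], Optional[str]],
):
    """Split detections into accepted findings and human-review feedback.

    `ask_engineer` is any callable that shows a detection to a technician and
    returns the confirmed anomaly type, or None for a false positive.
    """
    confirmed, training_feedback = [], []
    for d in detections:
        if d.confidence >= REVIEW_THRESHOLD:
            confirmed.append(d)                   # model trusted outright
        else:
            label = ask_engineer(d)               # human judgment in the loop
            training_feedback.append((d, label))  # saved for retraining
            if label is not None:
                confirmed.append(Detection(d.asset_id, label, 1.0))
    return confirmed, training_feedback
```

The feedback list is the point: every low-confidence case an engineer corrects becomes exactly the kind of varied, field-verified training data the previous question was asking about.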
We’re All in this Together
Utility companies quite literally keep the lights on in this country, and their ongoing vitality is a national security concern. To that end, the government has recently made energy infrastructure one of its top priorities and will put increased pressure on the industry to defend these assets. So, it looks like even more change is on the way for the utilities industry. What could go wrong?