Get any group of people talking about data and you get a cross section of responses, from line folks who say, “I don't care about data, just give me hardware,” to engineers who ask “Why should I care?” to analysts who plead “Give me more.”

As we gather more intelligence from the field, we are finding rapidly expanding uses for this data. But we don't need to collect data merely in the hope that a use might spring up; as an industry, we have already identified plenty of high-value uses.

My friend Bill Menge, director of smart grid at Kansas City Power & Light (KCP&L), is heading up a US$50 million, 14,000-plus-meter smart grid demonstration project, funded by American Recovery and Reinvestment Act (ARRA) stimulus dollars and located in midtown Kansas City, Missouri, U.S. Menge invited me over to the project zone, where I saw an up-and-running electric-vehicle charging station as well as a home outfitted with a recently installed smart meter and associated Zigbee home area network, including a smart thermostat. I even got to play with the Tendril customer Internet support portal, through which one can control lighting, heating and air conditioning, and use real-time data to change energy usage patterns. With smart thermostats installed throughout its service territory, KCP&L already has the ability to shift 48 MW of load when conditions warrant.
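That 48-MW figure is simply the sum of many small thermostat setbacks. As a purely illustrative back-of-the-envelope sketch (the enrollment count and per-home shed below are invented assumptions for the arithmetic, not KCP&L figures), the aggregation works like this:

```python
# Rough illustration of how fleet-wide thermostat control adds up to
# dispatchable load. Both input figures are made-up assumptions; only the
# 48-MW total reported by KCP&L comes from the article.

thermostats = 60_000          # hypothetical number of enrolled smart thermostats
avg_kw_shed_per_home = 0.8    # hypothetical kW reduced per home per setback event

shed_mw = thermostats * avg_kw_shed_per_home / 1000  # kW -> MW
print(f"Estimated dispatchable load: {shed_mw:.0f} MW")  # -> 48 MW
```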

Probably my favorite data jockey is Glenn Pritchard, the technology lead for PECO's smart grid/smart meter project (see “PECO to Upgrade Metering Technology,” T&D World, May 2011). I first worked with Pritchard nearly 30 years ago when we were both involved in dynamic line and substation ratings. It's nice to see that dynamic line rating tools are now going live in control centers (another smart tool in our smart grid arsenal). Sometimes the wheels of progress turn slowly.

Today, Pritchard and his team are also building out their smart grid, aided by $200 million in ARRA stimulus funding. You might recall that Hurricane Irene blew into PECO's service territory in southeastern Pennsylvania in August 2011, causing more than 500,000 power outages. PECO was able to use its smart grid tools to aid in the restoration effort, and service was restored to almost all customers in just 72 hours. Access to automated meter reading/outage management system data made a huge difference, enabling PECO to reduce first-responder dispatches by 2,300 for single-customer events and by 350 for primary events. Reduced truck rolls and, particularly, the estimated two-day reduction in the storm-restoration effort resulted in savings valued at almost $10 million (see “Irene Puts Smart Technology to the Test” in this issue).
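The mechanics behind those avoided dispatches are simple in principle: before rolling a truck on a single-customer outage report, ping the customer's smart meter, and if the meter answers, power is already on (or the trouble is on the customer side) and the dispatch can be skipped. Here is a minimal sketch of that logic; the function names and meter data are hypothetical stand-ins, not PECO's actual system:

```python
# Sketch of AMI-assisted outage triage. meter_status stands in for the
# last-known state reported by a metering head-end system.

meter_status = {
    "M-1001": "energized",
    "M-1002": "de-energized",
    "M-1003": "energized",
}

def ping_meter(meter_id: str) -> bool:
    """Stand-in for an on-demand AMI read; True means the meter responded."""
    return meter_status.get(meter_id) == "energized"

def filter_dispatches(outage_reports: list[str]) -> list[str]:
    """Keep only the reports where the meter still appears de-energized."""
    return [m for m in outage_reports if not ping_meter(m)]

print(filter_dispatches(["M-1001", "M-1002", "M-1003"]))  # -> ['M-1002']
```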

Don Lamontagne, another data-junkie friend of mine and a manager with Arizona Public Service (APS), developed the Transformer Oil Analysis and Notification (TOAN) system, which monitors the condition of the oil in transformers (see “Dissolved Gas Analysis: Continuous or Annual?” T&D World, December 2010). Lamontagne is an engineer's engineer, having received the Edison Electric Institute's 2008 Edison Award for his outstanding contributions to the advancement of the power industry. So be careful. If you ask Don the wrong question, you might be bombarded with information about the application of Poisson and Weibull statistics to failure predictions.
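For a flavor of what that bombardment might cover (this is a generic textbook Weibull calculation, not Lamontagne's actual model), the Weibull survival function S(t) = exp[-(t/η)^β] gives the probability a unit survives past age t, and from it you can estimate the chance an aging transformer fails in the coming year. The parameter values below are made up for illustration:

```python
import math

# Generic Weibull wear-out arithmetic. beta > 1 means the hazard rate rises
# with age; eta is the characteristic life. Both values here are invented.

def weibull_survival(t: float, beta: float, eta: float) -> float:
    """Probability a unit survives beyond age t (years)."""
    return math.exp(-((t / eta) ** beta))

def cond_failure_prob(age: float, horizon: float, beta: float, eta: float) -> float:
    """Probability a unit of the given age fails within the next `horizon` years."""
    return 1.0 - weibull_survival(age + horizon, beta, eta) / weibull_survival(age, beta, eta)

# E.g., a hypothetical 40-year-old transformer with beta=3.5, eta=60 years:
print(f"{cond_failure_prob(40, 1, beta=3.5, eta=60):.1%}")  # chance of failing this year
```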

But Lamontagne, at his core, is quite practical.

Lamontagne pointed out that you can't always make a business case for gathering the data until you have enough data to see what's possible. But investing in data gathering before you know the full potential benefits is a hard sell to utility management. It's a catch-22, unless you have an incident, such as the 2004 transformer fires that cost APS about $28 million and from which it took two years to restore full capacity. Lamontagne was already working on the issue, but that incident provided the support he needed to develop this highly sophisticated transformer-monitoring and predictive failure analysis scheme. And the results are in: the system has already pinpointed two deteriorating transformers, which were taken out of service before catastrophic failure could claim them.
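Conceptually, continuous dissolved-gas monitoring of the TOAN sort boils down to comparing measured gas concentrations against screening levels and notifying an engineer when one is exceeded; in practice, real screening levels come from standards such as IEEE C57.104 and from utility experience. A toy version of that screening pass, with placeholder thresholds that are not APS's actual values, might look like this:

```python
# Illustrative dissolved-gas screening, not the TOAN system itself.
# All thresholds below are placeholders for the sketch.

GAS_LIMITS_PPM = {
    "hydrogen": 100,
    "methane": 120,
    "acetylene": 2,
    "ethylene": 50,
    "ethane": 65,
    "carbon_monoxide": 350,
}

def flag_gases(sample_ppm: dict[str, float]) -> list[str]:
    """Return the gases in a continuous-monitor sample that exceed limits."""
    return [gas for gas, ppm in sample_ppm.items()
            if ppm > GAS_LIMITS_PPM.get(gas, float("inf"))]

sample = {"hydrogen": 85, "acetylene": 6, "methane": 40}
alerts = flag_gases(sample)
if alerts:
    print(f"Notify engineer: elevated {', '.join(alerts)}")  # -> acetylene
```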

I asked Lamontagne about data-transfer rates and communications platforms for the smart grid, but he leaves most of the communications protocol issues to others, saying, “Just get me the data.” From his perch, Lamontagne can't see every data requirement across APS, but he understands the need to get the correct data at the right time.

The big question boils down to “What smart grid data do we need to prioritize spending on the aging delivery system?” And the answer is “We don't know yet.”

As Menge pointed out, “I can have all the data in the world and decide not to analyze it. But I can't analyze data I don't have.” Too much data is okay; not enough will be disastrous.

So, as we look to modernize our grid, we are at a juncture where we need data to make cogent investment decisions. Or, to put it more simply, we need data to improve reliability while holding down costs. To sum up in Lamontagne's words: “Just get me the data.”