DARM is an overarching asset and risk management system that will provide real-time operational information, analysis for management decision-making and forward forecasting for decision support. DARM will provide the ability to look across all aspects of asset operation to fully assess risks and to coordinate their mitigation at the lowest cost or with the best solution.

DARM will be a real-time user of data streams. It will be both an administrative overlay and a merger of business processes, tools and methodologies. It will be both visualization technology and a forecasting tool. LIPA has not yet put all the components that make up DARM into place. However, each building block put in place and each technical improvement made will be a stepping stone toward the larger vision.

Data Integration and Information Management

LIPA’s first step with DARM was based on a one-to-many integration solution, in which enterprise data sources were mapped to and integrated with the single data model in the Electric Power Research Institute’s (EPRI’s) Maintenance Management Workstation (MMW).

MMW provided LIPA with tools for work prioritization back in the early 2000s. Subsequent pilot projects focused on testing the use of the common information model, which was emerging as the standard for the industry’s data modeling. Implementation included development and performance evaluation of infrastructure using a utility integration bus. These early projects demonstrated the viability of data integration and process automation but were often cumbersome and custom built.

Today, LIPA’s enterprise information management solution includes the common information model as a base for data standardization. LIPA’s technical information technology (IT) architecture uses an off-the-shelf enterprise service bus (ESB) as part of its service-oriented architecture to manage data integration and data exchange between critical systems. The ESB also provides a virtual database for business intelligence data mining and analysis. It is designed to support process automation and will enable future orchestration of applications in combination with complex event triggering.
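
To make the pattern concrete, the sketch below shows, in Python, the kind of work an ESB adapter performs: translating a legacy source record into a common-information-model-style message and publishing it to a bus topic. The field names, topic name and publish function are illustrative assumptions only; the article does not specify LIPA's ESB product or CIM profile.

```python
import json
import uuid
from datetime import datetime, timezone

# Hypothetical legacy record as it might arrive from a source system.
legacy_record = {
    "XFMR_ID": "TR-1042",
    "KVA": 1500,
    "INSTALL_DT": "1998-06-01",
}

def to_cim_style(record: dict) -> dict:
    """Map a legacy transformer record onto a CIM-style structure.

    Field names follow the spirit of the IEC CIM PowerTransformer class
    but are illustrative; a real integration would use the utility's
    governed CIM profile.
    """
    return {
        "mRID": str(uuid.uuid4()),        # CIM master resource identifier
        "name": record["XFMR_ID"],
        "ratedS_kVA": record["KVA"],
        "inServiceDate": record["INSTALL_DT"],
    }

def publish(topic: str, payload: dict) -> None:
    """Stand-in for an ESB client's publish call (product-specific in practice)."""
    stamp = datetime.now(timezone.utc).isoformat()
    print(f"[{stamp}] {topic}: {json.dumps(payload)}")

publish("assets.transformer.updated", to_cim_style(legacy_record))
```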

Central Data Model and Automation

One of the key elements of LIPA’s enterprise information management concept is a centrally managed data model. The key enabling solution for cost-effective data integration is an automated data model management tool, which keeps the data model current with evolving industry data standards and makes it practical to manage across critical systems and a large number of interfaces in a consistent, cost-effective way.
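
As a minimal sketch of what central management buys, the Python fragment below assumes a single shared schema definition that every interface validates against: change the schema in one place and every consumer picks it up. The schema contents and field names are hypothetical.

```python
# A centrally managed "schema" shared by every interface; updating it in
# one place propagates the change to all consumers. Fields are illustrative.
TRANSFORMER_SCHEMA = {
    "mRID": str,
    "name": str,
    "ratedS_kVA": (int, float),
    "inServiceDate": str,
}

def validate(message: dict, schema: dict) -> list:
    """Return a list of problems; an empty list means the message conforms."""
    problems = []
    for field, expected in schema.items():
        if field not in message:
            problems.append(f"missing field: {field}")
        elif not isinstance(message[field], expected):
            problems.append(f"bad type for {field}: {type(message[field]).__name__}")
    return problems

msg = {"mRID": "abc-123", "name": "TR-1042", "ratedS_kVA": 1500,
       "inServiceDate": "1998-06-01"}
assert validate(msg, TRANSFORMER_SCHEMA) == []
```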

LIPA plans to complete the transition from nonconforming to standards-based data and systems integration over the life cycle of its critical systems. All new systems, and all new integration interfaces to legacy systems, will be based on current industry standards.

It Is Here

LIPA has worked extensively within the industry to develop the thinking around these concepts. Most early adopters of reliability-centered maintenance in the 1990s discovered that maintenance optimization and maintenance work prioritization are just the tip of the opportunity iceberg for asset and system performance improvement.

The first EPRI conference on T&D asset management was held in 2000 in New York, New York, U.S., and hosted by Con Edison. Industry interest in the topic was so high that participation by utility managers and experts had to be limited. At that time, LIPA was one of a handful of U.S. utilities working to define T&D asset management through a collaborative EPRI project. In the mid-2000s, LIPA took a further step, defining its asset management approach as risk-based asset management. As LIPA began to see the need for a more complex and predictive approach, it redefined the concept as dynamic asset risk management.

LIPA has been making a step-by-step transition to standards-based IT infrastructure, which has proven to be a lower-cost and lower-risk approach. The process is centralized and automated to the maximum degree possible, ensuring consistency and efficiency. The practices LIPA uses in requests for proposals, vendor selection and product selection require industry standards compliance and interoperability. Internally, the commitment is to build and maintain T&D IT infrastructure in compliance with industry standards. Integrations also are designed to leverage the near-plug-and-play benefits of standards-based systems and applications. Together, these transition strategies lower risk, lower cost and enable the use of best-of-breed products from various vendors.

Need for New Tools

Key elements of LIPA’s DARM concept include the use of probabilistic methodologies to assess and optimize the risk to achieving operational goals. The dynamic part of DARM refers to the need to perform risk assessment and optimization in a near-real-time, continuous way.
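
A probabilistic risk assessment in this spirit can be sketched with a few lines of Monte Carlo simulation. The example below estimates expected annual risk for a single asset as probability times consequence; the failure probability and cost range are made-up illustrative numbers, not LIPA data.

```python
import random

def annual_failure_risk(failure_prob: float, outage_cost_usd: tuple,
                        n_trials: int = 100_000, seed: int = 42) -> float:
    """Monte Carlo estimate of expected annual risk (probability x consequence)
    for a single asset. Inputs are illustrative, not LIPA data."""
    rng = random.Random(seed)
    low, high = outage_cost_usd
    total = 0.0
    for _ in range(n_trials):
        if rng.random() < failure_prob:      # does the asset fail this year?
            total += rng.uniform(low, high)  # sample the consequence cost
    return total / n_trials

# Example: 2% annual failure probability, $50k-$250k consequence range.
print(f"expected annual risk: ${annual_failure_risk(0.02, (50_000, 250_000)):,.0f}")
```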

More extensive use of the planned methodologies is sometimes limited by practical computational problems, chief among them the long computing time required to perform complex studies with large numbers of calculations across combinations of possible study-parameter values. Today, even existing deterministic studies and tools require hours and, in some cases, days for relatively simple study and optimization tasks.

In those situations, it is expected that some combination of high-performance or cloud computing and more effective probabilistic calculation methodologies will need to be applied to focused aspects of the use cases of interest. LIPA, again, is taking a step-by-step approach to developing and implementing new technology. This includes the use of more powerful computers and cloud computing along with existing deterministic tools, plus improvements in methodologies for probabilistic calculations and analysis. The approach also anticipates the need to develop new, scalable tools and methodologies that improve efficiency and reduce the computing time required for complex, multi-parameter optimization use cases.
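
The scale-out part of that strategy is essentially the pattern below: study cases formed from combinations of parameter values are independent, so they can be farmed out across cores (or cluster nodes) instead of run serially. The study function here is a CPU-burning stand-in assumed for illustration; a real run would invoke a power-flow or reliability solver.

```python
from concurrent.futures import ProcessPoolExecutor
import itertools
import math

def run_study(params):
    """Stand-in for one deterministic study run (e.g., a power-flow case).
    A real study would call a solver; this toy function just burns CPU."""
    load_scale, gen_outage = params
    x = 0.0
    for i in range(1, 200_000):
        x += math.sin(load_scale * i) / (i + gen_outage)
    return (load_scale, gen_outage, x)

if __name__ == "__main__":
    # Cartesian product of study parameters: the combinatorial growth that
    # drives run times, and the part that parallelizes naturally.
    scenarios = list(itertools.product([0.8, 0.9, 1.0, 1.1], [0, 1, 2]))
    with ProcessPoolExecutor() as pool:      # scales to more cores or nodes
        results = list(pool.map(run_study, scenarios))
    print(f"completed {len(results)} scenario studies")
```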

LIPA has started working with Brookhaven National Laboratory (BNL) to leverage the availability of its supercomputer and cloud computing, and to coordinate future efforts with BNL’s probabilistic risk assessment team. The resulting capability eventually will run existing deterministic and probabilistic tools faster and more frequently, closer to real time and in probabilistic mode.

A jointly developed road map includes evaluating options to reduce the number of iterations of some models while still obtaining results that have the required accuracy. Further improvement is expected by developing methodologies and optimization algorithms to perform focused studies for limited and specific operating areas and conditions. Longer-term planned activities include the customization of tools used in other industries and the development of new and specialized tools.
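
One generic way to reduce the number of iterations while still obtaining results with the required accuracy is a sequential stopping rule: keep sampling only until the standard error of the running estimate falls below a tolerance, rather than running a fixed iteration count. The Python sketch below shows that idea; whether LIPA and BNL will use this particular technique is an assumption, and the numbers are illustrative.

```python
import math
import random

def estimate_until_converged(sample, rel_err=0.05, batch=1_000,
                             max_iters=1_000_000, seed=7):
    """Run Monte Carlo trials only until the standard error of the running
    mean falls below a relative tolerance, instead of a fixed count."""
    rng = random.Random(seed)
    n, total, total_sq = 0, 0.0, 0.0
    while n < max_iters:
        for _ in range(batch):
            x = sample(rng)
            total += x
            total_sq += x * x
            n += 1
        mean = total / n
        var = max(total_sq / n - mean * mean, 0.0)
        stderr = math.sqrt(var / n)
        if mean != 0 and stderr / abs(mean) < rel_err:
            break  # accuracy target met; stop early
    return mean, n

# Toy risk sample: 2% failure probability, uniform $50k-$250k consequence.
mean, n = estimate_until_converged(
    lambda rng: rng.uniform(50_000, 250_000) if rng.random() < 0.02 else 0.0)
print(f"estimate ${mean:,.0f} after {n:,} iterations")
```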