In the IREQ Smart Grid Integration Lab, different data sources are combined and checked for consistency.
Evolving technologies, including the smart grid, can provide electric power utilities with unprecedented capabilities for forecasting demand, shaping customer usage patterns, preventing outages, optimizing unit commitment and more. At the same time, these advances generate data of unprecedented volume, speed and complexity. One aspect of the smart grid evolution is the omnipresence of communications and information technologies (IT), which give utilities better knowledge of the state of the grid and support more efficient decision making.
To gain insight from this information, utilities such as Hydro-Québec must combine high-volume data management with advanced analytics that transform raw data into actionable insights.
When thinking about the smart grid, it is far from obvious that the electric utility industry has all the answers on what IT architecture will support it. Even before the smart grid, utilities were struggling with IT challenges. The smart grid adds the big-data dimension, which can make things even more challenging.
More and More Data
Big data is often characterized by the four V's. It is not only about massive amounts of data, represented as volume; it is also about velocity, variety and veracity. Velocity is the speed at which utilities receive the data; a phasor measurement unit, which reports grid measurements many times per second, is a good example. Variety refers to the heterogeneity of the different data sources. The last dimension of big data, but not the least, is veracity: the accuracy and truthfulness of the data. Improving the veracity of data requires minimizing the occurrence of different sources of error, namely inconsistencies, duplication and missing data.
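The three error sources above can be screened for automatically. As a minimal sketch (the record layout and thresholds are assumptions, not a Hydro-Québec format), the following audits a batch of meter readings for duplicates, missing values and physically inconsistent values:

```python
# Hypothetical veracity audit for meter readings.
# Each reading is a (meter_id, timestamp, kwh) tuple; kwh is None when missing.

def audit_readings(readings):
    """Return counts of duplicate, missing and inconsistent records."""
    seen = set()
    report = {"duplicate": 0, "missing": 0, "inconsistent": 0}
    for meter_id, timestamp, kwh in readings:
        key = (meter_id, timestamp)
        if key in seen:
            report["duplicate"] += 1      # same meter reported twice
            continue
        seen.add(key)
        if kwh is None:
            report["missing"] += 1        # gap in the data stream
        elif kwh < 0:
            report["inconsistent"] += 1   # physically impossible consumption
    return report

sample = [
    ("M1", "2014-01-01T00:00", 1.2),
    ("M1", "2014-01-01T00:00", 1.2),    # duplicate transmission
    ("M2", "2014-01-01T00:00", None),   # missing value
    ("M3", "2014-01-01T00:00", -0.4),   # negative, hence inconsistent
]
print(audit_readings(sample))
# -> {'duplicate': 1, 'missing': 1, 'inconsistent': 1}
```

In production such checks would run continuously on incoming streams, but even this simple tally makes the quality of a data source measurable.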
In a recent survey, IBM found that one in three business leaders do not trust the information they use to make decisions. Gartner research shows that poor data quality is cited as the No. 1 reason for overrunning project costs. According to The Data Warehousing Institute, the cost of bad, or dirty, data exceeds US$600 billion for U.S. businesses annually. In an infographic, InsightSquared stated the following:
- Data quality best practices can boost revenue by 66%.
- Poor data quality across business and government costs the U.S. economy $3.1 trillion a year (insidearm.com).
- Data quality is a barrier to adopting business intelligence/analytics products for 46% of survey respondents.
Electric power utilities need accurate data and cross-sectional information to make sound business decisions. Building an enterprisewide unified information view is a complex task because of the heterogeneity and lack of consistency among the different sources. Still, decision makers need a single version of the truth. How can common ground be reached?
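To make the unification problem concrete, here is a hedged sketch (field names and source systems are invented for illustration) of reconciling the same entity from two heterogeneous systems: values that agree or appear in only one source are kept, while disagreements are flagged for resolution rather than silently overwritten.

```python
# Hypothetical merge of one entity described by two source systems,
# producing a unified record plus a list of conflicting fields.

def unify(record_a, record_b):
    """Merge two dicts describing the same entity; collect conflicts."""
    unified, conflicts = {}, {}
    for field in set(record_a) | set(record_b):
        a, b = record_a.get(field), record_b.get(field)
        if a is not None and b is not None and a != b:
            conflicts[field] = (a, b)             # sources disagree
        else:
            unified[field] = a if a is not None else b
    return unified, conflicts

# Invented example records from two hypothetical systems:
billing = {"customer": "C42", "address": "12 Main St", "phase": "A"}
outage  = {"customer": "C42", "address": "12 Main Street", "feeder": "F7"}

merged, disputes = unify(billing, outage)
# merged keeps the agreeing and one-sided fields;
# disputes holds {"address": ("12 Main St", "12 Main Street")}.
```

Whether disputes are resolved by a precedence rule, a matching algorithm or an analyst is exactly the kind of governance question the consistency checks in the lab are meant to answer.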