Digital technology grows more dynamic, and harder to understand, every day, but sometimes the most advanced technologies are simply a matter of timing. It’s connecting the dots, or perhaps bridging the gap, because the data can be understood faster or there is the flexibility to understand what the technology is saying. A non-technical person might talk about stars aligning or a perfect storm of events, but that isn’t the case here. It is taking the old “thinking outside the box” approach to a new level by grabbing existing applications and integrating them into a different function, which is where digital twinning comes in.
The digital twin was introduced almost two decades ago, but some say the concept dates back even further, to the period when the first computer-aided design (CAD) systems came into the engineering department. As CAD software matured, engineers were able to develop 3D models of what they were designing. When combined with automation, the engineers could see how their designs worked. It gave them the ability to simulate devices before they were built.
With that type of tool, it wasn’t long before engineers started asking, “What if we could monitor the actual equipment?” Maybe they could monitor the health of a device, identify problem areas, or improve efficiencies. The potential was there, and it attracted a great deal of attention. A lot of things fell into place, and, to keep it simple, these 3D CAD models evolved into the early theory of the digital twin.
Collateral Improvements
Smart technology, with its intelligent sensors and transducers, moved theory into the real world. These devices needed to become markedly more sophisticated, substantially smaller, and much cheaper, which they did. This promoted the concept of interconnectivity and fed the development of sophisticated communications systems such as today’s 5G technology. In this environment, Industrial Internet of Things (IIoT) technology became possible and brought about dynamic monitoring and control of industrial assets and processes.
It helped that high-performance computing (HPC) was developed and led the way to new applications such as cloud infrastructure, an ideal environment for the big data these systems generated. This setting is making data storage cheaper and more available to the entire enterprise. It is also a boost to big-data analytics and to the spread of asset simulations integrated with artificial intelligence (AI) and augmented reality. Overall, this combination of the physical world with smart technology is being called Industry 4.0, but that subject covers a flock of interesting topics that need exploration, and, like eating the proverbial elephant, digital twinning will be our first bite.
What Is a Digital Twin?
The digital twin has been compared to a bridge between the real world and the virtual world, one that has produced tangible tools for heavy industry. Granted, that comparison is a simplified summation, but it reflects how everything in the digital technology realm is interrelated in one way or another. Before moving on with the digital twinning discussion, it is important to define exactly what digital twins are. Typically a digital twin is described as a digital copy of a physical asset, but that description only scratches the surface; a digital twin is much more than that.
To quote GE Digital, “Digital twins are software representations of assets and processes that are used to understand, predict, and optimize performance in order to achieve improved business outcomes. Digital twins consist of three components: a data model, a set of analytics or algorithms, and knowledge.”
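GE’s three-component framing can be made concrete with a small sketch. The example below is purely illustrative, not any vendor’s implementation: the `TransformerTwin` class, its readings, and its threshold are all hypothetical, standing in for the data model (the asset’s mirrored state), the analytics (an algorithm run over that state), and the knowledge (a limit drawn from operating experience).

```python
from dataclasses import dataclass, field
from statistics import mean

@dataclass
class TransformerTwin:
    """A toy digital twin of a transformer (hypothetical example)."""
    asset_id: str
    # Data model: recent oil-temperature readings mirrored from the asset.
    temps_c: list = field(default_factory=list)
    # Knowledge: an alarm threshold drawn from operating experience.
    alarm_threshold_c: float = 95.0

    def ingest(self, reading_c: float) -> None:
        """Mirror a new sensor reading into the twin's state."""
        self.temps_c.append(reading_c)

    def health_check(self) -> str:
        """Analytics: flag the asset when its recent average runs hot."""
        if not self.temps_c:
            return "no data"
        avg = mean(self.temps_c[-10:])  # rolling window of last 10 readings
        return "alarm" if avg > self.alarm_threshold_c else "normal"

twin = TransformerTwin("XFMR-42")
for t in (88.0, 91.5, 90.2):
    twin.ingest(t)
print(twin.health_check())  # prints "normal" at these temperatures
```

Real twins replace the list with streaming telemetry and the threshold with physics-based or machine-learned models, but the division of labor among the three components is the same.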
Digital twin technology is being used by many industries, including aerospace, defense, healthcare, transportation, manufacturing, and energy. Heck, it’s even been used in Formula 1 racing for several years. More end users are coming onboard all the time, and the list of major players in the market grows every day, too. It includes companies such as ABB, Accenture, Cisco, Dassault Systèmes, General Electric, IBM, Microsoft, Oracle, Schneider Electric, and Siemens, to name a few.
It is definitely a growth market, and a quick check shows some interesting figures. Depending on which study is read or which expert is quoted, the global marketplace was about US$3.8 billion in 2019, and projected growth estimates range from US$35 billion to US$40 billion by 2025, at a compound annual growth rate (CAGR) anywhere from 37% to 40%. No matter which figures are picked, the common denominator is that the market is growing, and at an attention-getting pace.
Growth is being driven by the benefits digital twin technology offers, such as asset management, real-time remote monitoring, real-time and predictive performance evaluation, prediction of equipment failure, and other money-saving advantages. For the grid, probably the most promising digital twin feature is improved reliability and resiliency through better situational awareness. Being able to mine big data for actionable information has proven helpful in predicting delays and unplanned downtime. The takeaway for any business is simple: there is a digital twin in its future.
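To illustrate what “mining data for actionable information” can mean at its simplest, here is a toy trend extrapolation, an assumption of this article, not any utility’s actual method: fit a least-squares slope to a series of sensor readings and estimate how many samples remain before an assumed failure threshold is crossed.

```python
def samples_until_threshold(readings, threshold):
    """Fit a least-squares linear trend to the readings and
    extrapolate from the latest reading to the threshold.
    Returns None when there is no upward trend."""
    n = len(readings)
    xs = range(n)
    x_bar = sum(xs) / n
    y_bar = sum(readings) / n
    # Ordinary least-squares slope of reading vs. sample index.
    slope = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, readings)) \
            / sum((x - x_bar) ** 2 for x in xs)
    if slope <= 0:
        return None  # flat or improving: no predicted failure
    return (threshold - readings[-1]) / slope

# Vibration rising about 0.5 units per sample toward a threshold of 10.0.
print(samples_until_threshold([6.0, 6.5, 7.0, 7.5, 8.0], 10.0))  # prints 4.0
```

Production systems use far richer models (physics-based simulation, machine learning), but the payoff is the same: turning raw telemetry into a maintenance decision before the outage happens.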
Need For Standards
That said, the power delivery industry hasn’t been the quickest to deploy digital twins. Cloud-based applications like digital twinning bring challenges: selecting the correct data, validating the model, maintaining the process, and defending against cybersecurity threats, to name a few. There are also some very real interoperability concerns (i.e., the digital twins from one supplier may not play well with those from another).
There are no standardized digital twin platforms, and that is a major speed bump for widespread deployment by utilities. It’s not hard to imagine a utility, or several interconnected utilities, having a gaggle of digital twins that will not operate together. It is reminiscent of the early days of the smart grid, when intelligent electronic devices (IEDs) with peer-to-peer protocols were being introduced.
In those early days, IEDs offered amazing features and benefits, but only a few utilities took advantage because doing so meant sole-sourcing from one supplier, and that kept most utilities on the sidelines. It didn’t take long for all the stakeholders to get behind the development of vendor-agnostic interoperability standards such as IEC 61850. It was hard work, but the results speak for themselves: IEDs have developed into plug-and-play systems that are in use around the world. The same needs to happen in digital twinning, but first let’s look at some examples of digital twin use.
Digital Twin Projects
Back in 2015, GE Renewables introduced the first digital wind farm to the world. The turbines had sensors and transducers throughout their assemblies monitoring how each turbine was working. These monitoring devices sent big-data to a remote operations center where the digital twin powered by GE’s Predix software provided visualizations and advanced analytics for the operators. Today GE reports it has more than 15,000 wind turbines operating in the digital twin mode.
American Electric Power (AEP) recently announced it has contracted with Siemens to provide a digital twin of its transmission system. Siemens reported, “The AEP project is the largest and most complex to date, partly because AEP’s presence extends from Virginia to Texas. Not only is the digital twin enhancing the utility’s previous data governance strategy, the system has to be flexible enough to accommodate its continued evolution by allowing 40 AEP planners in five states access to the model and to make changes as needed, too.”
Siemens also said, “AEP also wanted a system to help it automatically perform functions that up to now have been executed manually, such as assuring data compliance with the number of regulatory agencies in the eleven states it serves. The system will ensure reliability and reduce outages in a network that consists of conductors (cables) made of different physical materials spanning varying topographies and differing climates.”
According to a press release from Principle Power, the Department of Energy (DOE) has given a US$3.6 million grant to a consortium of partners led by Principle Power, including Akselos SA, the American Bureau of Shipping, the University of California, Berkeley, and others. The funding will be used to develop, validate, and operate DIGIFLOAT, the world’s first digital twin software designed for floating offshore wind farms, on the WindFloat Atlantic project.
Another recent press release announced National Grid was partnering with Utilidata and Sense to create a pilot project that is a first-of-a-kind digital twin application. It’s a virtual model that will represent an “end-to-end image of their electric grid.” It will be capable of mapping power flow, voltage, and infrastructure from the substation into the home. The goal is to demonstrate the value of real-time data across the grid.
Digital twinning is making inroads into the electric grid, and that isn’t surprising. After all, controlling the grid is all about data and being able to act on it. To paraphrase some experts: those failing to take advantage of digital twins will be left behind.