Where would we be without servers?
To most people, Sept. 4, 1882, is not much more than a blip on the radar. However, in the 138 years between then and now, we have gone from Thomas Alva Edison's Pearl Street Station, serving 82 customers with 400 lamps, to the complex, interconnected, multivoltage modern power grid, arguably the engineering marvel of the 20th century. And wow, have we benefited.
Electricity empowers us. With it, we have commerce, communication, healthcare, controlled environments, food, lighting, and the list goes on. Even the briefest of power outages is enough to jarringly remind us how important electricity is across every aspect of our lives. Keeping this behemoth that is the modern power grid available and resilient, and thus able to minimize potential outages, is no small task. Equipment, coordination, predictions, and yes, software, all come together to create a resilient, reliable grid that benefits us all.
Enter control systems
As the grid has grown in complexity, the measures needed to keep it operating and stable have evolved as well. Over time, we have gone from actual knobs and dials to complex algorithms, digital models, and forecasts. A sea of data from a myriad of devices zaps back and forth along the grid every few seconds. Somewhere deep within the utilities' control centers, a supervisory control and data acquisition (SCADA) system, or energy management system (EMS), consumes this data, processes the information, performs computations and predictions, and then sends new parameters back out into the grid. The time involved is usually measured in seconds, and the benefits to all of us are, of course, extreme. But can servers in control systems wind up making the world go...well, slower?
Without the right strategy in place, yes. When we think of the word 'server,' we automatically think fast! Powerful! Expensive! And servers themselves are not slow. But the timing of data transfers across the control system and its components can cause delays and performance issues. For example, data is collected in the field, packaged into a format for transport, then shipped over to the control center every 2 to 4 seconds or so. Once there, the server takes over, opening the package, processing the data, and storing it for near-immediate use by many other applications and services. Here, the calculations are done and the settings determined. Only when the results are shipped back to the field, often seconds or minutes later, do the devices learn their new operating parameters.
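To put rough numbers on that loop, here is a minimal back-of-the-envelope sketch in Python. The figures are illustrative assumptions drawn from the rough timings above, not measurements from any particular SCADA or EMS.

```python
# Illustrative latency budget for a centralized control-system round trip.
# All figures are assumptions for illustration only.

scan_interval_s = 3.0      # field data packaged and shipped every 2 to 4 seconds
transport_s = 0.5          # transit from the field to the control center
processing_s = 2.0         # unpack, store, run calculations, determine settings
return_trip_s = 0.5        # ship new operating parameters back to the field

round_trip_s = scan_interval_s + transport_s + processing_s + return_trip_s
print(f"Centralized round trip: roughly {round_trip_s:.1f} seconds")
```

Even with generous assumptions, the device at the edge waits multiple seconds before it learns how to react.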
So is this a big deal today? Not yet. Will it ever be? Yes.
Think of it this way. If you are reading this, you probably are of an age group that can still remember the ancient times of going to a library. The library was a little bit like the server we keep referring to: the centralized place where data went to and came from. When you wanted a book, you went there. Using that well-known marvel, the Dewey Decimal System, created in 1876, you could navigate and find topics in a jiffy. Somewhat like finding and buying a book today, right?
But today when you want a book, you do one search and get thousands of results. A particular book may be 3 miles down the road at the library or on the other side of the country, but it doesn't much matter to you — you get the one you want, as you want, when you want. The confines are no longer dictated by one spot — the world of books has been truly distributed.
Back around the electricity corner, we still have servers, but they are doing different things, because things have been changing fast. Not too long ago, there were very few generation stations, but there were also very few users. Fast forward a bit and we have millions of users and thousands of generation stations. The new FERC Order 2222 pretty much guarantees it.
Distributed generation (DG), renewable energy, and distributed energy resources (DERs) are definitely taking ground from the large, centralized plants we have grown accustomed to. Today's huge, heavy, spinning rotors see changes that play out over seconds, minutes, or hours. In the new DER world, everything changes. There are thousands of lightweight, distributed devices sending information in real time. Everything is starting to look like a solid-state world, one with no moving parts.
Doing server things in the future
The path from large centralized plants to thousands of connected small generation points will be long, regionalized, and different. In this new grid, there will be physical connection points where energy flows, and there will also be digital connections where data flows. Today, at the risk of a hasty generalization, the grid could operate with just the physical connection, in what is known as 'Grid Following' mode.
In the Grid Following world, devices are capable of generating and contributing to the grid, so long as the grid is there to provide coordination information. If the grid goes away during an outage, these devices trip offline and wait for the grid to come back. This presents a multitude of issues that utilities are facing today, but that is a topic for another time.
In our near future, the world will continue moving to the data-centric Grid Forming state, relying on the digital communication between thousands of devices to perform grid functions. This is closer to the solid-state grid, where the devices and inverters are communicating and controlling what the grid looks like. Hundreds or thousands of messages will be transmitted every second among the devices. In this world, today's round trip from the device to the server at the utility and back will take an eternity.
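As a rough comparison, here is a short sketch in Python using the figures above. The message rate and the centralized round-trip time are assumed, illustrative values, not measurements from any specific system.

```python
# Illustrative comparison of timescales in a Grid Forming world.
# Both figures are assumptions for illustration only.

messages_per_second = 1000                     # "hundreds or thousands of messages every second"
peer_interval_s = 1.0 / messages_per_second    # about 1 ms between peer-to-peer messages

central_round_trip_s = 6.0                     # multi-second device -> utility server -> device loop

ratio = central_round_trip_s / peer_interval_s
print(f"Peer-to-peer message interval: {peer_interval_s * 1000:.0f} ms")
print(f"A centralized round trip spans roughly {ratio:.0f} peer exchanges")
```

A control loop that spans thousands of peer exchanges simply cannot keep up with devices that coordinate on millisecond timescales.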
So where did servers go?
Nowhere. They are still there doing their server-like things. Today, they are doing the same job, yet they are surprisingly unaware of many of the changes going on in the field. It won't stay that way.
Each day, we all expect to hear more and more about renewable energy goals and edicts. We also expect to hear more on the economics and practice of retiring fossil fuel plants in favor of renewables and DERs. Because of these two trends, the life of the server will continue to change. It, too, will evolve.
Not so much taken away as reassigned
Servers are a key piece of our existing engineering marvel, and their role will continue. What we have already seen in closely related, highly autonomous environments is that the role of the server shifts but does not go away. The server in the modern and future utility control system takes on a role similar to that of your favorite book-buying website: it is a clearinghouse for the information and its whereabouts, but it does not ship the book. Instead, it sends the desired outcome to the participating locations, and they decide and react accordingly.
This level of autonomy is not new. Many people are surprised at the levels of autonomous operation currently found in other markets, everything from self-driving and self-flying vehicles to massive mining trucks to robots that float around in the ocean, shooting lasers at salmon to pick off tiny lice. The Object Management Group's (OMG) Data Distribution Service (DDS) standard is one of the premier ways to meet these new operational challenges. By distributing the computing power and, in essence, connecting devices and applications to a software databus, information (organized as topics, in DDS lingo) is exchanged among different devices and disparate systems in real time. And it's done securely and without the latency or risk of relying on one specific server.
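To illustrate the databus idea (though not the actual DDS API), here is a minimal, in-process publish/subscribe sketch in plain Python. The topic name, data fields, and subscriber behavior are hypothetical, and a real DDS implementation adds discovery, quality-of-service policies, security, and network transport.

```python
from collections import defaultdict
from dataclasses import dataclass
from typing import Callable

@dataclass
class InverterStatus:
    """Hypothetical data type an inverter might publish on a topic."""
    device_id: str
    frequency_hz: float
    output_kw: float

class Databus:
    """Toy topic-based publish/subscribe bus, illustrating the concept of
    peers exchanging data directly rather than through one central server."""
    def __init__(self) -> None:
        self._subscribers: dict[str, list[Callable]] = defaultdict(list)

    def subscribe(self, topic: str, callback: Callable) -> None:
        self._subscribers[topic].append(callback)

    def publish(self, topic: str, sample) -> None:
        # Every subscriber to this topic sees the data as soon as it is
        # published, with no central server in the loop.
        for callback in self._subscribers[topic]:
            callback(sample)

bus = Databus()
bus.subscribe("InverterStatus",
              lambda s: print(f"{s.device_id}: {s.frequency_hz:.2f} Hz, {s.output_kw} kW"))
bus.publish("InverterStatus",
            InverterStatus(device_id="inv-42", frequency_hz=59.98, output_kw=12.5))
```

In a real deployment, each device or application would be a separate process on the network, discovering topics and peers through the middleware rather than sharing an in-memory object.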
Welcome to the grid of the future. I'd like to think Thomas Edison would be proud.