The power grid has seen substantial growth in smaller-scale renewables, such as household, community, and commercial solar and battery systems. The high volume of interconnected distributed energy resources (DERs) has prompted the energy industry to adopt control capabilities and standards to improve grid operations with smart inverters.
Many electric utilities are still working to identify the most beneficial operating settings for their distribution grids to address the high-voltage and voltage-fluctuation concerns created by the growing volume and density of DERs.
Accelerated Growth
Over the past decade, hundreds of thousands of solar power plants have been connected to the U.S. power grid at an annual growth rate of 24%, according to the Solar Energy Industries Association. In each of the past four years, the majority of generating capacity added to the grid has come from solar energy. Solar's share of total U.S. electrical generation grew from just 0.1% in 2010 to nearly 5% in 2024. Cumulative solar deployment is expected to nearly triple by 2028, as the Inflation Reduction Act provides tax incentives that could spark nationwide demand. Despite the proliferation of solar energy in recent years, steady growth is still needed to underpin the clean energy transition and achieve the pace required for a net-zero energy sector.
The sheer volume of solar power systems tied to the grid now, and expected in the future, presents a challenge. Because smaller-scale solar is typically not owned or managed by grid operators, these installations can make it difficult to control operating conditions on the distribution grid, which can lead to system instability.
Maintaining Stability
Grid operators are responsible for ensuring their T&D systems operate reliably for the benefit of all energy consumers. For most of the past 100 years, large-scale generating plants, such as those powered by coal, natural gas and nuclear fission, were located far from consumers, and electricity flowed only one way through T&D systems: from the centralized generation source to the end user. Today, the high penetration of DER installations on distribution networks affects power and voltage stability, making operations far more complex. DERs require the grid to handle two-way electricity flow, as these systems inject the excess power they generate back into the grid.
Smart Inverters
The electric system is designed to operate within certain voltage and power factor ranges. Smart inverters, which control how DER output is converted and injected into the grid, are increasingly available with settings that can help stabilize grid operations compared to DERs simply operating at unity power factor. In response to a change in system voltage or in the DER's own power output, a smart inverter adjusts its real and reactive power to help keep the system within its operating ranges.
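To make that behavior concrete, the sketch below implements a piecewise-linear volt-VAR curve in Python. The breakpoints and the 0.44 per-unit reactive limit are illustrative assumptions for this article, not settings recommended by the study.

```python
# Minimal sketch of an autonomous volt-VAR response.
# Breakpoints are illustrative only; actual settings come from
# standards work (e.g., IEEE 1547-2018) or utility-specific studies.

def volt_var(v_pu: float) -> float:
    """Return a reactive power command (fraction of rating, + = injecting)
    for a measured voltage in per-unit, via a piecewise-linear curve."""
    # (voltage_pu, q_fraction): inject VARs at low voltage, hold zero
    # in the deadband, absorb VARs at high voltage.
    curve = [(0.92, 0.44), (0.98, 0.0), (1.02, 0.0), (1.08, -0.44)]
    if v_pu <= curve[0][0]:
        return curve[0][1]
    if v_pu >= curve[-1][0]:
        return curve[-1][1]
    for (v1, q1), (v2, q2) in zip(curve, curve[1:]):
        if v1 <= v_pu <= v2:
            # Linear interpolation between adjacent breakpoints.
            return q1 + (q2 - q1) * (v_pu - v1) / (v2 - v1)
    return 0.0

# A high feeder voltage makes the inverter absorb reactive power,
# pulling the local voltage back toward its nominal range.
print(volt_var(1.05))  # about -0.22 (absorbing)
```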
DER settings can improve grid operations, especially when distribution feeders have a high penetration of DERs. Anticipating the need to make DER devices fully compliant with modern grid support standards, EPRI and seven electric utilities collaborated on a study to identify which DER settings would benefit the distribution system in specific operational scenarios.
“For most circuits in the electrical distribution grid, a common set of DER settings is sufficient to ensure a safe and reliable grid,” said Ernest Palomino Sr., distribution planning engineer, Salt River Project. “As the amount of DERs increase on a particular circuit, the ability to provide power safely and reliably may diminish. Modifying DER settings could provide another solution to maintain grid integrity that currently is limited to infrastructure upgrades.”
To represent broad distribution conditions across multiple service territories, each participating utility provided 40 feeder circuits to be analyzed for potential DER additions and hosting capacity, that is, the amount of DERs that can be accommodated without adversely impacting power quality or reliability on a given electric system. EPRI designed the project so results could be evaluated from feeders in operation, not just synthetic feeders in an academic study.
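Conceptually, a hosting capacity screen grows the DER injection at a study location and re-solves a power flow until an operating limit is reached. The sketch below is a minimal illustration that uses a toy linear voltage-rise model in place of a real distribution solver such as OpenDSS; the voltage limit and sensitivity are assumed values.

```python
# Conceptual hosting capacity sweep. The toy power flow models voltage
# rise as linear in injected power; a real study would use a full
# distribution solver (e.g., OpenDSS) and check thermal limits too.

V_MAX_PU = 1.05  # assumed upper service voltage limit (ANSI C84.1 Range A)

def toy_power_flow(base_v_pu: float, der_kw: float,
                   sens_pu_per_kw: float = 3e-5) -> float:
    """Toy model: bus voltage rises linearly with DER injection."""
    return base_v_pu + sens_pu_per_kw * der_kw

def hosting_capacity_kw(base_v_pu: float, step_kw: float = 10.0,
                        limit_kw: float = 5000.0) -> float:
    """Grow DER injection until the next step would breach V_MAX_PU."""
    der_kw = 0.0
    while der_kw + step_kw <= limit_kw:
        if toy_power_flow(base_v_pu, der_kw + step_kw) > V_MAX_PU:
            break  # the next increment would violate the voltage limit
        der_kw += step_kw
    return der_kw

# A bus already running near the limit hosts far less DER than one
# with voltage headroom.
print(hosting_capacity_kw(1.03))  # 660.0 kW
print(hosting_capacity_kw(1.00))  # 1660.0 kW
```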
Broadly applying a relevant DER setting can be a good starting point before DER penetration becomes unmanageable. The goal of this EPRI project was to investigate which settings were most likely to improve overall grid behaviors when applied universally. This understanding can then serve as a springboard to customize DER settings for locations that need a different operational approach. Starting with a good default setting can reduce the need for in-depth engineering to adjust feeder operations or delay the need for capital investment to increase capacity.
Performing the analysis for so many feeders helped engineers identify which settings help the grid most often and in the most locations. It simplified the choice of which settings would be of maximum value, providing a statistically meaningful result rather than relying on findings from a few sample feeders whose performance may be too narrow to represent the grid comprehensively.
“Utilities throughout the U.S. are experiencing different levels of DER penetration. Participating in a study with a diverse set of circuit load types, circuit length, and DER penetration allowed the utility to understand the optimal DER settings for a variety of future scenarios,” Palomino noted.
Study Results
The DER functions simulated in the project incorporated a series of settings applied to the solar photovoltaic inverter, including volt-VAR, watt-VAR, volt-watt, volt-VAR combined with volt-watt, and constant power factor. In electric power T&D, the volt-ampere reactive (VAR) is the unit of measure for reactive power, the energy that circulates back and forth between the source and the load.
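For intuition on how these functions differ, the sketch below implements simplified versions of three of them: volt-watt curtails real power based on voltage, watt-VAR absorbs reactive power based on the inverter's own real output, and constant power factor fixes the ratio of reactive to real power regardless of grid conditions. The curve parameters are illustrative assumptions, not the settings evaluated in the study.

```python
import math

# Illustrative parameters only; real curves are set per IEEE 1547-2018
# categories or tailored through utility studies.

def volt_watt(v_pu: float, p_avail_kw: float) -> float:
    """Curtail real power linearly above 1.06 pu, reaching zero at 1.10 pu."""
    if v_pu <= 1.06:
        return p_avail_kw
    if v_pu >= 1.10:
        return 0.0
    return p_avail_kw * (1.10 - v_pu) / 0.04

def watt_var(p_fraction: float) -> float:
    """Absorb VARs in proportion to real output above 50% of rating."""
    return -0.44 * max(0.0, (p_fraction - 0.5) / 0.5)

def constant_pf_q_kvar(p_kw: float, pf: float = 0.95) -> float:
    """Reactive power (kVAR, negative = absorbing) at a fixed power factor."""
    return -p_kw * math.tan(math.acos(pf))

print(volt_watt(1.08, 100.0))               # ~50 kW: half the output curtailed
print(watt_var(1.0))                        # -0.44: full absorption at rated output
print(round(constant_pf_q_kvar(100.0), 1))  # -32.9 kVAR at 0.95 PF, always on
```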
For individual feeder studies, displaying a feeder circuit as a heat map illustrates that different locations may benefit more or less from different DER settings. Considering the reactive power magnitude and behavior of each setting, the hosting capacity at many locations can be improved most by using a constant, absorbing power factor setting. Results depend on the size and location of customer demand, any existing DERs, conductor sizes, and feeder devices like capacitors and voltage regulators. At a given location, compared to the best settings option, the hosting capacity performance of all settings can range from very similar behavior to dramatically reduced benefits.
To avoid the challenge of selecting location-dependent DER settings, researchers performed this individual feeder study on the entire set of feeder models and compiled the results. Data analysis on thousands of locations across 200 feeders enabled researchers to compare performance between DER settings, determine the best performing setting on average, and filter results by power flow or feeder characteristics. Key findings of the study were as follows:
- Smart inverter settings improved hosting capacity on average over unity power factor.
- Numeric scores provide context to balance hosting capacity with reactive power and curtailment.
- Volt-VAR offers the best combination of hosting capacity improvement while minimizing curtailment and reactive energy.
- Constant power factor offers the most hosting capacity improvement, at the cost of always-on reactive power.
- Final scores are highly dependent on the chosen weights, as the sketch below illustrates.
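The weight sensitivity in that last finding can be demonstrated with a simple weighted-sum score. The metric values and weights below are hypothetical placeholders chosen for illustration; they are not the study's scoring model or its results.

```python
# Hypothetical weighted scoring of smart inverter settings. Metrics are
# normalized 0-1 (higher is better) and invented for illustration; they
# are not figures from the EPRI study.

settings = {
    #              hosting capacity, low curtailment, low reactive energy
    "constant_pf": (0.95, 0.90, 0.20),
    "volt_var":    (0.70, 0.85, 0.85),
    "volt_watt":   (0.50, 0.40, 0.95),
}

def score(metrics, weights):
    """Weighted sum of the normalized metrics."""
    return sum(m * w for m, w in zip(metrics, weights))

for weights in [(0.6, 0.2, 0.2), (0.2, 0.2, 0.6)]:
    best = max(settings, key=lambda s: score(settings[s], weights))
    print(f"weights {weights} -> best setting: {best}")
# Weighting hosting capacity heavily favors constant_pf; weighting
# low reactive energy heavily shifts the winner to volt_var.
```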
Real-World Application
Electric utilities are already using these study results to determine default settings for upcoming DER interconnections and to coordinate the chosen setting with local utility commissions. The research indicates that leaving DERs at the legacy unity power factor default will not create the necessary grid benefits. The volt-VAR setting strikes a balance between excessive reactive power magnitude and the appropriate timing to use that reactive power. There are some locations where targeted use of other settings is best, and this EPRI study shows how that can be applied on a case-by-case basis. This approach simplifies the effort needed to manage settings for all interconnected DERs.
Providing guidance to engineers tasked with applying smart inverter functions, this two-year project is the largest of its kind to date. Using EPRI’s smart inverter settings screening methodology, the project extracted key findings and final analytical results from more than 200 feeders across multiple utilities. The study provides data-backed recommendations for default settings utilities can take to their regulators, so that DERs can be integrated at a faster pace. Improving overall DER performance can drive stakeholder satisfaction, support operational performance and advance compliance with national clean energy goals.
Stephen Kerr ([email protected]), senior technical leader, DER grid impacts analysis at EPRI, has nearly 15 years of combined experience in utility engineering assignments and research activity. He was involved in electric customer construction, distribution planning, substation design, system protection and DER integration in previous roles at Arizona Public Service. Kerr currently performs and guides EPRI distribution feeder research analysis to understand the technical implications of autonomous and managed DERs, including smart inverter settings on feeder hosting capacity and coordination of smart inverter settings with grid-voltage regulation devices.
Devin Van Zandt ([email protected]), senior program manager, integration of distributed energy resources at EPRI, has more than 30 years of industry experience and is involved in distributed energy resource integration. His current focus is on improving the steady-state and dynamic modeling and simulation aspects of DER integration and understanding how autonomous and managed DER capabilities impact distribution systems.