The U.S. electric transmission and distribution system is among the most efficient in the world. Yet, according to the Energy Information Administration (EIA), about 5% of total generated electricity is lost each year during transmission and distribution. In 2020, that translated into a loss of 207 million MWh, worth $6.21 billion at an average wholesale electricity price of $30/MWh. At a time when fuel prices are consistently hitting new highs and electricity customers are struggling with inflation on all fronts, are power companies doing enough to mitigate T&D losses?
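As a back-of-the-envelope check of those figures (a sketch using the article's round numbers, which are approximations of EIA data rather than precise line items):

```python
# Back-of-the-envelope check of the loss figures quoted above.
energy_lost_mwh = 207_000_000   # ~207 million MWh lost in 2020 (EIA)
avg_wholesale_price = 30.0      # $/MWh, average wholesale price

loss_value = energy_lost_mwh * avg_wholesale_price
print(f"Estimated value of T&D losses: ${loss_value / 1e9:.2f} billion")
# -> Estimated value of T&D losses: $6.21 billion

# Implied total generation if losses are ~5% of the total:
total_generation_mwh = energy_lost_mwh / 0.05
print(f"Implied total generation: {total_generation_mwh / 1e6:,.0f} million MWh")
# -> ~4,140 million MWh, consistent with U.S. annual generation
```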
A Pacific Northwest National Laboratory study reported that one-third of total T&D losses are attributable to distribution transformers. The study projected that these losses would be reduced significantly by the higher efficiency standards that took effect for this class of transformers in 2016. Yet no appreciable change occurred in estimated T&D losses from 2016 to 2020, perhaps because the transformer turnover needed to realize the expected benefits plays out over a 20- to 30-year time frame. The DOE is now completing another mandated six-year review of distribution transformer standards and may soon propose amended requirements (see Docket EERE-2019-BT-STD-0018-0022). As before, any revised standards will likely apply only to new equipment.
Utilities historically have been unable to financially justify comprehensive distribution transformer replacement programs on efficiency improvement alone. The same applies to transmission and distribution lines, where a complex set of design and installation parameters affects conductor, dielectric, reactive current, and sheath losses. Aggregate losses are greatest on distribution networks, and, despite the trend of undergrounding distribution systems for reliability reasons, studies show underground cable losses are often higher than overhead conductor losses.
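To see why conductor losses are so sensitive to loading, consider the basic three-phase I²R relationship. The sketch below is illustrative only; the feeder voltage, resistance, and loading values are hypothetical, not drawn from any study cited here.

```python
import math

def line_loss_kw(load_kw: float, kv: float, pf: float,
                 r_ohm_per_km: float, km: float) -> float:
    """Three-phase resistive (I^2 * R) line loss for a given load.

    Current per phase: I = P / (sqrt(3) * V_LL * pf)
    Loss:              P_loss = 3 * I^2 * R
    """
    i_amps = (load_kw * 1e3) / (math.sqrt(3) * kv * 1e3 * pf)
    r_total = r_ohm_per_km * km
    return 3 * i_amps**2 * r_total / 1e3  # kW

# Hypothetical 12.47-kV feeder, 8 km long, 0.3 ohm/km per phase:
for load in (2000, 4000):  # kW
    loss = line_loss_kw(load, kv=12.47, pf=0.95, r_ohm_per_km=0.3, km=8)
    print(f"{load} kW load -> {loss:.0f} kW loss ({100 * loss / load:.1f}%)")
# Doubling the load quadruples the I^2*R loss and doubles the loss percentage.
```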
Regulatory policies in many states complicate the utility efficiency investment picture by imposing energy efficiency resource standards that allow only end-use efficiency to count toward program targets. The result, while positive in some respects, is that resources are focused on helping customers implement energy efficiency initiatives, while no incentives exist for transmission and distribution investments that could have the same or greater effect in reducing electrical losses and, accordingly, energy use.
Improving T&D Efficiency
Despite the regulatory situation and the magnitude of the challenge of replacing the legacy infrastructure with the highest electrical losses (distribution conductors and transformers), utilities are doing quite a bit to improve T&D efficiency, including factoring loss reduction into the design of new capacity and reliability investments being made for other reasons. This includes distribution capacitor installation, conservation voltage reduction, phase balancing, upgrading voltage class, and even targeted transformer and conductor upgrades. In addition, utility demand response, interruptible loads, and similar programs that reduce system loading during peak periods improve efficiency and reduce losses. Further, circuit sensors, advanced metering infrastructure (AMI), and system modeling and analytics tools help planners determine with real data where losses are greatest and what effect circuit modifications will have.
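Two of those measures, power factor correction and voltage class upgrades, can be reasoned about with a simple scaling rule: for a fixed real power delivery, line current varies inversely with both voltage and power factor, so I²R losses scale with 1/(V·pf)². The numbers below are hypothetical, chosen only to illustrate the scaling.

```python
def relative_loss(v_old: float, pf_old: float,
                  v_new: float, pf_new: float) -> float:
    """Ratio of new I^2*R losses to old, holding delivered real power fixed.

    I is proportional to P / (V * pf), so losses scale with
    (v_old * pf_old / (v_new * pf_new)) ** 2.
    """
    return (v_old * pf_old / (v_new * pf_new)) ** 2

# Capacitor banks raising power factor from 0.85 to 0.98 (same voltage):
print(f"PF correction: {relative_loss(1, 0.85, 1, 0.98):.2f}x original losses")
# -> ~0.75x, i.e. roughly a 25% reduction in I^2*R losses

# Upgrading a feeder from 12.47 kV to 24.94 kV (same power factor):
print(f"Voltage upgrade: {relative_loss(12.47, 1, 24.94, 1):.2f}x original losses")
# -> 0.25x, i.e. a 75% reduction in I^2*R losses
```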
ETAP, IBM, Shaw Power Technology, Inc., Siemens, ABB, MilSoft, PSE, and many other firms offer power system design and simulation tools and services. National Rural Electric Cooperative Association (NRECA) members employ an additional Excel-based tool from Leidos that helps cooperatives evaluate losses, analyze costs and benefits, and identify optimized loss-mitigation techniques. The software helps users methodically dissect demand and energy loss components on network feeders; factor in the cost of losses and reduction options using discounted cash flow assessment; and determine total ownership costs and benefit-to-cost ratios of loss reduction techniques. Study results may lead cooperatives to implement phase (load) balancing; manual or automated feeder reconfiguration; voltage optimization; power factor correction with capacitor banks; upgrading primary, secondary, and service sizing; adding feeders; upgrading voltage class; or substation and distribution transformer changeouts.
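The economic side of such a study amounts to standard discounted-cash-flow math. The sketch below is a simplified, hypothetical version of that kind of analysis, not the Leidos tool itself; all inputs are illustrative.

```python
def npv(cashflows: list[float], rate: float) -> float:
    """Net present value of yearly cash flows (year 1 onward)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows, start=1))

# Hypothetical reconductoring project: $250k up front,
# saving 400 MWh/yr of losses valued at $40/MWh over 30 years.
capital_cost = 250_000
annual_savings = 400 * 40        # $/yr
discount_rate = 0.06
life_years = 30

benefits = npv([annual_savings] * life_years, discount_rate)
bc_ratio = benefits / capital_cost
print(f"PV of loss savings:    ${benefits:,.0f}")
print(f"Benefit-to-cost ratio: {bc_ratio:.2f}")
# A ratio below 1.0 helps explain why loss reduction alone
# rarely justifies wholesale equipment replacement.
```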
More utilities are moving to mega solutions to manage a full range of issues on their distribution networks, including loss reduction, power quality, reliability, and resiliency. Advanced distribution management system (ADMS) software platforms use digital sensors and switches with advanced control and communication technologies to provide phase balancing; volt/volt-ampere reactive (volt/VAR) optimization; conservation voltage reduction; peak demand management; and automated fault location, isolation, and service restoration.
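Conservation voltage reduction effects are commonly summarized with a CVR factor: the percent energy reduction per percent voltage reduction. Here is a minimal sketch with an assumed CVR factor of 0.8 (actual values are feeder-specific; published figures commonly fall roughly between 0.5 and 1.0) and a hypothetical feeder load.

```python
def cvr_energy_savings(annual_mwh: float, voltage_reduction_pct: float,
                       cvr_factor: float = 0.8) -> float:
    """Estimated annual energy savings from conservation voltage reduction.

    CVR factor = % energy reduction per 1% voltage reduction
    (0.8 is an assumed default; measure it per feeder in practice).
    """
    return annual_mwh * (voltage_reduction_pct * cvr_factor) / 100

# Hypothetical feeder serving 50,000 MWh/yr with a 2% voltage reduction:
savings = cvr_energy_savings(50_000, voltage_reduction_pct=2.0)
print(f"Estimated savings: {savings:,.0f} MWh/yr")  # -> 800 MWh/yr
```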
The passage of FERC Orders 841 and 2222 may be contributing to some utilities' decisions to install another grid management mega solution: distributed energy resource management systems (DERMS), which assist with distribution system optimization and the integration of DERs. DERMS help utilities manage grid-edge conditions such as local over/under voltage, frequency and loading issues, and increased intermittency. DERMS advocates believe DERs working in harmony with utility resources can help network operators optimize system performance and loss reduction. From system design to equipment turnover and operations improvements, utilities are increasingly making decisions that contribute to loss reduction. It will be interesting to see whether the recent pattern of 5% T&D losses starts to improve more quickly.