Utilities demonstrate that voltage optimization saves energy, lowers demand and reduces reactive power needs with no negative customer impacts.
Distribution system efficiency (DSE) is one of the most energized subjects in the electric utility industry today. The distribution system actually can be considered an untapped, low-cost energy resource. As a result, utilities are beginning to incorporate efficiency measures into their overall planning and operation strategies.
This is the first in a series of T&D World articles that will focus on the experiences and knowledge of utilities that are adapting their electric systems to be more efficient. This begins with the DSE findings of the Northwest Energy Efficiency Alliance (NEEA) Distribution Efficiency Initiative (DEI) project completed in 2007.
NEEA's Distribution Efficiency Initiative
The three-year DEI research project studied and evaluated cost-effective methods of implementing DSE and voltage optimization (VO) on distribution systems. The project quantified the relationship between real energy kilowatt-hour consumption, kilowatt demand, reactive power (kilovar) demand and applied average voltage. The project team used various distribution design and operational techniques to optimize distribution system performance to achieve cost-effective energy and demand reductions.
The DEI study consisted of two independent projects: a load research project and a pilot demonstration project. The DEI project deliverables included a final report, a set of engineering software tools and a DEI Application Guidebook to help utilities determine cost-effective system improvements to achieve greater DSE.
The overall results of the study conclusively show that operating a utility distribution system in the lower half of the American National Standards Institute (ANSI) C84.1 standard's acceptable voltage range (114 V to 120 V) saves energy, lowers demand and reduces reactive power requirements without negatively impacting customer service. By implementing both DSE and VO in concert, utilities can cost-effectively achieve a 1% to 3% reduction in kilowatt-hour energy, a 2% to 4% reduction in kilowatt demand and a 4% to 10% reduction in reactive power demand.
Computer models demonstrate that 10% to 20% of the energy savings can occur on the utility side of the meter. The cost per kilowatt-hour saved for DSE is higher than that for VO; however, the life-cycle cost for a 1.5% to 2.5% energy reduction is less than US$15/MWh (15 mills/kWh) saved when combining both DSE and VO into a single application improvement. Nearly 80% of the savings can be achieved for less than $8/MWh (8 mills/kWh) saved.
The results of the study show that cost-effective application of DSE makes greater VO savings possible. Depending on the combination of DSE and VO, the potential energy savings utilities can achieve can reach 3%. In the study, DSE improvement options ranged from none, to minor (phase balancing, reconfiguring and capacitor additions), to major (phase upgrades, regulator additions and reconductoring). The VO voltage-regulation options included either line-drop compensation or end-of-line voltage-feedback control.
The load research project was designed to quantify the savings for specific end-use load strata. The project team collected data from 395 residential homes at 11 utilities across the Pacific Northwest with the goal of achieving sufficient sample sizes for each stratum, such as temperature climate zones, heat sources and air conditioning. To vary the voltage level at the end-use service meter, a home voltage-regulator device manufactured by MicroPlanet was installed at the meter base.
The project specifications required having a 115.5 V controlled voltage one day and a normal utility voltage the next day, alternating with an on/off sequence for one year. The home voltage regulator was modified to meet the project requirements by adding an automatic timer and was tested for UL certification. MicroPlanet worked with the project team to provide training for installing the units.
Recording meters were installed at the service meter, and the project team recorded data, including average 15-minute watts, reactive volt-amperes and voltage. Landis+Gyr Altimus meters were used to gather this load research data. The meters automatically uploaded data to MeterSmart's website through a connection with the customers' phone lines. The project team, participating utilities and customers could access the meter data with secure login procedures throughout the project.
One of the unexpected results of the study was the response of reactive power demand to voltage. Study results show the ratio of the percent change in annual real energy (kilowatt-hours) to the percent change in average voltage varies considerably, with most data points falling below 1-to-1. The ratio of the percent change in reactive power demand to the percent change in average voltage exhibits similar variability, but almost all of those measurements are well above 1-to-1. Thus, VO has greater leverage on reactive power than an examination of the real power relationships alone would suggest. Historically, the electric utility industry has termed this ratio the conservation voltage regulation factor (CVRf).
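The CVRf described above is simply a ratio of percentage changes. A minimal sketch in Python, with all numeric inputs illustrative rather than taken from the study:

```python
# Conservation voltage regulation factor (CVRf): the ratio of the
# percent change in a quantity (real energy or reactive power) to the
# percent change in average voltage. Input values below are examples.

def cvr_factor(pct_change_quantity: float, pct_change_voltage: float) -> float:
    """CVRf = %change in kWh (or kvar) / %change in average voltage."""
    return pct_change_quantity / pct_change_voltage

# Real energy typically responds at less than 1-to-1 to voltage ...
energy_cvrf = cvr_factor(pct_change_quantity=1.8, pct_change_voltage=2.5)

# ... while reactive power demand responds well above 1-to-1.
kvar_cvrf = cvr_factor(pct_change_quantity=7.5, pct_change_voltage=2.5)

print(f"energy CVRf: {energy_cvrf:.2f}")  # 0.72
print(f"kvar CVRf: {kvar_cvrf:.2f}")      # 3.00
```

The same ratio applies per quantity, which is why the study reports separate CVRf values for energy, kilowatt demand and kilovar demand.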
The pilot demonstration project included six utilities, 10 substations and 31 distribution feeders. The project team identified distribution system enhancements to improve phase balancing, reactive power management, areas with voltage below ANSI C84.1 standards and feeder metering capability. Substation metering was installed on each phase of each feeder to record watts, volt-ampere reactive and voltage averages over 15-minute periods.
While the project team identified system improvements, many of the recommendations were not implemented because of time, work force or budget constraints. The voltage level at the substation level was lowered (optimized) for one day and set to higher normal settings the second day, alternating days on/off for a year. The automatic switching of the voltage control was accomplished in several ways:
- A stand-alone computer to control the load tap changer (LTC) controller
- Supervisory control and data acquisition (SCADA) to close a contact on the LTC controller that reduced the voltage
- SCADA to communicate directly with the LTC controller to write settings to it
- PCS UtiliData's AdaptiVolt closed-loop end-of-line voltage-feedback control system
The NEEA DEI project achieved greater than 2% energy savings in both the load research and the pilot demonstration projects. The total energy saved was 1.88 MW annually. The load research project achieved annualized savings of 345 kWh per home, and the pilot demonstration project saved 8,476 MWh across all 31 feeders during the year of data recording. The algorithms used to calculate the ΔE, ΔV and the CVRf are weighted averages and were normalized for temperature. The minimum covariance determinant and the L1 median were used to minimize the effects of outliers on the results; thus, a simple multiplication of the ΔV and the CVRf does not exactly equal the reported energy savings.
The results from the pilot demonstration project showed the CVRf for energy was approximately 0.7 for each 1.0% average voltage reduction, and the CVRf values for kilowatt demand and reactive power (kilovar) demand were about 1.2 times and 4.0 times the CVRf for energy, respectively. The pilot demonstration projects that achieved a higher CVRf had either implemented the recommended system improvements or were on feeders already operating efficiently, with high power factors, low voltage drops and balanced loads.
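A first-order savings estimate follows directly from the CVRf relationship. The feeder load figure below is hypothetical, and, as noted above, the study's weighted, temperature-normalized algorithms mean a simple multiplication does not exactly reproduce the reported savings:

```python
# First-order estimate of annual energy savings from voltage reduction
# using the CVRf relationship. The 50,000-MWh/yr feeder load is a
# hypothetical example, not a figure from the DEI report.

def estimated_savings_mwh(annual_load_mwh: float, cvrf: float,
                          pct_voltage_reduction: float) -> float:
    pct_energy_saved = cvrf * pct_voltage_reduction  # percent of annual load
    return annual_load_mwh * pct_energy_saved / 100.0

# CVRf of 0.7 and a 2.5% average voltage reduction, as in the pilot:
savings = estimated_savings_mwh(50_000, cvrf=0.7, pct_voltage_reduction=2.5)
print(f"{savings:.0f} MWh/yr")  # 875 MWh/yr
```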
The DEI results show that cost-effective energy savings can be achieved for less than $22/MWh over a 15-year life. These savings are believed to have a much longer life, which yields even lower life-cycle costs. These costs include annual maintenance costs. Seven of the nine pilot demonstration projects have a benefit-to-cost ratio greater than five at a cost per kilowatt-hour saved of less than 5 mills, using net present value over 15 years.
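The 15-year net-present-value comparison can be sketched as follows. The discount rate, avoided-cost value and project costs are assumptions chosen for illustration, not figures from the study:

```python
# 15-year net-present-value benefit-cost screen for a VO project.
# All inputs are assumed for illustration; none come from the DEI report.

def npv(annual_cashflow: float, rate: float, years: int) -> float:
    """Present value of a level annual cash flow (ordinary annuity)."""
    return sum(annual_cashflow / (1 + rate) ** t for t in range(1, years + 1))

years = 15
rate = 0.05                  # assumed discount rate
mwh_saved_per_year = 1_000   # assumed annual energy savings
value_per_mwh = 60.0         # assumed avoided-cost value, $/MWh
annual_om = 2_000.0          # assumed annual maintenance cost, $
capital = 80_000.0           # assumed up-front improvement cost, $

benefits = npv(mwh_saved_per_year * value_per_mwh, rate, years)
costs = capital + npv(annual_om, rate, years)

bc_ratio = benefits / costs
cost_per_mwh = costs / (mwh_saved_per_year * years)  # $/MWh = mills/kWh
print(f"benefit-cost ratio: {bc_ratio:.2f}, cost: {cost_per_mwh:.1f} $/MWh")
```

With these assumed inputs, the screen lands in the same neighborhood as the pilot results: a benefit-to-cost ratio above five and a cost per MWh saved well under $22.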
Designing More Efficient Distribution Systems
The DEI study was designed to yield verifiable energy savings without impacting customer service. It shows that with lower voltage drop designs, the backup reliability of the system increases. For most of the electric utilities involved in the project, available distribution system characteristics were limited (i.e., historical feeder load data, system primary line modeling, maximum primary and secondary voltage drops, distribution transformer and secondary conductor characteristics, detailed customer load data, and so forth).
As a result, design assumptions were made with conservative to moderate risk to the customer, yielding less voltage reduction than would otherwise have been possible.
For example, the study assumed a maximum voltage drop from the primary conductor to the customer's meter of 4 V (on a 120-V base) for both peak and off-peak load periods. This assumption was generally adequate, but for specific customer connections it was at times too low, requiring the voltage levels to be raised somewhat. In addition, the assumption does not take advantage of the reduced voltage drop during off-peak load periods.
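The voltage-budget arithmetic behind that assumption is straightforward: the scheduled primary voltage must cover the assumed drop while keeping every service meter above the ANSI lower limit. A sketch, where the 114-V limit and 4-V drop come from the text and the off-peak figure is an example:

```python
# Voltage budget on a 120-V base: the primary voltage cannot be
# scheduled below the ANSI C84.1 service lower limit plus the assumed
# primary-to-meter drop. The 2.0-V off-peak drop is an example value.

ANSI_SERVICE_LOWER = 114.0  # V, C84.1 lower service limit (120-V base)

def min_primary_voltage(assumed_drop_v: float) -> float:
    """Lowest schedulable primary voltage keeping meters >= 114 V."""
    return ANSI_SERVICE_LOWER + assumed_drop_v

print(min_primary_voltage(4.0))  # 118.0  (study's assumed drop)
print(min_primary_voltage(2.0))  # 116.0  (example reduced off-peak drop)
```

This is why a verified, smaller off-peak drop would permit a lower setpoint, and therefore additional savings, during off-peak hours.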
The study shows that electric utilities need to do a better job of modeling (primary and secondary systems), metering (substations and customers) and performing engineering assessments of their distribution systems. By doing so, electric distribution systems can be designed to further reduce the average voltage by an additional 1% to 2% above those reported in the DEI study. With expanded geographic information systems and advanced substation and customer metering information systems, a highly cost-effective efficient distribution system can be achieved with a greater degree of confidence.
In general, distribution systems are deemed efficient if the power factor is maintained at or near unity, the conductor loading is less than half of its maximum load capability during peak loading periods, and the combined primary and secondary maximum voltage drops are less than 6 V (on a 120-V base). With the integration of newer engineering technologies, the distribution system planners will have the ability to determine the voltage drops at all points in the distribution system and to cost effectively design the system for higher efficiency.
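The three rule-of-thumb criteria above can be expressed as a simple screen. The 0.98 power-factor threshold is our assumed reading of "at or near unity"; a real assessment would be utility-specific:

```python
# Rule-of-thumb distribution efficiency screen from the criteria in the
# text. The 0.98 power-factor threshold is an assumed interpretation of
# "at or near unity"; the 0.5 loading fraction and 6-V combined drop
# limits are as stated.

def is_efficient_feeder(power_factor: float,
                        peak_loading_fraction: float,
                        total_voltage_drop_120v: float) -> bool:
    return (power_factor >= 0.98
            and peak_loading_fraction <= 0.5
            and total_voltage_drop_120v < 6.0)

print(is_efficient_feeder(0.99, 0.45, 4.5))  # True
print(is_efficient_feeder(0.95, 0.45, 4.5))  # False (low power factor)
```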
With voltage data available throughout the distribution system, areas that have the lowest voltage levels may be easily identified and improved, thereby allowing the entire feeder/substation voltage to be optimally lowered further. The exact amount of DSE savings available will vary by utility to the extent system characteristics are known, as well as to the extent of available engineering analysis tools and modeling, and existing system capacity limitations.
| Type of improvement and cost per substation | Energy savings (%) | Kilowatt demand reduction (%) | Kilovar demand reduction (%) | Cost in mills ($0.001/kWh) |
| --- | --- | --- | --- | --- |
| Voltage regulation LDC*, US$15,000 to $25,000 | 0.5 to 1.0 | 1.0 to 2.0 | 3.0 to 5.0 | 0.1 to 2.0 |
| Voltage regulation LDC with minor system improvements, $40,000 to $60,000 | 1.0 to 2.0 | 2.0 to 2.5 | 3.0 to 7.0 | 2.0 to 8.0 |
| Voltage regulation LDC with major system improvements, $80,000 to $100,000 | 1.5 to 2.5 | 2.5 to 3.0 | 5.0 to 10.0 | 10.0 to 15.0 |
| Voltage regulation EOL with major system improvements, $100,000 to $350,000** | 2.0 to 3.0 | 3.0 to 3.5 | 10.0 to 20.0 | to 50.0 |

The costs and benefits are for performing system improvements at the substation and feeder level, which are representative of the pilot demonstration projects.

\* LDC is the line-drop compensation method; EOL is the end-of-line voltage-feedback method.

\** Applies globally to the Northwest region of the United States. By matching specific substations and voltage-regulating methods, the cost could be reduced to 5 mills to 30 mills, but the total energy savings also would be reduced.
| Project | Voltage reduction (ΔV) | CVRf* (%ΔE/%ΔV) | Project energy savings (MWh)** | Percent energy savings |
| --- | --- | --- | --- | --- |
| Load research | 5.2 V (4.3%) | 0.569 | 87 | 2.15% |
| Pilot demonstration*** | 3.03 V (2.5%) | 0.69 | 8,476 | 2.07% |

\* The CVRf is based on random selection of residential sites, weighted to represent expected results for the Pacific Northwest region of the United States.

\** Actual energy savings for the DEI project (not annualized).

\*** Values are shown for the purpose of calculating the savings for the project and do not represent expected values for the Pacific Northwest region.
| Substation | Benefit-cost ratio | Cost per MWh ($0.001/kWh) |
| --- | --- | --- |
| Francis and Cedar | 1.67 | 21.79 |

Costs and benefits are a 15-year net present value.
K.C. Fagen (email@example.com) is a senior project manager for R.W. Beck, a SAIC Co., with 19 years of experience in distribution system planning, design and control systems. He was the project manager for the DEI project and is supporting EPRI on similar projects. In addition, Fagen is helping the Bonneville Power Administration expand its distribution efficiency program including voltage optimization. Fagen is a registered professional engineer.
Robert H. Fletcher (firstname.lastname@example.org) is the general manager of Utility Planning Solutions PLLC, based in Everett, Washington, U.S. He has more than 40 years of electric utility industry experience as an electrical engineer and obtained BSEE and MSEE degrees and a Ph.D. in electrical engineering from the University of Washington. He performs transmission and distribution system long-range planning for Northwest electric utilities, including small-area load forecasting and electric system capital planning. Fletcher is a registered professional engineer.
Editor's note: This article is the first in a four-part series on distribution efficiency and voltage optimization. The second article will be published in the May 2010 issue of T&D World.
Companies mentioned in the article:
This project was conducted by the Northwest Energy Efficiency Alliance (NEEA) and completed in 2007. Funding was by Northwest Utilities, the Energy Trust of Oregon and the Bonneville Power Administration. NEEA is a private, non-profit organization addressing energy efficiency in homes, businesses and industry.
The Distribution Efficiency Initiative project involved R.W. Beck (a SAIC Co.), RLW Analytics, Auriga Inc. and MeterSmart. The following 13 Pacific Northwest electric utilities also participated in this project:
| Utility | Participation |
| --- | --- |
| Clark Public Utilities | Pilot |
| Douglas PUD | Load research and pilot |
| Eugene Water & Electric Board | Load research |
| Franklin PUD | Load research |
| Hood River | Load research |
| Idaho Falls Power | Load research |
| Idaho Power | Load research and pilot |
| Portland General Electric | Load research |
| Puget Sound Energy | Load research and pilot |
| Skamania PUD | Load research |
| Snohomish PUD | Load research and pilot |
The DEI final report and engineering tools are available at www.rwbeck.com/neea.
Auriga Inc. www.aurigacorp.com
Bonneville Power Administration www.bpa.gov
Energy Trust of Oregon www.energytrust.org
Northwest Energy Efficiency Alliance www.nwalliance.org
Northwest Utilities www.northwestutilities.com
PCS UtiliData www.PCSUtiliData.com
RLW Analytics www.kema.com
R.W. Beck, an SAIC Co. www.rwbeck.com