Following the August 2003 North American blackout, independent system operators, qualified scheduling entities, and transmission and distribution service providers have an increased focus on their operators' situation awareness and the usability of their tools.
In accordance with the North American Electric Reliability Corp. guidelines on situation awareness, and as part of its transition to a nodal market, the Electric Reliability Council of Texas (ERCOT) undertook a focused effort to evaluate and improve operator systems in its control room from a human factors perspective.
Control room operators, business owners, operations trainers, power systems engineers and human factors professionals collaborated on this process for several years. The control room operators' continuous training shifts provided an excellent opportunity for ongoing feedback on needs assessments, prototyping and refinement, allowing rapid iterative cycles.
Several of the prototypes developed as part of this effort turned into production-grade applications, such as the Macomber Map and Data Integrated Viewer. During this process, several issues — such as the need for operator input, information as opposed to data and ease of navigability — became clear and were consistent with findings from other industries. By addressing these issues, work at ERCOT proceeded far more quickly and efficiently than it would have through traditional, more isolated development and production efforts. There also are reports of increased end-user satisfaction.
Often, software and system work is done by development groups, either within an organization or from a vendor, and the end users do not have an opportunity to see or interact with the software or system before its deployment. This can lead to user frustration because of the mismatch between the way the operator needs the system to work and the use case that formed the basis of development. That can result in the operators having reduced trust in the system, forcing them to work at a higher stress level.
By taking advantage of the training cycle, human factors professionals were able to work shoulder to shoulder with ERCOT operators to identify the use cases, receive feedback early and often on the design, answer questions and help resolve trust issues.
Need Information, Not Data
Vendors and other internal development groups often build systems by starting with the core architecture and working outward. When it comes to user interfaces, many of these displays are built by the same individuals who wrote the core systems and who, therefore, are focused on displaying a large amount of data. For them, like their operational/support engineering counterparts, these displays provide valuable diagnostic and debugging data, and are crucial for ensuring a system's continuous maintenance.
However, control room operators need displays with very specific data: data that provides situation awareness, a high-level view of the state of the system, and data that helps them identify and resolve reliability problems. This is very different from diagnosing system problems. Presenting operators who face real-time pressure and decision-making responsibility with too much data, especially low-level and non-task-relevant data, creates additional stress and forces them to waste valuable time sifting through it for understanding. The more they must sift through data, the more they must rely on memory for the different pieces of information they need. Couple that with stress, and the operators are at greater risk of making errors.
At ERCOT, working with the operators to understand their situation awareness needs and the data and methods they use to resolve problems allowed the development of a system screen design that resulted in increased accuracy and performance.
Architecture Influences Interface
For many products, especially those with a legacy of code dating back a decade or more, the architecture of a system becomes a core mental model for its software developers. Problems arise and developers devise solutions commensurate with the architecture. When it comes time for user-interface development, this same architecture-based thinking can be a hindrance. Most end users do not know their software's core architecture and, therefore, have trouble navigating software organized around it.
Instead, a human factors approach encourages thinking about the users' needs, goals and likely paths. If, for example, control room operators are looking at transmission lines, a system that can visualize telemetry and estimated flows together (traditionally held in two different subsystems) provides more useful information than forcing the operators to look at two separate screens. As an example, if the telemetered and state-estimated flows on a line differ by 500 MW, that is an indication of a potential problem in one or both systems.
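A minimal sketch of that kind of cross-check follows; the field names and the 500-MW tolerance are illustrative assumptions, not ERCOT's actual implementation.

```python
# Illustrative sketch: flag lines where telemetered and state-estimated
# MW flows disagree beyond a tolerance, suggesting a problem in one or
# both subsystems. Field names and threshold are assumptions.

TOLERANCE_MW = 500  # discrepancy threshold from the example above

def flag_discrepancies(lines):
    """lines: iterable of dicts with 'name', 'telemetered_mw', 'estimated_mw'."""
    flagged = []
    for line in lines:
        delta = abs(line["telemetered_mw"] - line["estimated_mw"])
        if delta >= TOLERANCE_MW:
            flagged.append((line["name"], delta))
    return flagged
```

A display built on such a comparison can highlight only the lines that warrant attention, rather than presenting two raw value columns side by side.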
Similarly, this thinking applies at the level of the variables retrieved. One key question for control room operators concerns line-loading status, which can easily be computed, but core systems are likely to store the data at a lower level. For example, a line can have telemetered actual megawatt and megavolt-ampere-reactive loadings, as well as ratings for that line. A quick computation that compares these quantities can yield more useful information (for example, the line has surpassed its 2-hour rating, is now at 98% of its 15-minute rating and has been above the 2-hour rating for 5 minutes).
The core data is present within the system, but some simple manipulations save the operators an additional step, which in turn gives them additional cognitive processing time for addressing other issues.
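As a sketch of that derived-value computation, the function below combines raw loadings into the quantities an operator actually needs; the rating names and values are illustrative, since real rating schemes vary by operator.

```python
import math

def loading_summary(mw, mvar, rating_2h_mva, rating_15m_mva):
    """Combine raw loadings into the derived quantities an operator needs.

    Apparent power (MVA) is sqrt(MW^2 + MVAr^2); comparing it against the
    thermal ratings yields the loading status directly.
    """
    mva = math.hypot(mw, mvar)  # apparent power in MVA
    return {
        "mva": mva,
        "pct_of_15m_rating": 100.0 * mva / rating_15m_mva,
        "above_2h_rating": mva > rating_2h_mva,
    }

# e.g. loading_summary(300, 400, 480, 510) reports 500 MVA, about 98%
# of the 15-minute rating, and that the 2-hour rating has been exceeded.
```

Presenting these derived values directly, rather than the four raw numbers, is exactly the kind of simple manipulation the text describes.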
Operators Are Not Computers
Control room operators are presented with a deluge of information, which they try to process as best they can to ensure the systems are functioning properly. However, people are subject to several limitations that should be taken into account when evaluating a system's effectiveness:
- Stress levels
Humans tend to function and make the best decisions at an intermediate level of stress. When a person's stress level is too low, attention suffers, and it becomes difficult to encode and store information in memory. When the stress level is too high, people are easily distracted and tend to make decisions quickly without evaluating the breadth of available information. Providing the information needed to formulate a correct and current understanding of the state of the system, with easy navigation to related details that support decision making, helps operators respond more easily in high-stress situations.
- Color perception
Many user interfaces encode information using color alone (for example, red = bad, green = good). However, ERCOT's team has recommended encoding the same information redundantly (for example, with a shape, such as a stop sign or a circle) so that an operator with color blindness is not at a disadvantage. Furthermore, there is evidence that those who use tobacco products may have decreased color sensitivity, leading to potential confusion when color is the only means of encoding.
- Memory limitations
Humans face two memory challenges. For short-term memory, keeping track of system values as they change over minutes to hours can be difficult, especially with the high level of activity in the control room. Providing trending data for key indicators lets operators see patterns in the system rather than being expected to remember them. Humans also can struggle to retrieve information from memory, such as the exact name of a line or station (especially when names are cryptic). A strong search engine that allows progressive as well as alternate search strategies can help support operators.
- Information overload
Humans can only process a limited amount of information at a time, so it is important to help operators focus on what is important and relevant. For example, alarms are intended to draw an operator's attention to a critical situation, but sometimes the sheer volume of alarms is overwhelming. Alarms need to be organized so the operator can first recognize what is important and then understand the problem that caused each alarm. In ERCOT's system, alarms are identified by priority, but they also can be grouped by region, topology and element. This gives the operator greater flexibility in understanding alarms and, when relevant, the relationships between them.
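The flexible grouping described above can be sketched as follows; the alarm record fields (priority, region, element) are hypothetical names assumed for illustration.

```python
from collections import defaultdict

def group_alarms(alarms, key="region"):
    """Group alarms by a chosen attribute ('region', 'topology', 'element'),
    keeping higher-priority alarms (lower number) first within each group."""
    grouped = defaultdict(list)
    for alarm in sorted(alarms, key=lambda a: a["priority"]):
        grouped[alarm[key]].append(alarm)
    return dict(grouped)
```

Because the same records can be regrouped on demand, one alarm stream yields several complementary views, each suited to a different operator question.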
In order to address the issues discussed here, ERCOT's user-interface design goals were as follows:
- Keeping the big-picture information always available (situation awareness)
- Keeping the operator's attention focused on critical information
- Designing and organizing user-interface layouts around operator goals
- Supporting the operator by integrating data from different sources and presenting calculated values
- Providing trending information to reduce operator memory loads
- Allowing an operator to control the amount of information detail displayed to him or her
- Providing flexible, dynamic searching tools.
These design goals were incorporated in prototypes that later became production-grade systems, such as the Macomber Map, which allows operators to see detailed information against physical components on the Texas map. Named in memory of the late ERCOT Enterprise Architect Gary Macomber, the map visualizes data that, historically, had been viewed in spreadsheet form.
Macomber Map integrates information across systems so operators are better able to see the high-level status of systems and regions while retaining the ability to drill down into the data they need on demand. ERCOT uses the map for real-time operations, training and planning, and it serves as the interface for showing ERCOT's overall status to the North American Electric Reliability Corp., the Federal Energy Regulatory Commission and the Texas Reliability Entity Inc. Macomber Map was credited as a core tool that helped operators maintain grid stability during record-setting wind levels in early 2010.
This work would never have been possible without the confidence placed in us by Gary Macomber, John Adams, Bill Blevins, Murali Boddeti, Trip Doggett, Colleen Frosch, Theresa Gage, Jimmy Hartmann, Cagle Lowe, Joel Mickey, Juliana Morehead, Dottie Roark, Kent Saathoff, and ERCOT's managers, directors, and EMMS production support and development groups. Thanks to Martha Collins, Martha Siebold and Kate Horne for their support as well.
Michael E. Legatt (email@example.com) is the principal human factors engineer for the Electric Reliability Council of Texas (ERCOT), which manages the flow of electricity to 22 million Texas customers. He holds a Ph.D. in clinical health psychology/neuropsychology from the Ferkauf Graduate School of Psychology/Albert Einstein College of Medicine. As an amateur (ham) radio operator, he received a commendation for helping to provide emergency communications during the 2003 blackout in the northeastern United States, which sparked his interest in the psychology of energy management. He works to build systems designed to provide operators with needed information, optimizing for perception, speed, comprehension and stress management. Legatt is currently pursuing a graduate degree in energy systems engineering at the University of Texas at Austin.
Marianne Clark (firstname.lastname@example.org) is a human factors engineer at Scientific Research Corp. Clark holds a Ph.D. in cognition, learning and instruction from the University of Texas at Austin. She has 20 years' experience applying cognitive psychology and human factors to the design of systems in areas such as cognitive workload, situation awareness, human-computer interfaces and training systems.
Electric Reliability Council of Texas Inc. www.ercot.com
Federal Energy Regulatory Commission www.ferc.gov
North American Electric Reliability Corp. www.nerc.com
Texas Reliability Entity Inc. www.texasre.org