Risk management

Better Decisions: A Two-Way Street

Engineers in the oil and gas industry make tough decisions on a wide variety of issues, including risk and safety, design and other tradeoffs, and operational assessments. They have to make sense of complex situations, often in the face of limited and even inconsistent data. They must not only plan but also replan, modifying a plan in action even though any change they introduce may have uncertain consequences.

To improve the quality of their decisions, engineers have to do two things. The first is to reduce mistakes, which are often the visible signs of poor decision making. Because mistakes can be costly, they may trigger investigations and carry severe consequences. As a result, engineers face pressure to adopt strategies that cut down on errors, if not eliminate them. They are encouraged, if not required, to follow procedures, use checklists, document assumptions, and identify areas of uncertainty. They are directed to use analytical tools instead of intuition, because intuition can lead to errors. Engineers are not alone in facing pressure to reduce mistakes; other industries, such as aviation and health care, have traveled the same road.

The second is to increase expertise and insights. Engineers need to be smarter: to pick up on weak signals more quickly, detect patterns, and spot implications.

However, neither making fewer mistakes nor having more expertise and insights is sufficient on its own. Following rules and procedures might work in well-ordered situations, but in complex settings, such as those that oil and gas engineers often face, experience is needed to know which rules to follow, how to adapt them, when the conditions for triggering a rule are met, and so forth. Although powerful analytical tools have immense benefit, they have to be configured, and someone with experience needs to judge whether the results are plausible. A checklist of risks helps in evaluating a course of action, but in an operational setting, expertise and insights are needed to anticipate how a combination of events might create a hazard that is not on the list.

Unfortunately, an overemphasis on reducing mistakes can get in the way of building expertise and forming insights. When we try to replace expertise with ever more complex sets of rules and procedures, we often wind up diminishing the chances for gaining expertise. When we require people to document assumptions and follow analytical methods, we often wind up interfering with insights. This is not to say that we should eliminate procedures, analytical methods, or attempts to reduce errors. Rather, I am arguing for a balance. We need to do both: reduce errors, and increase expertise and insights. Preoccupation with the first, the steps to cut down on errors, can become counterproductive.

Consider an example in which the data were clear-cut. An operating company reported an incident during a waterflood system startup. An outside operator alerted the control room operator (CRO) by radio that “it sounds like the booster pump is pumping marbles.” The CRO knew immediately that the pump was cavitating. Cavitation is caused by gas flashing at the suction of the pump. The most likely cause of cavitation in many systems is low liquid level in the suction vessel upstream of the pump.

The CRO quickly checked the level in the vessel. There were two level transmitters, and both indicated normal liquid level, although their readings differed. The CRO instructed the outside operator to manually check the liquid level. The outside operator, who had anticipated the request, was already reading the level gauge. It showed normal liquid level.

Although all the data lined up to indicate normal liquid level in the suction vessel upstream of the pump, the CRO concluded that the data could not be right and took steps to rapidly increase the liquid level, which stopped the cavitation. Soon after this action, the outside operator found that a closed valve had isolated one of the level transmitters and the level gauge. He opened the valve and confirmed a low, but rapidly increasing, liquid level. Later, it was determined that the second transmitter had failed.

In this situation, the objective data were wrong. Although the design provided for measurement redundancy with two liquid level transmitters, neither device gave an accurate reading.

The operators’ responses could not have been easily codified in procedures. It was not procedures that guided the operators in this case. It was expertise. Both immediately knew the pump was cavitating. Both “knew” that the liquid level in the suction vessel was low, despite instrument readings to the contrary. Both believed that the pump could be allowed to cavitate for a short period of time without significant damage while they corrected the problem.

This incident shows that making a good decision is not about following rules and guidelines. The rules, guidelines, and procedures are valuable in many ways. However, expertise cannot be bottled or captured as a set of rules. This is especially true in complex settings, where many things are occurring at the same time.

When organizations try to create comprehensive procedures in complex settings, they typically run into trouble. The related processes are never static. New equipment is being introduced, some components are undergoing maintenance, and other items are showing problems that may be transient signals or early signs of a potential breakdown. Equipment may fail without warning, or a closed valve may result in an incorrect reading, as happened in the pump incident. The operators have to take all factors into account.

In addition, the procedure guides are continually being updated (both formally and informally) as workers discover flaws and limitations. As a result, workers usually are unsure which version of the procedure guide is current, whether colleagues are using the same version, or whether a procedure has been revised. Different operators will deviate from the standard procedures in various ways, depending on their own expertise and experience.

It is easy to downplay the kinds of expertise needed to run a complex operation or to manage an emergency such as the cavitating pump, because these aspects of expertise are based on tacit knowledge—knowledge that is hard to put into words because it depends on subtle cues, on a sense of typicality that warns us of anomalies, and on experts’ beliefs about the interplay of causes to explain important effects.

This leads to another area of confusion—the nature of the decision-making process itself. Most of us have been taught that the best way to make decisions is to systematically identify all the potential options, and to evaluate each one using a common set of evaluation criteria to see which option comes out on top.

However, this strategy has several problems. First, it is not how people actually make decisions. Second, there is no evidence that directing people to follow the strategy will improve their performance. Third, the strategy ignores expertise. It ignores the tacit knowledge that enables people such as the CRO in the pump emergency example to rapidly size up situations and make good decisions—and that may explain the first two problems. Because the systematic strategy ignores expertise, people may not be using it, and when directed to use it, their performance does not improve (and may even suffer).

In the past 25 years, researchers have discovered how people make decisions, particularly in complex and uncertain situations. People are able to use their experience to identify the dynamics of a situation through a pattern-matching activity, matching the situation they face to the patterns they have stored in memory over decades of experience.

The pattern-matching provides an option that is likely to get the job done. To evaluate that option, people use a process of mental simulation. They imagine what would happen if they put the option into action. If they do not find any problems, they are ready to act. If they spot small problems, they adjust the option. If they see insurmountable problems, they jettison the option and look at the next one in their repertoire.

That is what the CRO did. His pattern repertoire immediately told him “cavitating pump,” which calls for increasing the liquid level. Even though the transmitters showed normal liquid level, the CRO “knew” better. He did not go through a ritual of generating alternative actions and comparing them with each other.

In some situations, it does make sense to compare different hypotheses about what might be happening. This was not one of those situations. By taking advantage of experience, people are able to escape the paralysis-by-analysis trap. They are ready to handle complex challenges.

Decision performance would improve if the oil and gas industry had greater respect for expertise and for the tacit knowledge of skilled workers and supervisors. More attention and effort should be devoted to developing expertise and supporting insights, rather than to reducing decision making to rules or checklists, or to trying to eliminate it with analytical methods.

Effective decisions rely on expertise as well as deliberate analysis, on patterns for quickly recognizing the problem as well as careful scrutiny for detecting errors. By balancing these forces, the reduction of errors on the one hand and the growth of expertise and insights on the other, organizations should be able to improve performance significantly.


Gary Klein, PhD, is a cognitive psychologist who helped found the field of naturalistic decision making.

His latest book is Streetlights and Shadows: Searching for the Keys to Adaptive Decision Making.