
Out of sight, out of mind

The date is April 20, 2010. Seven BP executives board a helicopter and fly 40 miles off the coast of Louisiana to visit the Deepwater Horizon oil rig. The reason for their visit? They’re presenting a special safety award to the crew. Two hours after the presentation, two explosions rip through the rig. Eleven people are killed, and 4 million barrels of oil spill into the Gulf of Mexico.

How does such a thing happen?

To find out, the National Academy of Engineering and the National Research Council convened a committee in August 2010. Dave Hofmann, Hugh L. McColl Distinguished Professor of Organizational Behavior and an expert on safety culture, was asked to serve on it, along with geophysicists, petroleum engineers, marine systems experts, risk analysts and others.

Hofmann is good at figuring out how much a company values safety. His work includes talking to CEOs about their safety standards, then interviewing and surveying employees to see if what upper management preaches is actually being practiced.

“Every organization operating in a high-risk environment is going to say safety is their number one concern,” he says. “But the guys on the oil rig are 50 miles offshore with no management oversight, making real-time decisions with safety implications.”

Hofmann attended the Coast Guard hearings, where crew members from Deepwater Horizon gave eyewitness accounts of what they experienced that day: explosions, fires, loss of lighting, black smoke, and toxic gas. “That makes the reality of the accident sink in,” Hofmann says.

The committee then visited Deepwater Horizon’s sister rig, the Deepwater Nautilus. The design and layout of the two rigs were the same. As he stood on deck, picturing where the explosion would have happened, Hofmann thought of the testimonies he had heard a few days earlier. “There were accounts of people jumping off the rig to escape the fire,” he says. It’s a 60-foot drop. “When you’re standing on it and peering over the edge, you realize that’s not a trivial decision.”

Hofmann was asked to review the organizational structure of the rig. According to him, it breaks down like this: BP leases a rig from a company called Transocean. When the rig is moving, the rig captain (a Transocean employee) is in charge. When the rig is stationary and the drilling gets under way, the BP company man is in charge. But there are also subcontractors who conduct different pieces of the drilling work. When drilling starts to go wrong, there can be uncertainty about who has the authority to make emergency decisions.

“I look at who ultimately has the decision-making power in this kind of crisis situation,” Hofmann says. “Do crew members know who to turn to?”

On Deepwater Horizon, they didn’t. The committee’s report states that “confusion existed about decision authority and command and may have impaired timely disconnect.” Hofmann and his colleagues wrote recommendations based on this finding, including a definition of command at sea, which should “be absolutely unambiguous, and should not change during emergencies.”

Hofmann explains that there are two approaches when it comes to safety: personnel and operational.

Personnel safety includes measures such as wearing safety goggles and hard hats, using tools appropriately, and labeling unsafe areas. Operational safety focuses on proper decision-making, such as checking the drilling output and ensuring that equipment is functioning properly.

Personnel safety on Deepwater Horizon was so good that the crew was receiving an award for it. But the crew wasn’t applying those same standards to operational safety. The casing in their well should have been secured with 21 centralizers; they used only six. That was just one of many operational safety measures that were overlooked.

Why were they so diligent about safety precautions in one area and negligent in another?

“The physical risks on the deck of an oil rig are so in-your-face, working with huge pipes and heavy equipment,” Hofmann says. That constant physical risk forces crew members to stay on the ball: dropping a large pipe, for example, could result in a shattered foot that would have to be amputated.

Operational risks, on the other hand, are more abstract, involving measurements, numbers, and codes. Several areas of psychology suggest that it may be easier to take an operational risk because such risks are out of sight, out of mind.

“We think concretely about things happening right in front of us, right now,” says Hofmann, who holds a PhD in industrial and organizational psychology. “But we think more conceptually about things happening in the future.”

That’s exactly how a BP operations engineer was thinking when he finished the cement seal job using six centralizers instead of the standard 21. According to court documents, a drilling engineer warned him of a “severe risk” of natural gas leaks with only six centralizers in place, but the BP engineer responded in an email saying, “Who cares, it’s done, end of story, will probably be fine and we’ll get a good cement job.”

Hofmann says his experience serving on the committee is informing his own research. He’s curious to know “what it is about how people think—that they can do one part of the job well and not the other part. How is that distinction made?”

 

Written by Mary Lide Parker for Endeavors

7.8.2013