Signs of danger
21 August 2018

Oil & Gas UK held its Safety 30 conference in June to mark three decades since the Piper Alpha disaster. Lord Cullen, who conducted the inquiry into the tragedy, used the conference to illustrate how ‘signs of danger’ need to be recognised and acted upon by everyone, from the boardroom to the workplace, to prevent accidents or limit their extent. Adam Offord reports

At around 10pm on 6 July 1988, Piper Alpha (main image), a North Sea oil production platform located approximately 120 miles north-east of Aberdeen, exploded, creating oil and gas fires. The disaster killed 167 people and left many more horrifically injured, while the total insured loss was around £1.7 billion.

An inquiry into the disaster, by Lord Cullen, revealed a catalogue of management failings, as well as construction, engineering and operational inadequacies, reshaping the industry’s approach to major hazard management.

Now, 30 years on, the ‘Safety 30’ conference, held on 5-6 June and organised by the Oil & Gas UK trade body in association with the International Regulators’ Forum, focused on how the legacy of Piper Alpha has shaped current operations, and on how the oil and gas industry continues to create an even safer future.

Lord Cullen used his keynote speech to remind delegates of the need to build on safety lessons drawn from the investigations of Piper and other major accidents, both onshore and offshore.

“When I read reports about major accidents, I am struck by how frequently they have been preceded by signs indicating danger,” he said. “But those signs were not recognised or effectively acted on to prevent the accidents in question or, at any rate, to limit their extent. Signs of danger may take a variety of forms: for example, there may have been a previous accident or incident, or perhaps a report pointing out danger, or signs of danger in the workplace or encountered in the course of work. To illustrate that, I’m going to talk about a number of accidents. I will use them to illustrate failures to heed the signs of danger, ranging from failures in the boardroom to failures at the workplace.”

Previous accidents and incidents

Lord Cullen told delegates that, first, it’s important to find out not only what precipitated an event, but also the factors underlying it. “It would be perilous to ignore the latter, since they give rise to future accidents or exacerbate the consequences,” he said.

Columbia space shuttle disaster, 2003: Cullen explained how the Columbia Accident Investigation Board said, among other things: ‘When causal chains are limited to technical flaws and individual failures, the ensuing responses aimed at preventing a similar incident in the future are equally limited. They aim to fix the problem and replace/retrain the individual responsible. Such corrections lead to the misguided and potentially disastrous belief that the underlying problem has been solved’.

He added that “it’s not much good” having an investigation, if it doesn’t lead to a lasting improvement in safety. “In other words, in results being embedded in the assessment and control of risk, and reflected in the way in which work is tackled and done.”

Piper Alpha: “On the night of the disaster, operatives tried to start up a spare pump for re-injecting condensate into an export oil line,” said Cullen. “Unknown to them, a relief valve had been removed from the pump’s delivery pipe and in its place was a blind flange that was not leak-tight. Condensate escaped and readily found a source of ignition. The explosion blew down a wall, next to where the crude oil was extracted, and there was a large oil fire. That was the start of a chain of events that led to the disaster. Now, it’s important to ask why the operatives didn’t know the position. That was due to a lack of communication of information at shift handover, and deficiencies in the permit to work procedure.”

Conference delegates heard how, nine months before the disaster, a rigger had been killed. That accident was due to the night shift “improvising in the course of a lifting job” without an additional permit to work, and to a lack of information from the day shift.

“A board of inquiry investigated the accident, but this report did not examine the adequacy or quality of the handover between maintenance lead hands,” Cullen added. “Some actions were taken, but they had no lasting effect on practice. Management failed to recognise the shortcomings in the permit system and the handover practice.

“Whether by direction or inaction, they failed to use the circumstances of particular incidents to drive home the lessons to those that were responsible for safety on a day-to-day basis. There were no laid-down procedures for handovers and little, if any, monitoring of them.

“As for the permit to work system, it was, in my words: ‘Being operated routinely in a casual and unsafe manner’.”

Kings Cross Station fire, 1987: The disastrous fire was triggered when a passenger dropped a lighted match through a wooden escalator. But a contributing factor was the “failure of London Underground to carry through proposals that had followed earlier fires”, such as improvements in the cleaning of the running tracks and the replacement of wood with metal, Cullen told the room.

“It appears that no one owned the recommendations and accepted responsibility for seeing they were properly considered at the appropriate level,” Cullen explained, adding that the chairman of the investigation said:

‘There was no incentive for those conducting them to pursue their findings or recommendations or for others to translate them into action’.

“So, these recommendations got lost. The directors of London Underground subscribed to the received wisdom that underground fires were an occupational hazard.”

Reports about danger

Signs of danger may also appear in safety reports, delegates at Safety 30 were told.

Piper Alpha: Cullen explained: “The initial explosion and fire were followed within an hour and a half by the rupture of a number of gas risers, which had been routed through the platform... This led to the loss of normal means of evacuation, the destruction of the platform and huge loss of life. There were no subsea isolation valves.”

He observed how management could have been in “no doubt of the grave consequences to the platform and its personnel”, in the event of a prolonged gas fire.

“A report from consultants in 1986 advised that such a fire would be almost impossible to fight and the gas pipelines would take hours to depressurise. Earlier in 1988, a report by the facility’s engineering manager advised that, if a fire fed by a large hydrocarbon inventory were to break out, structural integrity could be lost in 10-15 minutes. A report in 1987 pointed to an even greater hazard to personnel and the plans for platform abandonment. In the event, those reports predicted what actually happened on the night of the disaster,” Cullen explained.

“Management had not accepted that this would happen. They had rejected the installation of subsea isolation valves and fire-proofing of structural members as being impractical. They had relied on emergency shutdown valves and a limited deluge system, both of which were wrecked by the initial explosion. They had not carried out a systematic identification and assessment of the potential hazards or put in place adequate measures for controlling them, but had relied on qualitative opinion. It showed, in my view, a dangerously superficial approach.”

Workforce concerns

Management should also be alert to safety concerns raised by members of the workforce and should be inquisitive about their experiences – there should be a reporting culture.

Herald of Free Enterprise, 1987: The roll-on, roll-off ferry capsized in 1987 after sailing with its bow doors, through which vehicles entered, left open. Delegates at the conference were told how, after other ships had previously sailed with their doors open, captains had asked for indicator lights.

“The head office managers had rejected that request and treated them with derision,” Cullen said. “The directors had failed to take action on an official recommendation that a person onshore should be given designated responsibility for the technical and safety aspects of the operation of vessels.

“The attitudes described in the words of the inquiry revealed a staggering complacency. After the sinking, indicator lights were fitted to the remaining ships – that took just a few days.”

Danger encountered in the workplace

“I now turn to cases in which the signs of danger were in the workplace or encountered during the course of work,” Cullen said. “However, due to attitudes to them, they were not recognised or acted on.”

Buncefield oil storage depot, 2005: Cullen told the room how a sticking gauge and inoperable high-level switch “led to the overflow of petrol, the ignition of vapour in a cloud and a massive explosion and fire”.

There had been a “lack of adequate response to the fact that the gauge had been sticking and a lack of understanding about the switch,” he explained. “Various pressures had created a culture of keeping the process operating, whilst process safety did not get the attention, resources or priority that it required.” He added that the official investigation of the incident found that there were clear signs that the equipment was not fit for purpose, but that no one questioned why or what should be done about it, other than ensuring a series of temporary fixes.

Texas City oil refinery, 2005: This explosion happened when vapour rising from an overflow ignited. Cullen explained that the investigation revealed “a large number of inefficiencies” in the management of process safety.

But among the factors that BP found underlying what happened was “a poor level of hazard awareness and understanding of process safety on the site that led to people accepting levels of risk that were considerably higher than at comparable installations”.

There had also been a number of fires on the site that had not been investigated. “The general reaction of the workforce to them appeared to be that these were nothing to worry about,” Cullen added.

Signs treated as ‘normal’

People can also become accustomed to things happening in the workplace and fail to recognise them as dangerous.

Columbia space shuttle disaster, 2003: During take-off, a piece of insulating foam broke off and struck the orbiter, damaging its thermal protection and ultimately leading to the loss of the vehicle on re-entry. Cullen explained how the investigation board found that, over the course of 22 years, foam strikes had been ‘normalised to the point where they were simply a maintenance issue’.

“The board remarks that learned attitudes about foam strikes had diminished awareness of danger,” Cullen told delegates. “The managers were, in their words: ‘convinced without study that nothing could be done about such an emergency. Intellectual curiosity and scepticism that solid safety culture requires was almost entirely absent’.”

THORP, Sellafield, 2004: The leaking of dissolver product liquor at Thermal Oxide Reprocessing Plant (THORP), Sellafield, “originated in design changes”. However, Cullen explained how investigation of the accident showed that an ‘underlying explanation was the culture within the plant, which condoned the ignoring of alarms, non-compliance with some key operating instructions and safety related equipment not being kept in effective working order for some time’. “This had become the norm,” he added. “In addition, there appeared to be an absence of a questioning attitude. These cases show how attitudes to signs of danger require vigilance in the management of safety.”

Bringing the examples together

Drawing all of the examples together, Cullen explained that in some cases there was an investigation but it was “limited in scope, or it was superficial, or its results were not driven home to forestall further trouble”. In other cases, there was “no investigation, the signs did not give rise to concern and some signs were treated as commonplace or misread as innocuous, so there was no action”.

Cullen concluded by asking what underlay these attitudes to safety. “For my part, they seem to indicate at least three factors,” he said. “Firstly, poor safety awareness – a prerequisite of which is competence for that purpose. Secondly, a failure to give priority to safety and, thirdly, a failure to show, or instil in others, responsibility for identifying and resolving safety issues.

“The commission that investigated the accident at Three Mile Island coined the phrase: ‘an absorbing concern with safety’. So should it be for us also.”

To view Safety 30 highlights and speaker videos, visit Oil & Gas UK on Vimeo.

Adam Offord


This material is protected by MA Business copyright