The Safer Choice

What is: foreseeability?

First published in Health and Safety at Work Magazine, June 2016

Bridget Leathley continues her exploration of the foundation concepts of safety management by looking at what practitioners can be expected to predict and control.

Foresight involves a careful balance between sticking with the obvious and dealing with the fanciful. Imagination, aided by structured approaches, can suggest a wide range of possible outcomes — and then risk assessment will help us to decide which ones must be managed.

In What is: reasonably practicable I looked at how safety managers, regulators and courts consider what is “reasonably practicable”. For the courts and regulators at least, what is reasonably practicable can hinge on what is seen as foreseeable.

In its position paper Reducing Risks, Protecting People (R2P2) the HSE explains: “So as not to impose unnecessary burdens on dutyholders, HSE will not expect them to take account of hazards other than those which are a reasonably foreseeable cause of harm, taking account of reasonably foreseeable events and behaviour.”

Neither the Health and Safety at Work Act nor the Management of Health and Safety at Work (MHSW) Regulations uses the word “foreseeable”. But L21, the code of practice for the MHSW Regs (since withdrawn), included consideration of “any member of the public who could foreseeably be affected” as one of the conditions for an adequate risk assessment. L21 explained that those conducting risk assessments “would not be expected to anticipate risks that were not foreseeable” and should concentrate “on the broad range of risks that can be foreseen”. L21 also referred to “foreseeable events” and to “foreseeable emergencies” requiring emergency plans.

Case by case

The term arises more frequently in case law. The most often quoted case in the general law is Donoghue v Stevenson from 1932, the snail-in-the-ginger-beer case. The judge stated: “It is sometimes said that liability can only arise where a reasonable man would have foreseen and could have avoided the consequences of his act or omission.” A “reasonable man” had been characterised by the courts as early as 1903 in McQuire v Western Morning News as “the man on the Clapham omnibus”.

But it is not always sufficient when managing health and safety to use only the foresight of the reasonable person. As well as common knowledge, we can be expected to call on two other sources.

First, there is industrial or professional knowledge, or what Duncan Spencer, strategic safety manager for the John Lewis Partnership, calls “what others like us know or should know”.

Then there is expert knowledge, which may require an organisation to seek external help. A member of the public might know very little about asbestos, for example; the manager with responsibility for health and safety should be aware of the possibility of asbestos in a building; and the asbestos consultant they employ should have the expert knowledge needed to carry out a survey to determine the level of risk and the controls needed.

Case law does not offer a clear definition of what level of industry or expert knowledge is expected. The case of R v Electric Gate Services illustrates how difficult it is to establish how much foresight “people like us” should have.

Nine-year-old JK was crushed between an electric gate and a pillar as he reached through to press a button to open the gates. At the initial trial the judge considered the circumstances “too remote” to have been foreseen. But the appeal judge concluded that, as the accident had happened, “the risk was not fanciful and was more than trivial” and that the onus was on the defendant to demonstrate it could not have been foreseen or that there was no reasonably practicable way of preventing it.

By contrast, in Micklewright v Surrey County Council, where Christopher Imison was killed by a branch falling from a tree, the judge stated: “It is important when considering whether the owner or occupier has complied with his duty to avoid using the benefit of hindsight.”

Just because something happens does not make it foreseeable. It is, of course, foreseeable that branches will fall off trees. But even experts cannot always determine which branch will fall from which tree and when. This is recognised in Bowen and others v the National Trust. Though the claimant stated that “adaptive growth” bulges on the tree demonstrated that branches were unsafe, the judge agreed with the defendant’s expert witness that “signs present would not have indicated … a foreseeable or likely failure of the branch”.

In Baker v Quantum Clothing, claimants sought compensation for damage to hearing resulting from noise levels above the limits set in the Noise at Work Regulations but below the 90dB(A) limit in the 1971 Industrial Health Advisory Committee code of practice.

Lord Mance stated: “Whether a place is safe involves a judgement, one which is objectively assessed of course, but by reference to the knowledge and standards of the time.” But the appeal was dismissed on the ground that the employer should still have made efforts to control the risk from noise below this limit.

Though L21 (since withdrawn) referred to “foreseeable events” and “foreseeable emergencies”, the law focuses on foreseeable risks. In R v Corus UK, the steel maker had claimed that the type and magnitude of the blast furnace explosion that killed three people had been unforeseeable. However, Mr Justice Lloyd-Jones considered that “while the precise mechanism and extent of the explosion were not foreseeable, the risk of death and serious injury, arising from the situation which the defendant failed to take reasonably practicable measures to prevent, was clearly foreseeable.”

Employers are also expected to foresee that people will do the wrong thing. Uddin v Associated Portland Cement from 1965 illustrates that, even where an employee behaves in a bizarre way — climbing above machinery to catch a pigeon — the employer can be held liable for a safety breach (failing fully to guard the machinery), though Uddin’s contributory negligence reduced his compensation.

More recent cases support the view that employers must assess what could happen if people do the wrong thing. A two-man team from waste contractors Veolia was removing litter from the side of a dual carriageway. While one person collected litter on foot, the other drove the van slowly behind. The van was struck from behind by a lorry and propelled into the walking employee, killing him. In defence, Veolia claimed the accident had been caused by the behaviour of the two drivers, something not in the scope of their assessment.

The court of appeal ruled that Veolia should have considered the risks to employees and others of carrying out the task in this way, including the possibility that the van driver and other road users might make mistakes.

No acts of God

In 1997, Roger Bibbings, occupational safety adviser at the Royal Society for the Prevention of Accidents, wrote: “Very few accidents can be said to be acts of God or matters of pure chance. With hindsight it is almost always the case that most accidents could have been foreseen and their chances of occurring or having such harmful consequences could have been reduced to a very low level (or even eliminated) had the kinds of control been in place.”

An 1876 judgment in the case of Nugent v Smith defined acts of God as “natural causes, directly and exclusively, without human intervention and could not have been prevented by any amount of foresight, pains and care, reasonably to be expected”. Though “acts of God” cannot be controlled, in the 21st century our ability to predict them is much improved. Earthquake zones and flood plains are well mapped, and weather forecasts provide some hours of warning of heavy rain or storms. As our ability to predict events improves, so our responsibility to control the risk increases. Building in San Francisco without considering earthquakes, or constructing an offshore platform without considering the 100-year (and perhaps the 10,000-year) wave, cannot be defended. But would an organisation in Blackpool have been expected to consider proofing against an earthquake before the 2011 earthquakes linked to hydraulic fracturing (fracking) to extract shale gas?

As fracking increases, should organisations nearby now have to consider the impact of an earthquake on their operations? The HSE’s guidance Reducing Risks, Protecting People provides a steer: “Whether a reasonably foreseeable, but unlikely, event — such as an earthquake — should be considered depends on the consequences for health and safety of such an event.” If you run a laboratory using biological hazards or a nuclear power station perhaps you should think about earthquakes wherever you are; for a supermarket or an office building, such consideration may be disproportionate.

But can you plan for unforeseeable events? For most organisations, terrorist attacks, an aircraft flying into a building or a helicopter hitting a crane are beyond what may be foreseen. Evacuation planning normally considers fire or bomb threats. If the fire drill always involves people using the same exits, the same route and the same assembly area, they will be unprepared for the unforeseen. But by changing the scenario each time, people will be better able to cope with the unexpected. Many organisations run drills with part of the evacuation route blocked off, closing a staircase, for example. Extend this by putting both the alarm and telephone systems out of action, so the evacuation message must be passed around verbally. Teach people to be adaptive. Then if the aliens do land and take over your building, you will be well prepared.

Foresee more clearly

The variety of decisions by the courts suggests that what is considered foreseeable may depend on who is judging the case. But in the absence of a clearly defined requirement, if we want safe and healthy places for people to work, we can improve our foresight. The techniques for doing so should be familiar:

  • sound risk assessments
  • accident and near-miss reporting
  • reviewing the lessons from accidents in other organisations.

In the joined appeals of Tangerine Confectionery and Veolia ES (UK) v R, Lord Justice Hughes described a risk assessment as “an exercise in foresight”. Tangerine had argued in its defence that the death of an employee, crushed by a sweet-processing machine, had been unforeseeable. However, simply not foreseeing a risk is not a defence: in the Tangerine case, the company had no risk assessment of the machinery. If you do not look, you will not see.

A risk assessment must be carried out by properly qualified individuals working with the people who perform the tasks, and reviewed by people who understand the hazards and the controls. It should cover normal operations and foreseeable abnormal ones, such as maintenance, cleaning and shutdown.

Risk assessment starts with identifying the hazards, and this may need the foresight of more than one person. Structured brainstorming techniques such as HAZOP (hazard and operability studies) and HAZID (hazard identification) can bring together the foresight of several experts. HAZID is often carried out at the design stage, but can be applied to a system already in use. Guide words suggesting “deviations” are applied to each part of the system, for example “not done”, “more than” and “less than”. Trivial consequences can be ruled out quickly. Where a consequence is serious, you can consider whether there are sufficient controls to prevent the deviation.
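To make the mechanics concrete, here is a minimal sketch in Python of that guide-word pass; the task steps, guide words and worksheet fields are invented for illustration, not taken from any published HAZID template.

```python
# Minimal sketch of a HAZID-style guide-word pass. The steps and
# guide words below are invented examples, not from any real study.
from itertools import product

steps = ["load hopper", "start conveyor", "clear blockage"]
guide_words = ["not done", "more than", "less than", "other than"]

# Pairing every step with every guide word gives the team a
# structured set of prompts; against each prompt they record any
# credible deviation, its consequence and the existing controls.
worksheet = [
    {"step": step, "guide_word": word,
     "deviation": "", "consequence": "", "controls": ""}
    for step, word in product(steps, guide_words)
]

for row in worksheet:
    print(f"{row['step']} / {row['guide_word']}: credible deviation?")
```

The value of the exhaustive pairing is that the team must consider, and explicitly dismiss, combinations that unstructured brainstorming would never raise.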

For more complex systems, fault tree and event tree analysis can be used to foresee what could happen. Such techniques, correctly applied to the combination of the factors “O-ring damage” and “low temperatures”, might have prevented the 1986 Challenger space shuttle disaster.
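As a rough illustration of the underlying arithmetic, here is a toy fault-tree calculation in Python; the gate logic is standard, but the probabilities are invented and bear no relation to real Challenger figures.

```python
# Toy fault-tree gate arithmetic. Probabilities are invented for
# illustration, and basic events are assumed to be independent.

def and_gate(*probs: float) -> float:
    """All basic events must occur: multiply the probabilities."""
    result = 1.0
    for p in probs:
        result *= p
    return result

def or_gate(*probs: float) -> float:
    """Any basic event suffices: one minus the product of complements."""
    result = 1.0
    for p in probs:
        result *= 1.0 - p
    return 1.0 - result

p_o_ring_damage = 0.01    # invented per-launch probability
p_low_temperature = 0.05  # invented per-launch probability

# The hazardous outcome needs both factors together, so an AND gate
# combines them: 0.01 * 0.05 = 0.0005 per launch.
print(f"P(seal failure) = {and_gate(p_o_ring_damage, p_low_temperature):.4f}")
```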

What if...?

Incident reporting systems usually take one of two approaches: classifying the incident by the severity of what did happen, or of what could have happened. While the “what if?” severity test can result in exaggerations, it can also help identify unforeseen outcomes.
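One way to capture both readings is to record them side by side on each report. Here is a short Python sketch; the severity scale, field names and example incident are all made up for illustration.

```python
# Sketch of an incident record that rates both what did happen and
# what could have happened. Scale and example are made up.
from dataclasses import dataclass

SEVERITY = ["none", "minor", "serious", "fatal"]  # illustrative scale

@dataclass
class Incident:
    description: str
    actual: str      # severity of what did happen
    potential: str   # severity of what could plausibly have happened

    def escalate(self) -> bool:
        # Investigate on the worse of the two ratings, so a harmless
        # near miss with fatal potential is not filed and forgotten.
        worst = max(SEVERITY.index(self.actual),
                    SEVERITY.index(self.potential))
        return SEVERITY[worst] in ("serious", "fatal")

near_miss = Incident("pallet fell from racking, aisle empty", "none", "fatal")
print(near_miss.escalate())  # True: fatal potential despite no harm
```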

The accounts of accidents in the news section of every issue of HSW can be used in at least two ways. The reader can look smugly at the misfortune of others, glad that “it couldn’t happen here”. Or, for each story of death or injury, prosecution or compensation claim, they can ask: “How could that happen here? What is there to stop it happening?” or “If that happened, what mitigation do we have?”

In particular, health hazards may be more difficult to foresee than safety hazards without looking at the experience of others. Tony Cox, specialist fellow of the International Institute of Risk and Safety Management (IIRSM), gives a useful example. One of the firms he advises restores vintage cars, a task that sometimes involves reshaping lead fillers.

“It is a traditional practice and had not raised any concerns in their minds,” he explains. But the July 2012 IIRSM newsletter reported that a recycling firm had been fined for exposing workers to lead when handling it at low temperatures.

“We wondered if the client’s low temperature lead work might be more hazardous than they thought, so instituted appropriate health monitoring for the bodyshop workers, including blood tests.”

Looking at widely publicised accidents can prompt imagination in hazard identification sessions, or be the basis for a review of existing risk assessments. Try the Kansas City Hyatt Regency walkway collapse for construction projects or the King’s Cross fire for emergency planning. In the light of the 2011 Fukushima Daiichi nuclear power station meltdown, the rest of the world has reviewed its position on nuclear power; despite the outward rhetoric that “it couldn’t happen here”, similar reviews took place after the accidents at Three Mile Island in 1979 and Chernobyl in 1986.