What the Film Industry Can Learn From Patient-Safety Protocols

A candlelight vigil in Burbank, California, for cinematographer Halyna Hutchins, who was accidentally killed by a prop gun fired by actor Alec Baldwin. Photo: David McNew/AFP

The recent accidental killing of cinematographer Halyna Hutchins is surprising and tragic because it should not have occurred: not because an accident like this has never happened before, but because such mistakes should not be part of any assumption of risk in movie production. There are ways to ensure that no one dies on set from a prop gun.

The reaction to Hutchins’ death has been swift and typical. The common approach that companies (and people in general) take after such an accident is threefold: Blame, Retrain, and Pledge to do better.

Searching for a culprit is natural: our shared need for answers and our sympathy for the victims and their loved ones demand a face at which to direct our pain.

Yet accidents in complex systems are rarely prevented by individual personnel changes or pledges simply to do better. Unless film productions fundamentally change how they engage with safety protocols, this tragedy will not be the last.

Though it may not be obvious, movie production and healthcare delivery have important things in common.

Both involve complex activities requiring the simultaneous work of many trained individuals, each performing specific tasks. Both are hierarchical, and the person in charge can err by failing to heed the warnings of subordinates who are sometimes better positioned to see a problem.

As it turns out, improper safety measures can lead to accidental death in both industries. While this recognition led to a transformation in healthcare, the film industry has not yet learned this lesson.

Systems-Based Approach

In 1999, the Institute of Medicine released a startling report entitled To Err Is Human: Building a Safer Health System. The report made the shocking – but true – claim that many hospital deaths occurred as a direct result of errors and accidents. This finding shook the foundation of the medical profession, whose motto primum non nocere (first, do no harm) is the basis of the ethical practice of medicine.

Health care’s response to this report was profound. Though there is still much work to be done, patient-safety initiatives turned to a systems-based approach as a better model to deal with the problem.

Rather than looking for individuals to blame, a systems-based approach examines the multiple factors that affect the process of delivering a product or service. It considers not only how individuals perform the specific tasks to which they are assigned but also how each task interacts with the overall operation.

Quality improvement and efficiency then become matters of training people at the individual level and building checks between steps to reduce the number of potential errors. The time and efficiency lost to double-checking are offset by the gains of a better and safer product or service.
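As a loose illustration of what "checks between steps" means in practice (the step names below are invented, not drawn from any real production workflow), picture a pipeline in which each handoff is verified before the next stage is allowed to run; a bad output stops the process instead of propagating downstream.

```python
# Loose analogy for "checks between steps": each stage's output is
# verified before the next stage is allowed to run, so a bad handoff
# stops the process instead of propagating. Step names are invented
# for illustration only.

def run_with_checks(stages, state):
    """Run (name, do, check) stages in order, halting at the first failed check."""
    for name, do, check in stages:
        state = do(state)
        if not check(state):
            raise RuntimeError(f"Check failed after step: {name}")
    return state

stages = [
    ("prepare", lambda s: {**s, "prepared": True},   lambda s: s.get("prepared")),
    ("handoff", lambda s: {**s, "signed_off": True}, lambda s: s.get("signed_off")),
]

print(run_with_checks(stages, {}))  # {'prepared': True, 'signed_off': True}
```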

Swiss Cheese Model of Accident Causation

The classic paradigm of a systems-based approach is the “Swiss cheese” model of accident causation. Used in healthcare, aviation, engineering, and emergency services, the model draws an analogy between the different steps in a process and slices of Swiss cheese stacked together.

Something can pass through the stack only when the holes in the slices line up to create a path. So too, an error gets through only if the gaps in the process align in a way that allows it. Adding layers that block those gaps minimizes the number of errors or accidents that occur.
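To make the intuition concrete, here is a minimal sketch with purely illustrative numbers (no industry data behind them): if each safety layer independently misses an error only a small fraction of the time, the chance that an error slips through every layer shrinks multiplicatively with each layer added.

```python
# Minimal sketch of the Swiss cheese idea: each safety layer has some
# probability of missing an error (a "hole"). An error reaches the end
# only if every layer misses it, so the miss probabilities multiply.
# The numbers below are illustrative assumptions, not industry data.

def chance_error_slips_through(miss_probabilities):
    """Probability an error passes every layer, assuming layers fail independently."""
    result = 1.0
    for p_miss in miss_probabilities:
        result *= p_miss
    return result

# Three independent checks, each assumed to miss a problem 5% of the time.
print(round(chance_error_slips_through([0.05, 0.05, 0.05]), 6))  # 0.000125, about 1 in 8,000
```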

Checklist Manifesto

Consider one of the signature innovations of patient safety, about which American surgeon Atul Gawande wrote a book: The Checklist Manifesto: How to Get Things Right. A simple tool used by surgical teams and airline pilots alike, the checklist targets mistakes of ineptitude: failures that arise because people do not make proper use of what they already know, as opposed to mistakes born of ignorance.

The Rust shooting is an example of ineptitude. Everyone on the set knew that firing live rounds at someone would be a catastrophic mistake. The checklist is not a huge innovation, but it does force individuals to look at the bigger picture. It recognizes that the process is complex, that it involves many people and inputs, and that everything must be in place for the process to be carried out successfully.
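As a rough sketch of the idea (the item names are hypothetical, not an actual film-industry protocol), a checklist behaves like a hard gate: nothing proceeds until every item has been explicitly confirmed by a named person.

```python
# Rough sketch of a checklist as a hard gate: the scene cannot proceed
# until every item has been explicitly confirmed by a named person.
# Item names are hypothetical illustrations, not an actual industry protocol.

from dataclasses import dataclass, field

@dataclass
class Checklist:
    items: list
    confirmed: dict = field(default_factory=dict)

    def confirm(self, item, by):
        if item not in self.items:
            raise ValueError(f"Unknown checklist item: {item}")
        self.confirmed[item] = by  # record who signed off on what

    def ready(self):
        missing = [i for i in self.items if i not in self.confirmed]
        if missing:
            print("HOLD - still unconfirmed:", ", ".join(missing))
            return False
        return True

gun_check = Checklist(items=[
    "weapon inspected by armorer",
    "no live rounds on set",
    "chamber shown clear to actor",
])
gun_check.confirm("weapon inspected by armorer", by="armorer")
print(gun_check.ready())  # False: two items have not been signed off
```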

However, a systems-based approach works only if the system – and all the checks it imposes – also empowers individuals to use those checks and to speak up when there is a problem. The difficulty with the checklist, as with any other systems innovation, is that it must actually be used. If you lead a horse to water but it does not drink, you have done little to quench its thirst.

‘Break a Leg’

When an organization’s culture does not allow communication to flow up as well as down, checklists and other quality-improvement measures don’t work.

An article published last year examined 143 cases of surgical confusions between 2006 and 2017 and found that 92 of the cases (64.3 percent) were preventable. Worse, most of these errors resulted from protocols not being followed properly – not from the absence of protocols – and ineffective communication was a common contributing factor.

It is time for the film industry to take a page from the culture of safety in healthcare delivery and other industries. The industry needs to reimagine its safety measures to incorporate systems as well as people. Maybe then movie producers and actors will no longer have to worry about “breaking a leg.”
