News

Just Culture: A Product of ‘To Err is Human’

The November 1999 Institute of Medicine (IOM) report, “To Err Is Human: Building a Safer Health System,” warned that “health care in the U.S. is not as safe as it should be – and can be.”1 The report estimated that as many as 98,000 patients die in U.S. hospitals each year as a result of medical errors that could have been prevented. Newer research estimates that up to 440,000 Americans die annually from preventable hospital errors.2 That figure would make medical errors the third leading cause of death in the U.S., after heart disease and cancer.3

The costs associated with medical errors do not end with increased patient mortality. Errors also erode trust in the healthcare system and diminish satisfaction for both patients and health professionals. Patients may experience longer hospital stays or disability, while health professionals suffer loss of morale and frustration at being unable to provide the best care possible. Society bears the cost of errors as well, in lost worker productivity, reduced school attendance by children, and lower levels of population health.

The IOM report on medical errors sparked debate among U.S. health policy makers over the appropriate response to the problem. One of the report’s main conclusions, however, was that the majority of medical errors do not result from individual recklessness, and that blame does little to make the system safer or to prevent someone else from committing the same error. On the contrary, blame encourages counterproductive behavior: hiding information.

The IOM’s recommendations to improve patient safety included:

  • Establishing a national focus to create leadership, research, tools, and protocols to enhance the knowledge base about safety.
  • Identifying and learning from errors by developing a nationwide public mandatory reporting system and by encouraging healthcare organizations and practitioners to develop and participate in voluntary reporting systems.

These recommendations have led to important developments that foster a culture of openness and disclosure, so that organizations can learn about safety rather than punish well-meaning professionals who occasionally make mistakes. They include:

Patient Safety Organizations (PSOs)

In July 2005, Congress enacted the federal Patient Safety and Quality Improvement Act (PSQIA).4 In response, PSOs were launched in 2009, and within just one year 100 had been listed by the Agency for Healthcare Research and Quality (AHRQ). The PSQIA and the Patient Safety Rule reinforce a safety culture that encourages and allows healthcare providers to safely report and share information about vulnerabilities within the healthcare system.5 PSOs protect physicians and other healthcare providers who voluntarily report adverse event information from discovery and reprisal, and the program intentionally shields PSO work from most regulatory reporting programs.

Implementing a Just Culture

In recent years, a movement known as Just Culture has emerged in a number of high-consequence professions, including aviation and healthcare (particularly within hospitals).6 Historically, the predominant model in many workplaces has been a punitive one in which mistakes are met with disciplinary action. This dynamic not only fails to address the actual causes of errors but also perpetuates an environment of fear.

In any field, even the most well-trained and diligent worker is still a human being who will make mistakes. Experts have found that punitive systems, which focus on errors rather than at-risk behaviors and on blame and punishment rather than on correcting root causes, actively discourage people from reporting mistakes. In such environments, people tend to report only the mistakes they cannot hide. Because this approach denies researchers and supervisors valuable information about bottlenecks and vulnerabilities, it stifles our understanding of how things break down and prevents improvement of these systems.

A Just Culture seeks to create a learning culture that balances transparency and accountability. It recognizes that many errors represent predictable interactions between human operators and the systems in which they work, that competent professionals make mistakes, and that even experienced professionals can develop unhealthy norms (shortcuts, “routine rule violations”).

In his book “Whack-a-Mole: The Price We Pay for Expecting Perfection,”7 David Marx, a pioneer of the Just Culture movement, outlines a basic structure for responding to three categories of unsafe actions:

  • Console the human error
  • Coach the at-risk behavior
  • Punish the reckless behavior… independent of outcome

Unrealistic Expectations Lead to Non-Compliance

Too often, there is a gap between an organization’s policies and its reality. For example, one hospital sought to reduce patient misidentification by requiring that four identifiers be checked when labeling a specimen for the lab.8 The predictable result was that staff ignored the requirement and continued to check only two. The larger problem was that this normalized deviance, weakening the authority of every policy and procedure in the institution. Hand hygiene compliance offers another example: according to the CDC, recent studies place hand hygiene adherence in hospitals between 29 and 48 percent.9 As Voss and Widmer argue, expecting perfection and 100 percent adherence is unrealistic, and we must “put an end to the reflex response that healthcare workers are neglectful of hand hygiene, which, far from helping, only demoralizes them further.”10

The movement toward recognizing that healthcare workers are human beings who will make mistakes, and toward encouraging openness that improves our systems rather than focusing on punishment, promises to set realistic expectations and enhance patient safety.

Does your organization encourage suggestions for improving its systems?