Call us toll-free: 800-878-7828 — Monday - Friday — 8AM - 5PM EST

Guest post: Considering moral injury

June 24, 2021 Rachel Tirabassi

By Howard Rodenberg, MD, MPH, CCDS for ACDIS CDI Blog

One rarely thinks of CDI as having a moral dimension. But when we do think about it, we consider ourselves doing the “right thing” for patients to have their complexity of illness fully documented within the medical record. If we divorce ourselves from the financial impact we have on code assignment and quality metrics (which, if we’re honest with ourselves, is why administration keeps us around), our belief is that what we do helps the clinicians know what went on with the patient when the next admission or office visit rolls around. Because our sails are buffeted by this moral breeze, we don’t hesitate to ask queries when they’re indicated, knowing the Ship of CDI is headed True North. But there are some circumstances where we might reconsider.

Let me offer this scenario. A patient came into the hospital with a relatively routine complaint. During workup, however, a more severe problem was identified, such that the healthcare team found themselves unable to manage the clinical issue and the patient died while under care. The attending physician was crushed, as were others who cared for the patient. Mental health professionals were called in to help. As I heard the story, it was nothing but tragic. To this day, I understand that some of those who cared for the patient, both physicians and nurses, remain mired in self-loathing and self-doubt, wondering what they could have done to prevent a devastating outcome.

In the midst of all this, an opportunity to query was identified. The patient’s anatomic diagnosis had not been specified by the physician. Nursing notes indicated the site of the problem, but the attending’s note did not. Clearly there’s a query here, right?

In a previous post for the ACDIS Blog, I wrote that working with my own CDI staff had led me to conclude that a query must meet one of three criteria to be written:

  • Financial impact queries make a difference to reimbursement.
  • Regulatory queries identify or rule out conditions subject to regulatory review and penalty.
  • Clinical queries better reflect patient care to providers reviewing the record at a later date.

I felt then, as I feel now, that asking a query simply to query is most likely an exercise in spinning wheels and boosting metrics rather than an efficient use of CDI time. As I reflect upon the case above, however, perhaps there’s another reason to avoid a query: When the query might inflict moral injury.

I want to elaborate on the concept of moral injury. This is a hot topic in the literature of physician wellness, especially within my own field of emergency medicine. It seems to be replacing the term "burnout," which carries the implication that the clinician has somehow failed; and we're also seeing that the term "resilience" means nothing in the face of circumstances beyond one's control. Instead, what's happening to physicians is considered moral injury: violations of professional ethos imposed by external forces, including loss of professionalism, loss of autonomy, loss of status, and being placed in situations that prevent them from fulfilling their obligation to provide the best in patient care.

How does the concept of moral injury play out in the scenario we’ve described? We would certainly be within the scope of CDI work to ask that query for anatomic specificity. Would that query contribute in any way to patient care, regulatory burdens, or reimbursement? Perhaps. But I would contend that this particular query might represent a kind of moral injury, imposing another external burden upon a clinician already in distress.

I’m a classic sci-fi nerd, and years ago Isaac Asimov developed the Three Laws of Robotics to dictate how robots should behave in reference to living beings. They are, in hierarchical order:

  1. “A robot may not injure a human being or, through inaction, allow a human being to come to harm.
  2. A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.
  3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws.”

Decades later, Asimov realized there needed to be a further underlying principle superseding the initial hierarchy, which he called the "Zeroth Law":

  • “No machine may harm humanity; or, through inaction, allow humanity to come to harm.”

I'm suggesting that we might add a similar overarching category to our previous three reasons to query: that the query neither cause nor reinforce moral injury to the clinician. "Primum non nocere": First, do no harm. It's the basic rule of patient care, and it should underlie all we do.