STATE DEPARTMENT OF HEALTH, CMS, AND THE JOINT COMMISSION

Alan Pakaln
6 min read · Apr 12, 2018

It’s not a fun subject.

No sexy alarmist headlines to see, no Tweets or Facebook Likes:

YOUR HOSPITAL EQUIPMENT MAINTENANCE IS ON SCHEDULE AND OUT OF DATE!

But this is true for many institutions.

Fact: Hospitals not reporting 100% on scheduled inspections are probably not compliant and may not receive Joint Commission accreditation. This would put real pressure on many administrators, except that reporting 100% on-time inspections is relatively easy: you just need to follow a maintenance schedule on time.

There is no requirement, and no reporting, of the actual number of pieces of equipment inspected: follow your schedule, ignore enough equipment to stay on time with that schedule, and you are 100% compliant.

This is not an opinion. It is stated policy, and your institution, if you work in one, follows it. It is the standard to which all hospitals must adhere to be accredited.

There are reasons why The Joint Commission created this policy of schedule tracking rather than equipment tracking. And there are reasonable alternatives to this policy.

It is The Joint Commission that owns this policy and sets the regulatory direction for state health departments and Medicare. But engaging The Joint Commission's bureaucracy on this issue requires more than one voice.

Joint Commission Accreditation

Across the 21,000 institutions The Joint Commission surveys, no one knows the inspection rate for medical equipment, not despite Joint standards but because of them. This represents a possible patient safety issue, but because no inspection data are available, no one knows how it plays out in reported incidents: incidents must be reported, but maintenance completion rates need not be.

How is this possible?

The policy:
The Joint Commission accreditation standard for medical equipment periodic maintenance is 100%. However, The Joint does not require 100% of the equipment to be inspected on time; its policy requires that maintenance technicians follow an inspection schedule 100% of the time. A technician can adhere to a schedule regardless of how many pieces of equipment are actually inspected: miss some equipment in a scheduled location and you simply move on to the next scheduled location. Technicians are motivated to stay on schedule to avoid being cited on a survey.

Inspecting 100% of the medical equipment on time is not just difficult for many institutions; it is virtually impossible, because some equipment invariably is “missing” during inspections: “floaters” like IV pumps travel from one department to another, equipment may be stolen or transferred to another department, rentals are returned, staff may hide equipment because of perceived shortages, a department may send something out for repair, equipment can be in use on a patient, or maintenance technicians may simply not be as thorough as they might be.

The problem:
Equipment not inspected can be “missing” or just not looked for during the prescribed inspection period. These numbers do not become part of the compliance calculation and reporting, even though some of the uninspected equipment may be in use on patients.

I am a biomedical engineer, concerned about The Joint Commission’s oversight of medical equipment maintenance. I have 30+ years experience overseeing the application of medical technology in New York City hospitals. I have participated in many accreditation surveys, and have written about this issue, both in journals and to The Joint Commission. See their response:
https://jointcommission-compliancestandard.org

Why would The Joint set a standard of 100%, when it is virtually impossible to inspect all equipment 100% on time? Two reasons.
1. A standard of 100% means The Joint does not need to justify setting anything lower (95%? Why not 96%, or 94%?). What science could it use in its rationale?
2. The Joint can apply a single standard to everyone: smaller facilities (with fewer pieces of “missing” equipment) and larger facilities (with more).

Why this standard is bad:
No one can tell from compliance reporting what the actual level of equipment inspection is, nor how long some equipment remains uninspected. Less meaningful data also means less diligence: what incentive to improve is there when there is an approved, guaranteed, and easy way to achieve 100%?

Reasonable alternatives?
First, show real numbers so everyone knows the actual rate of inspected equipment. If the rate is 95%, state that, and state why it is 95% and not higher. Second, show the percentage and breakdown of equipment not located. If the inspection rate is 95%, explain the remaining 5% so that everyone can assess the quality of the inventory and more clearly track and focus on what was not inspected and why.

Periodic compliance reporting should look something like this. With a total inventory of 1,000 pieces and 95% inspected on time, the remaining 5%, or 50 pieces, are past due. This uninspected or unlocated equipment (equipment missed because of insufficient staffing is included here) should be explained, e.g., in shop, on patient, or other, along with the number of intervals missed. And of course, at some point, equipment that is not located for some prescribed period can, after the findings are reported, reviewed, and signed off by the appropriate personnel, be removed from inventory.
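The arithmetic of such a report is simple enough to sketch. The following is a hypothetical illustration only — the function name, category labels, and report fields are my own, not Joint Commission terminology — showing how a report could force every past-due piece to be accounted for:

```python
# Hypothetical sketch of the proposed compliance report.
# All names and categories are illustrative assumptions.

def compliance_report(total, inspected_on_time, past_due_reasons):
    """Summarize on-time inspections and explain every past-due piece."""
    past_due = total - inspected_on_time
    # The report is only valid if the breakdown accounts for all past-due equipment.
    assert past_due == sum(past_due_reasons.values()), \
        "every past-due piece must be explained"
    return {
        "total_inventory": total,
        "inspected_on_time": inspected_on_time,
        "on_time_rate": round(100 * inspected_on_time / total, 1),
        "past_due": past_due,
        "past_due_breakdown": past_due_reasons,
    }

# The article's example: 1,000 pieces, 95% on time, 50 past due.
report = compliance_report(
    total=1000,
    inspected_on_time=950,
    past_due_reasons={"in_shop": 12, "on_patient": 20,
                      "not_located": 15, "other": 3},
)
print(report["on_time_rate"])  # 95.0
print(report["past_due"])      # 50
```

The point of the built-in check is that a report which cannot explain its uninspected equipment simply fails, rather than silently counting as 100% compliant.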

What policy can The Joint Commission establish that promotes maintenance safety and is appropriate for all? If The Joint had been collecting real data for some period, a meaningful percentage could perhaps now be set as a standard, tailored to different institution types and sizes. As things stand, meeting the standard should mean meeting or bettering the institution's own benchmark: if an institution reports 95% compliant with 5% uninspected or missing, it must meet or better those numbers at the next survey, or satisfactorily explain why it cannot.
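The benchmark logic above reduces to a single comparison. A minimal sketch, with illustrative names of my own choosing, might look like this:

```python
# Hypothetical sketch: judging a survey against the institution's own benchmark.
# Function name and rates are illustrative assumptions, not official policy.

def meets_benchmark(current_rate, benchmark_rate):
    """Pass if the institution meets or betters its prior inspection rate."""
    return current_rate >= benchmark_rate

# Prior survey set a 95% benchmark; the next survey is judged against it.
print(meets_benchmark(96.2, 95.0))  # True
print(meets_benchmark(93.5, 95.0))  # False: must satisfactorily explain why
```

The design choice here is that the standard is relative to the institution's own history rather than a single fixed number, which sidesteps the problem of justifying any particular universal percentage.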

Real maintenance compliance numbers, trended over time, will place a clearer focus on the efforts made to improve performance. Real numbers measure actual results, and the surveyor always has the prerogative of accepting explanations for numbers that deviate from the norm.

The proposed fix keeps the issue of unlocated inventory front and center, where it belongs. What incentive is there to improve if, as is the case now, you rely on a system that all but guarantees a reported 100% when the true rate is likely lower?

What you don’t know can hurt. I encourage you to look into this and ask The Joint Commission for its rationale.

_______________________________

Here is the last response I received from a recent attempt to address this issue with The Joint Commission.
“Dear Alan-
I have spoken with everyone you reached out to and I am responding on behalf of those staff. We want to thank you for your efforts expended in your letter from January 24, 2018. It appears that you have thoroughly researched the topic of equipment maintenance rates. However, the Center for Medicare and Medicaid Services (CMS) has approved our maintenance rate of 100%. The Healthcare Organization has the responsibility to prove to the Joint Commission surveyor that they are meeting that 100% requirement.
Regards, Dawn Glossa, Director Corporate Communications”

— — — — — — — — — — — — —

Briefings on Accreditation and Quality, July 1, 2017. George Mills, MBA, FASHE, CEM, CHFM, CHSP, Joint Commission director of engineering.

“The key is that you are going to be taking 100% on your “on time” because you were on time, and you knew what it was. And the fact that you couldn’t complete it wasn’t your fault or a penalty to your shop. The key then becomes whether your policy is robust enough to still make sure that you capture that equipment when it does show up. If it doesn’t show up after a second cycle, do you then remove it from your inventory as “not being in the building?” Maybe it went out with a patient to a nursing home or something like that; you never know. The point is that for your 100% compliance calculations, if you were there to do the work on time, take the credit for being there on time.”

— — — — — — — — — — — — —

Alan Pakaln is a biomedical engineer with 30+ years of experience overseeing the application of medical technology in New York City hospitals. During this time he has participated in Joint Commission surveys and has written two peer-reviewed journal articles for the Association for the Advancement of Medical Instrumentation.

Article Citation: Alan Pakaln (2004). The Three Critical Issues I’ve Learned in 23 Years in Clinical Engineering. Biomedical Instrumentation & Technology: March 2004, Vol. 38, No. 2, pp. 119–121.

Article Citation: Alan Pakaln (2006). Proposed: A Standard Clinical Engineering Review Procedure. Biomedical Instrumentation & Technology: July 2006, Vol. 40, No. 4, pp. 315–318.
