Real-time dashboards: Quality control for pharmacy practice
Tracking metrics can lead to workflow revisions and behavior changes that improve care quality
One of the easiest ways to understand quality control measures is to think about speeding down the road. There are many reasons why people speed. Perhaps you’re running late, there’s an emergency, or it’s just a beautiful day for a drive. Although most people understand the consequences of speeding, such as receiving a citation or possibly causing an accident, they accept the risk and speed anyway. If you see a cop parked by the side of the road up ahead, however, you immediately slow down and drive the speed limit. The cop acts as the quality control measure, which results in an improvement in behavior, noted Scott M. Mark, PharmD, Corporate Vice President, Corporate and Physician Applications at West Penn Allegheny Health System in Pittsburgh.
For decades, hospital systems have looked for ways to improve outcomes and safety by implementing quality control measures. Recently, health systems have begun to use real-time dashboards and scorecards to monitor and manage performance by measuring progress against set targets.
“Measurement drives behavior, so if you’re not measuring it and not looking at it, and no one is accountable, then you’re not going to improve,” said Cleveland Clinic’s Scott Knoer, PharmD, FASHP, who along with Mark gave a presentation at the 2012 American Society of Health-System Pharmacists (ASHP) Midyear Meeting & Exhibition that highlighted strategies to improve quality by tracking operational and clinical metrics through the implementation of dashboards.
The idea of quality control emerged in the 1940s. Since then, “quality has been packaged, repackaged, and packaged again,” said Mark. From the mid-1990s through the late 2000s, many organizations turned to retrospective analysis to evaluate behaviors and control measures. In this once-popular approach, after a problem was identified, process changes were implemented for a period of time, and the problem was reevaluated a few months later to make sure the changes had worked. The challenge with this type of intervention is that it doesn’t have staying power, Mark noted. It’s like temporarily slowing down while driving past a cop, but speeding up again once you are safely past, he added.
According to Mark, the key to successfully implementing meaningful quality control measures is to understand the underlying motivation or behavior behind the workflow. Otherwise, the process change does not become ingrained in behavior, and the control measure is only a short-term fix.
Another critical component to implementing quality control measures is identifying where problems occur in the workflow. Mark highlighted an internationally acclaimed study conducted by Japanese consultant Sidney Yoshida, which was initially presented at the International Quality Symposium in Mexico City in 1989. Yoshida believed that only about 4% of an organization’s problems are known to top managers, 9% are known to middle management, 74% are known to supervisors, and 100% are known to frontline workers—a phenomenon dubbed the “iceberg of ignorance.” “The workers are the folks that really know what’s going on,” said Mark. This is where you have to look at workflows to figure out what is driving behavior, he noted.
Lean and Six Sigma
Over the past few years, two models for improving quality have emerged: Lean and Six Sigma. The Lean methodology focuses on improving process speed and eliminating the “eight deadly wastes,” noted Mark. (See the October 2012 issue of Pharmacy Today for more information about these wastes.) However, simply completing activities rapidly without control measures may in fact decrease quality.
The Six Sigma methodology is a quality tool that emphasizes reducing chronic problems and variation in processes. Sources of variation include poor process design, changing needs of a department, methods of measuring processes or controls, insufficient process capability, and varying skills and behaviors of employees, according to Mark.
He stressed that the Lean and Six Sigma methodologies will work only if they are based on an accurate picture of behaviors and workflows. Organizations need to “understand their strengths and weaknesses” before they start applying these programs to an initiative they want to achieve, Mark noted.
Used alongside programs such as Lean and Six Sigma, real-time dashboards are a helpful tool for driving quality across a health system. Knoer, Chief Pharmacy Officer at the Cleveland Clinic, spoke to ASHP meeting attendees about the use of dashboards to visually track various metrics across the Cleveland Clinic health system.
One dashboard is called the business review deck, which contains user-defined key performance metrics. “This is used at the Cleveland Clinic in each department, and each department has things that they measure [and] things to benchmark, [such as] looking at quality and finances,” said Knoer. “We also keep track of key volume indicators, such as number of doses, to see if there are any trends.”
These metrics are recorded and reviewed on a regular basis. Examples of metrics on the dashboard include overall turnaround times, patient satisfaction, and the percentage of patients waiting less than 20 minutes for a prescription. According to Knoer, the goal is to have 90% of patients waiting less than 20 minutes. “Looking at that [metric] monthly has allowed us to see that we [had] a problem in one of our pharmacies,” said Knoer. Once this problem was identified, the workflow and other processes were revised to improve this metric.
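As a rough illustration, a wait-time metric like the one Knoer describes could be computed from raw wait times along these lines. This is a minimal sketch, not the Cleveland Clinic’s actual system; the data layout, names, and sample values are assumptions, and only the 20-minute threshold and 90% target come from the article.

```python
# Minimal sketch: compute the share of patients waiting less than
# 20 minutes for a prescription and flag pharmacies missing a 90% target.
# (Illustrative only -- data layout and names are assumed.)

THRESHOLD_MIN = 20   # wait-time threshold, in minutes
TARGET = 0.90        # goal: 90% of patients wait less than 20 minutes

def under_threshold_rate(wait_times_min):
    """Fraction of waits that came in below the threshold."""
    if not wait_times_min:
        return 0.0
    return sum(1 for w in wait_times_min if w < THRESHOLD_MIN) / len(wait_times_min)

# Hypothetical monthly wait times (in minutes) per pharmacy
waits_by_pharmacy = {
    "Main Campus": [5, 12, 18, 9, 14, 22, 7, 11, 16, 8],
    "East Side":   [25, 31, 19, 28, 40, 22, 15],
}

for pharmacy, waits in waits_by_pharmacy.items():
    rate = under_threshold_rate(waits)
    status = "OK" if rate >= TARGET else "REVIEW WORKFLOW"
    print(f"{pharmacy}: {rate:.0%} under {THRESHOLD_MIN} min -> {status}")
```

Reviewing a simple roll-up like this each month is what surfaced the underperforming pharmacy in Knoer’s example.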
Another visual tool at the Cleveland Clinic is the system quality dashboard, which spans the entire health system. It allows users to track metrics across the whole system, at an individual hospital, or at the nursing-unit level. The system quality dashboard shows core measures, readmissions, and HCAHPS (Hospital Consumer Assessment of Healthcare Providers and Systems) survey scores, and includes a financial dashboard.
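The multi-level view described above amounts to aggregating the same records at different granularities. The sketch below shows one way that roll-up could work; the records, field names, and readmission-rate metric are illustrative assumptions, not the Cleveland Clinic’s actual schema.

```python
# Sketch of a multi-level roll-up: one set of records, three levels of view
# (system-wide, per hospital, per nursing unit). All data is hypothetical.
from collections import defaultdict

records = [
    # (hospital, nursing_unit, readmitted)
    ("Hospital A", "Unit 1", True),
    ("Hospital A", "Unit 1", False),
    ("Hospital A", "Unit 2", False),
    ("Hospital B", "Unit 1", True),
]

def readmission_rate(rows):
    """Share of records flagged as readmissions."""
    return sum(r[2] for r in rows) / len(rows)

def rollup(rows, key):
    """Group records by a key function and compute the rate per group."""
    groups = defaultdict(list)
    for r in rows:
        groups[key(r)].append(r)
    return {g: readmission_rate(v) for g, v in groups.items()}

system_view = readmission_rate(records)                  # whole system
hospital_view = rollup(records, key=lambda r: r[0])      # per hospital
unit_view = rollup(records, key=lambda r: (r[0], r[1]))  # per nursing unit

print(f"System-wide readmission rate: {system_view:.0%}")
```

The design point is that all three views derive from one source of truth, so a number seen at the system level can be drilled down to the unit responsible for it.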
Other visual management tools used at Cleveland Clinic include MedBoard, a medication tracking system that allows staff to track in real time where a particular dose is and where it was delivered; the CoPAT (Community Parenteral Anti-Infective Therapy) dashboard, which tracks quality and financial metrics for the CoPAT program; and the knowledge portal, which allows staff to manage automated dispensing cabinets proactively.
The benefit of using dashboards is that they track metrics over time and ensure that staff sit down and talk about the information, noted Knoer. “We don’t use this to punish people, but [when] the same marker with the same accountability is read every month, it starts a conversation,” he added. These conversations lead to better processes and workflows.
In addition to tracking metrics to improve quality, dashboards can also be used to provide data to support the implementation of new initiatives. One of Mark’s roles last year was to help implement meaningful use at the West Penn Allegheny Health System. For many years, the organization had been trying to roll out an electronic health record (EHR) system to its physicians. Mark was tapped to get the process moving so the hospital could qualify for Stage 1 meaningful use attestation.
The prospect of bringing in millions of dollars per month because of meaningful use made the project extremely visible, noted Mark. His intention was to use a variety of dashboards to present meaningful use data, such as compliance and EHR deployment initiatives, to the health system’s leadership. “It all looked great on paper,” Mark said. But when he ran the reporting tool to create the dashboards, Mark was in for a surprise. “What I discovered to my horror was that only three providers in the organization were ready to attest [to meaningful use],” said Mark.
When Mark and his team drilled down to look at the measures that providers were failing on, they realized there were several problems with the tools and processes for achieving meaningful use.
“Just because you design a complicated maze for people to go through, it doesn’t mean they are going to go through it,” Mark said. “This goes back to understanding behaviors.”
To resolve these challenges, Mark and his team aligned workflow processes and metrics with behaviors. They had to look at the section of Yoshida’s iceberg that was below the water line to see what everyone else was doing in the workflow, noted Mark. Once that piece was in place, Mark and his team mapped out processes, alerts, metrics, and indicators that moved the organization toward meeting meaningful use criteria.