STEP 7: Develop a Strategy for Collecting and Using QAPI Data
Your team will decide what data to monitor routinely. Areas to consider may include:
This change package is intended for nursing homes participating in the National Nursing Home Quality Care Collaborative, led by the Centers for Medicare & Medicaid Services (CMS) and the Medicare Quality Improvement Organizations (QIOs), to improve care for the millions of nursing home residents across the country. The Collaborative strives to instill quality and performance improvement practices, eliminate Healthcare-Acquired Conditions (HACs), and dramatically improve resident satisfaction by focusing on the systems that affect quality, such as staffing, operations, communication, leadership, compliance, clinical models, and quality-of-life indicators, as well as specific clinical outcomes (with a targeted focus on inappropriate antipsychotics in persons living with dementia, falls, pressure ulcers, physical restraints, urinary tract infections, and healthcare-acquired infections).
This change package focuses on the successful practices of high-performing nursing homes. It was developed from a series of ten site visits to nursing homes across the country and from the themes that emerged about how those homes approached quality and carried out their work. The practices in the change package reflect the efforts that nursing home leaders and direct care staff at these sites shared and described. The change package is a menu of strategies, change concepts, and specific actionable items that any nursing home can choose from and begin testing to improve residents’ quality of life and care. It is intended to complement resources such as literature reviews and evidence-based tools.
We gratefully acknowledge the contributions of the following organizations, which so generously shared their time, effective practices, and experiences that informed the content of this change package.
Action Items
Directions: A dashboard can be helpful as a way to monitor the progress of QAPI in your organization, or the progress of individual projects. The complexity of a dashboard can vary based on the needs of the organization and whether or not you have an automated system to assist in pulling data into the dashboard. Your team should use this tool to guide the process of developing a dashboard. The steps below are intended to help the team members understand the value of a dashboard and the process for creating a dashboard.
Step 1 – Review dashboard basics:
What is a dashboard?
Like the panel of signals that allows a driver to monitor the functioning of a car, a dashboard is a system to track key performance indicators within an organization. It should be designed to be easy to read and quick to understand, signaling where things are going well and where there are problems to address. It should include short-term indicators, to confirm that milestones are being met, as well as outcome measures that reflect whether goals are being achieved.
Why is a dashboard important?
Regular monitoring of data is critical for effective decision-making in any organization. At the same time, the amount of data available can be overwhelming and long data reports containing all possible information are not likely to be used and may not be meaningful. A dashboard is an ideal way to prioritize the most important indicators for a particular organization and encourage regular monitoring of the results.
What does a dashboard look like?
Dashboards may be simple text documents, data spreadsheets, or sophisticated graphs developed with computer programs. Data results are reported for multiple time periods to show trends over time and include benchmarks or goals to put performance into context. An organization’s main dashboard ideally fits onto one page, showing only a select set of the most important indicators to monitor. Sub-dashboards may then be created so that users can “drill-down” to see more detailed data on a specific issue. Dashboards typically employ a system of visual alerts—such as red-yellow-green stoplight coloring, speedometers or thermometers—that help to draw viewers’ attention to data results indicating an area for concern.
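The red-yellow-green stoplight coloring described above can be sketched in a few lines of code. This is a minimal illustration only: the `stoplight` function name, the thresholds, and the warning margin are hypothetical examples chosen for this sketch, not CMS-defined values.

```python
# Illustrative sketch of red-yellow-green stoplight coloring for a
# dashboard indicator. The thresholds are hypothetical examples, not
# CMS-defined values; set them to your organization's goals.

def stoplight(value, goal, warning_margin):
    """Return 'green' if the rate meets the goal, 'yellow' if it is
    within the warning margin of the goal, and 'red' otherwise.
    Assumes lower values are better (e.g., a pressure ulcer rate)."""
    if value <= goal:
        return "green"
    if value <= goal + warning_margin:
        return "yellow"
    return "red"

# Example: a high-risk pressure ulcer rate with a goal of <6%
print(stoplight(4.8, goal=6.0, warning_margin=2.0))   # green
print(stoplight(7.5, goal=6.0, warning_margin=2.0))   # yellow
print(stoplight(9.1, goal=6.0, warning_margin=2.0))   # red
```

A spreadsheet can apply the same logic with conditional formatting; the point is that the alert rule is explicit and agreed upon in advance.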
Step 2 – Decide how your dashboard will be used:
What type of dashboard do we need?
Different dashboards may be useful for different audiences. For instance, a dashboard geared to a board of directors would need to monitor not only overall quality and specific clinical or organizational quality indicators for the facility but also its financial health. Similarly, a top administrator needs a high-level view of the organization’s performance, while an individual staffing unit may have a dashboard covering the area of care for which it is specifically responsible. Additionally, you may decide to create a dashboard that separately monitors the success of a particular QAPI activity. Ideally, any sub-dashboards will be tied to the main organizational dashboard so that all efforts work in sync with the overarching vision and goals of the organization.
Step 3 – Create your dashboard:
The following is a list of steps to consider in developing a dashboard. These steps are not necessarily listed in the order in which they must take place, but represent a general path to follow in creating a dashboard panel for your organization.
Step 4 – Use your dashboard:
Step 5 – Revisit your dashboard:
Remember that a dashboard is a living tool and should evolve over time. Establishing regular review periods helps keep the dashboard from becoming stagnant or obsolete: use each review to consider new data sources that have become available and to identify indicators that are no longer useful.
Monitor whether the data collected and shared are acted upon by leadership, the quality committee, and others as appropriate. Remember that simply tracking and trending data will not lead to meaningful change in the lives of residents.
Continue to look for new and innovative indicators to include in your dashboard. The purpose of a dashboard is to challenge your organization not only to meet its goals but to continue to improve and grow in different ways.
Directions: For each measure/indicator that you choose to collect and monitor for QAPI, answer the following questions. The information gleaned from these questions will help you determine how best to track, display, and evaluate the results of the various measures you have chosen for QAPI. If you are tracking a relatively small number of measures or indicators, you may wish to include all measures in one table and use it as an overview tool completed by the person coordinating QAPI in your organization. Alternatively, you may choose to use this table for individual measures or for groupings of measures that address similar topics.
What are we measuring (measure/indicator)? | When are we measuring this (frequency)? | How do we measure this (where do we get our data)? | Who is responsible for tracking this measure? | What is our performance goal or aim? | How will data findings be tracked and displayed?
---|---|---|---|---|---
Example: High-risk pressure ulcers | Monthly | Quality Indicator (QI) monthly report; data come from MDS assessments | DON | <6% | DON uses an Excel run chart template to document monthly rates over time. DON also tracks and graphs the number of in-house acquired versus admitted pressure ulcers, pressure ulcers by stage, and time to heal. Results are provided to the QAPI committee and posted in the “North” conference room.
Example: Staff satisfaction | Yearly (April) | Corporate satisfaction survey | Administrator | Participation rate: 70%; overall satisfaction: xx%; would recommend as place for care: xx%; would recommend as place to work: xx% | Administrator uses a bar chart to show results for individual satisfaction questions and key composite measures for the current and previous 3 years. Results are provided to the QAPI committee and posted in the “North” conference room.
Example: Staff turnover | Monthly and annualized | Human resources department | Human Resources Director | <20% | Human Resources Director uses the Advancing Excellence in America’s Nursing Homes “Monitoring Staff Turnover Calculator.” Results are reviewed at the QAPI committee.
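The staff turnover example above cites the Advancing Excellence “Monitoring Staff Turnover Calculator.” As a rough illustration only, one common turnover convention (separations divided by average headcount) can be sketched as follows. The formula, the function names, and the annualization method are assumptions made for this sketch, not the calculator’s published method.

```python
# Hypothetical sketch of a monthly and annualized staff turnover
# calculation. The convention used here (separations divided by average
# headcount, annualized by summing monthly percentages) is a common one
# but is an assumption; it is not necessarily the method used by the
# Advancing Excellence calculator cited in the table.

def monthly_turnover(separations, start_headcount, end_headcount):
    """Monthly turnover as a percentage of average headcount."""
    average = (start_headcount + end_headcount) / 2
    return separations / average * 100

def annualized_turnover(monthly_rates):
    """Simple annualization: the sum of the twelve monthly percentages."""
    return sum(monthly_rates)

# Example: 2 separations in a month that started with 120 staff
# and ended with 118.
rate = monthly_turnover(separations=2, start_headcount=120, end_headcount=118)
print(round(rate, 2))  # 1.68
```

Whatever formula your organization adopts, apply it consistently so monthly results remain comparable against the <20% goal.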
Directions: Use this worksheet to develop a performance measure/indicator. A new measure/indicator might be created as part of your overall QAPI monitoring or for a Performance Improvement Project. You will likely want to use existing measures when possible, but there may be times when you want to develop a new measure/indicator that is specific to your needs.
Note: What is the difference between an indicator and a measure? An indicator provides evidence that a certain condition exists but does not clearly identify the situation or issue in any detail. Indicators enable decision-makers to assess progress towards the achievement of intended outputs, outcomes, goals, and objectives. A measure is a stronger reflection of the underlying concept; a more developed and tested way of describing the concept that is being evaluated. However, in practice the two terms are used interchangeably.
NAME OF MEASURE/INDICATOR:
Example: Residents with a completed skin assessment within 12 hours of admission.
PURPOSE OR INTENT FOR MEASURE/INDICATOR:
Example: The purpose of this measure is to make sure our process of completing a skin assessment within 12 hours of admission is done consistently.
MEASURE/INDICATOR TYPE:
__ Structural Measure: Structural measures focus on the fixed characteristics of an organization and its professionals and staff. These measures distinguish between a capability or asset and the activity that may rely on that structure. In addition, structural measures are typically based on the organization or professional as the unit of assessment in the denominator. Example: The extent to which a facility’s use of electronic health records is implemented facility-wide. Numerator = number of departments with EHR; Denominator = number of all departments in the facility.
__ Process Measure: Process measures assess the steps or activities carried out in order to deliver care or services. These measures focus on the action by professionals and staff. Consideration should be given to sample sizes for denominators, exclusion criteria, and alternative processes or work-arounds that may exist. Example: The percentage of newly admitted residents receiving admission skin assessments.
__ Outcome Measure: Outcome measures focus on the product (or outcome) of a process or system of care or services, which can identify different or more complex underlying causes. Example: The rate or incidence of nursing home acquired pressure ulcers.
The measure in the example above (residents with a completed skin assessment within 12 hours of admission) is a process measure.
NUMERATOR:
(i.e., when will a person or event be counted as having met the desired result; this is the top number of the fraction you will calculate)
Example: Any resident with a completed skin assessment within 12 hours of admission. Numerator = 19
DENOMINATOR:
(i.e., what is the total pool of persons or events you will be counting; this is the bottom number of the fraction you will calculate)
Example: All residents admitted in the last month. Denominator = 23
EXCLUSION CRITERIA:
(i.e., is there any reason you would exclude a particular person or event from the denominator count?)
Example: Exclude residents in the nursing home for less than 24 hours, because all assessment data are not yet available. Denominator after exclusions = 20
RESULT CALCULATION:
(i.e., typically expressed as Numerator/Denominator x 100 = rate %)
Example: 19 / 20 x 100 = 95%
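The result calculation above can be checked in a few lines of code. This is a minimal sketch using the worksheet’s example figures; the `measure_rate` function name is invented for illustration.

```python
# Sketch of the worksheet's rate calculation:
# rate % = numerator / (denominator after exclusions) x 100

def measure_rate(numerator, denominator, exclusions=0):
    """Percentage rate after removing excluded cases from the denominator."""
    eligible = denominator - exclusions
    if eligible <= 0:
        raise ValueError("no eligible cases after exclusions")
    return numerator / eligible * 100

# Example figures from this worksheet: 19 residents with a timely skin
# assessment, 23 admissions, 3 excluded for stays under 24 hours.
print(measure_rate(19, 23, exclusions=3))  # 95.0
```

Applying exclusions to the denominator before dividing keeps the rate from being unfairly lowered by cases the measure was never meant to count.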
INDICATOR/MEASURE GOAL:
(i.e., the numerical goal aimed for; may be based on an already-established goal for the particular indicator)
Example: Goal = 100%
INDICATOR/MEASURE THRESHOLD:
(i.e., the minimum acceptable level of performance)
Example: Threshold = 95%
DATA SOURCE:
Example: Medical records, admission skin assessment form
SAMPLE SIZE AND METHODOLOGY:
(i.e., will you measure the total population under study or draw a sample to represent the whole? If sampling, how large will the sample size be? How will you determine the sample?)
Example: The total population admitted in the last month who were in the nursing home for at least 24 hours will be reviewed.
FREQUENCY OF MEASUREMENT:
(i.e., how frequently will the indicator result be calculated: daily, weekly, monthly, quarterly, annually?)
Example: Monthly
DURATION:
(i.e., what is the timeframe for which the data will be collected: the number of cases/events in the past weeks, months, or quarters? This will depend on how frequently cases/events occur.)
Example: Will collect this data for three consecutive months; then, based on findings, will either develop corrective action and continue monitoring monthly, or consider decreasing the frequency of monitoring.