NCT03119532

Brief Summary

Over 40 million major operative procedures are performed in the US annually, accounting for about 40% of healthcare expenditures. Despite decades of research, perioperative mortality and morbidity remain a major healthcare system cost and a detriment to long-term quality of life. More than ten percent of patients experience a significant event such as surgical site infection, reoperation, myocardial infarction, pulmonary embolus, or death. Nearly 100,000 patients die after surgery each year. National data demonstrate a 3-fold variation in risk-adjusted surgical morbidity and mortality, suggesting many opportunities for improvement in perioperative care.

Anesthesiology care demonstrates wide variation in practice. Sometimes this variation is appropriate because the anesthesiologist is responding to patient comorbidities or procedure-specific events. However, even after controlling for patient-specific factors, a substantial amount of variation in fundamental elements of anesthesiology care remains unexplained. The same procedure on the same patient can be managed with completely different anesthetic techniques, hemodynamic management strategies, and medications, and this variation in care can lead to variation in outcome. Electronic health records (EHR) with detailed preoperative and intraoperative data allow an automated system to be developed that notifies clinicians of their compliance with both process-of-care metrics and outcome metrics. The quality improvement arm of the Multicenter Perioperative Outcomes Group (MPOG) is known as the Anesthesiology Performance Improvement Reporting Exchange (ASPIRE). Like other Collaborative Quality Initiatives, the primary goal of ASPIRE is to provide hospitals with risk-adjusted feedback on outcome and process-of-care variation. In addition, ASPIRE creates an active best-practice sharing environment to enable data to spur action.
Recent literature has demonstrated that hospital-level feedback may not be adequate to improve performance and clinical outcomes. In addition to hospital-level data and feedback, ASPIRE can disseminate provider-specific electronic feedback that may decrease variation in care known to impact complications and cost. The primary aim of this research study on ASPIRE's QI program is to determine whether the investigators can change behavior as measured by a provider's compliance with specific performance metrics. The investigators believe that the launch of individual provider performance feedback reports to ASPIRE members presents a unique opportunity to study the efficacy of these novel tools. The investigators propose to test the hypothesis that monthly provider-specific feedback emails on ASPIRE quality metrics over a period of 9 months improve provider compliance, as measured by either a 10% improvement in the Total Performance Score or by moving from below to above the 90% performance threshold in the Total Performance Score Index. Each provider type (faculty, CRNA, resident/fellow) within a hospital participating in ASPIRE will be individually randomized to receive the electronic performance improvement email or not for a total of nine months. No individual at the participating site will see the individualized email compliance reports except the specific provider; only an aggregate of compliance across the entire hospital will be supplied to the chairperson and the quality assurance directors. After the completion of the nine-month randomization period, all providers will receive monthly ASPIRE performance improvement emails. The University of Michigan is the coordinating center and is also participating in this research on the QI program. De-identified patient data will be pulled in aggregate for each provider using the MPOG database, and each provider's performance for each measure will then be sent from ASPIRE to the randomized care provider via email.
The chairperson and quality assurance directors will see only aggregate compliance rates and cannot identify individual compliance rates. Each participating site will obtain its own institutional IRB approval to participate in this study.
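The primary success criterion above can be expressed as a simple classification rule. The sketch below assumes the 10% improvement is a relative gain in the Total Performance Score; the function and variable names are hypothetical and do not come from the study protocol:

```python
def improved(baseline: float, followup: float, threshold: float = 0.90) -> bool:
    """Illustrative sketch of the success criterion described in the summary:
    a provider counts as improved if their Total Performance Score rises by at
    least 10% (assumed relative) or moves from below to at/above the 90%
    performance threshold. Names and the relative-gain reading are assumptions.
    """
    relative_gain = (followup - baseline) / baseline if baseline > 0 else 0.0
    crossed_threshold = baseline < threshold <= followup
    return relative_gain >= 0.10 or crossed_threshold
```

For example, a provider moving from a score of 0.88 to 0.92 would qualify by crossing the 90% threshold even though the relative gain is under 10%.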

Enrollment
672

participants targeted

Timeline
Completed

Started Jul 2015

Status
Completed

Study Timeline

Key milestones and dates

Study Start

First participant enrolled

July 1, 2015

Completed
1.4 years until next milestone

First Submitted

Initial submission to the registry

November 22, 2016

Completed
1 month until next milestone

Primary Completion

Last participant's last visit for primary outcome

January 1, 2017

Completed
Same day until next milestone

Study Completion

Last participant's last visit for all outcomes

January 1, 2017

Completed
4 months until next milestone

First Posted

Study publicly available on registry

April 18, 2017

Completed
Last Updated

April 19, 2017

Status Verified

April 1, 2017

Enrollment Period

1.5 years

First QC Date

November 22, 2016

Last Update Submit

April 17, 2017

Outcome Measures

Primary Outcomes (1)

  • Number of providers with improved compliance with Anesthesia Quality Measures (for all clinical providers) using an email-based, provider-specific feedback system

    Investigating improved bundle compliance for all providers at institutions where all anesthesia care provider types (attendings/residents/CRNAs) were randomized to receive emails.

    9 months

Other Outcomes (6)

  • Number of providers with improved compliance with Anesthesia Quality Measures (where only one set of clinical providers was randomized) using an email-based, provider-specific feedback system

    9 months

  • Number of providers with improved compliance with Anesthesia Quality Measures, among providers who already met the compliance threshold, using an email-based, provider-specific feedback system

    9 months

  • Number of providers with improved compliance with Anesthesia Quality Measures (for all sites except the coordinating center) using an email-based, provider-specific feedback system

    9 months

  • +3 more other outcomes

Study Arms (2)

Receive feedback email

EXPERIMENTAL

Anesthesia care providers who receive monthly feedback emails on the participant's specific quality measures

Other: Receive metric feedback email

Did not receive feedback email

NO INTERVENTION

Anesthesia care providers who did NOT receive monthly feedback emails on the participant's specific quality measures
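The allocation between these two arms, as described in the summary, is a 1:1 individual randomization carried out separately within each provider type (faculty, CRNA, resident/fellow) at a hospital. A minimal sketch, with hypothetical function and arm names not taken from the protocol:

```python
import random

def randomize_providers(providers: dict[str, list[str]], seed=None) -> dict[str, str]:
    """Sketch of 1:1 individual randomization within each provider type.

    `providers` maps a provider type (e.g. "faculty", "crna") to a list of
    provider IDs; each type's list is shuffled and split evenly between the
    feedback-email arm and the no-intervention arm. Names are illustrative.
    """
    rng = random.Random(seed)
    assignment: dict[str, str] = {}
    for provider_type, ids in providers.items():
        ids = list(ids)
        rng.shuffle(ids)
        half = len(ids) // 2
        for pid in ids[:half]:
            assignment[pid] = "feedback_email"
        for pid in ids[half:]:
            assignment[pid] = "no_intervention"
    return assignment
```

Randomizing within each provider type keeps the arms balanced across faculty, CRNAs, and residents/fellows rather than only across the hospital as a whole.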

Interventions

Whether the provider received an email about the participant's performance metrics

Receive feedback email

Eligibility Criteria

Sex: All
Healthy Volunteers: Yes
Age Groups: Child (0-17), Adult (18-64), Older Adult (65+)

You may qualify if:

  • Hospitals currently participating in ASPIRE (https://www.aspirecqi.org/)
  • The quality assurance champion and chairperson have agreed to participate in this quality assurance project.

You may not qualify if:

  • Hospitals not currently participating in ASPIRE

Study Officials

  • Sachin Kheterpal, MD, MBA

    University of Michigan

    STUDY DIRECTOR

Study Design

Study Type
Interventional
Phase
Not Applicable
Allocation
Randomized
Masking
None
Purpose
Other
Intervention Model
Parallel
Sponsor Type
Other
Responsible Party
Principal Investigator
PI Title
Assistant Professor

Data Sharing

IPD Sharing
Will not share