CME: Continuing Missing Education

The landscape. Continuing medical education (CME) has been under constant criticism and evaluation. In late 2018, the Global Education Group released a report titled “CME Crossroads: A Survey of Continuing Medical Education Analysis, Criticism, Research and Policy Proposals.” The impetus for the report was the ongoing debate over the effectiveness of CME, and it draws on the hundreds of articles that have been (and continue to be) published in peer-reviewed journals assessing and sharing educational outcomes from CME activities and curricular initiatives. As physicians are increasingly evaluated on delivering high-quality care under new value-based care paradigms, CME has played an important role in helping them stay current with the latest guidelines. But is CME actually improving patient care, and can we trust that our CME programs are providing unbiased, evidence-based teaching?

We’re not fully convinced. Pharmaceutical and medical device company contributions accounted for 28% of CME funding in 2017, according to the Accreditation Council for Continuing Medical Education’s annual report. Industry donors contributed nearly $740 million to 1,794 accredited CME providers that year, the fourth consecutive annual increase. These CME courses can cost thousands of dollars and are often held in attractive locations to boost turnout, such as Disneyland or ski resorts. Yet despite their cost, these courses lack any accountability mechanism for demonstrating changed physician behavior or improved patient care.

What Does Effective CME Look Like?

Interactive learning and feedback. While most CME offerings do not self-assess their long-term, real-world efficacy, researchers Ronald Cervero and Julie Gaines attempted to do just that in a 2015 systematic review, measuring how CME affects physician performance and patient outcomes. They found that CME activities that are more interactive, use multiple methods, involve repeated exposures, run longer, and focus on outcomes that physicians consider important lead to more positive results. Specifically, interactive methods (audit/feedback, academic detailing, interactive education, and reminders) were the most effective at improving performance and patient health outcomes; clinical practice guidelines and opinion leaders had a moderate effect; and didactic presentations and printed materials alone had little or no beneficial effect on these outcomes.

What Can We Do for The Future of CME?

Other industries may offer a clue. Many industries use simulation for lifelong professional training. For example, aviation simulators allow instructors to replicate in-flight scenarios that are realistic but too dangerous to attempt in a real airplane. Though a full-motion simulator may cost up to $50 million, these machines can be operated for a fraction of the cost of the aircraft they replicate.

It’s time we built simulators for healthcare, so that we value our patients’ lives as highly as we value passengers’ lives. While plane crashes are thankfully rare events, over 250,000 patients are projected to die this year from misdiagnosis.

QURE’s solution, which measures physician practice and provides feedback in a structured-learning CME program, has been shown to improve both physician performance and patient mortality rates. Our simulated patients have been shown to accurately reflect real practice, giving physicians the opportunity to learn interactively over repeated exposures, with results that stick.

Here are some case studies of our recent successes:

·      At Penn Princeton, we reduced COPD mortality rates by 28%

·      At Ochsner, we improved treatment accuracy in cardiology by 21%

Contact us directly at 415-321-3388 or by email to learn more about how our approach of repeatedly measuring physicians’ behavior and providing targeted feedback has helped save patients’ lives and reduce clinical variation and cost.

Othman Ouenes