If you have ever worked in business, been on a sports team, or participated in any kind of group long term, then you will be familiar with “improvement programs.”  And even if you haven’t, you’ve probably read enough about people’s new year’s resolutions on social media to know at least two things:

  1. Improvement can be exciting…
  2. But most people are awful at it!

The quest for improvement can get downright unpleasant if the “regimen” you’ve committed yourself to is too difficult or too focused on what you haven’t yet achieved.

It’s not any different for Enterprise Imaging.

Having a program that encourages radiologists to learn from their co-workers’ work is, at its core, a very good idea, and has led to improved diagnoses and furthered the education of many radiologists.  However, sometimes the improvement program itself needs some improvement, and Peer Review definitely falls short in a few areas.

Promoting the wrong focus

Many critics of the current peer review standard agree it has too narrow a focus on a single metric: how often radiologists agree.  This narrowly defined approach to peer review has some radiologists solely focused on getting, or giving, good agreement scores just so they don’t cause legal or management trouble, rather than actually helping others learn.

Could be more thorough

Simply assigning radiologists to review “x” number of studies in addition to their already stress-inducing workload does not promote mindful, growth-inducing feedback. To be effective, peer review needs a completely new approach: peer learning.

Could be more accurate

The current metric-focused approach to peer review does not promote learning; in fact, the resulting measurements may not even be accurate.  Peer Review, as it stands, only measures how often an imaging team agrees with each other, not how often someone is right or how many more cases someone reads correctly this year compared to last year.  It does not capture improvement or reward excellence.  At best, it only communicates deficiency.

In short, Peer Review appears to be more punitive than helpful.

How medQ is making the shift to “Peer Learning”

To make this shift from Peer Review to a Peer Learning process, medQ has developed a software-guided “Peer Learning” program that aims to change the focus of peer review to a true quality improvement process by making three major changes.

All types of studies can be Review Studies

Under Peer Review, the only studies that were given attention were generally the ones where radiologists disagreed.  The Q/ris Peer Learning program builds on this Peer Review basis but also allows radiologists to identify two additional types of studies for later review: “Good Catch” and “Interesting Case” studies.

The principle here is that radiologists, like all human beings, can learn from more than just their mistakes—they can learn from their wins.

Develop software to assist “Peer Learning” discussions

The shift in focus provided by the two new kinds of studies is already a huge step toward giving Peer Review a more holistic and quality-driven focus.  medQ takes it a step further by giving that new focus some guided direction.

Enter the “Peer Learning Discussion” concept.

This is what really takes us beyond traditional peer review.  The Peer Learning program allows a discussion leader to look over the cases flagged by the radiologists or other members of the imaging staff (or automatically selected by Q/ris Reporting PLUS+) and select those that would most benefit the team when reviewed as a group.  The discussion leader would head a peer learning group to review and analyze the selected cases.

As the discussion leader pulls the study, the software automatically accesses the appropriate RIS, EHR, and PACS to gather the accompanying technologist notes, images, and reports necessary for a complete perspective. The leader then guides the imaging team through a group discussion about the study while asking questions like:

  • What made this a “good catch” or an “interesting case”?
  • What benefit would our team take away from this study?
  • Based on what we discussed, is there anyone we need to congratulate?
  • Based on what we discussed, is there anyone we need to coach?
  • Based on what we discussed, are there protocol changes from which our team, including the technologists, would benefit?
  • What do we need to record to remember the takeaways?

Action items generated by this discussion would be recorded, assigned to specific team members, and then tracked by the group in the Peer Learning Program.

Create an easily accessible learning database

After the group discussion has concluded, a member of the discussion group uses the Peer Learning Program to ensure that a summary of the group’s findings is recorded in a “case study.”  The discussion leader sees to it that the case study is categorized and stored in the organization’s very own “Knowledge Base.”  The Knowledge Base, together with all action items and findings, can then be accessed at any time by the imaging staff for future progress tracking or reference.

In Summary

The integrated Peer Learning Program goes well beyond traditional peer review by taking a positive, quality-driven learning approach to team improvement.  With the addition of tools that facilitate peer learning group discussions and a knowledge base where action items and takeaways can be stored and accessed, the focus has shifted away from simply agreeing, or disagreeing, with one another and instead is about lifting the whole team to a higher quality of care.

The medQ team will be demonstrating the newly integrated Peer Learning Program at booth #3941 at RSNA 2019, so if you are planning on attending RSNA, be sure to stop by the medQ booth for a walkthrough of what true imaging staff improvement can be.  To make sure all of your questions are answered, you can schedule a time to meet with a team member at RSNA by clicking here.

Not going to RSNA? Ask one of our automation consultants about integrated Peer Learning by calling 800-597-6330 or schedule an online demo by clicking here.