Make peer feedback part of your PACS, urges RCR

All radiology information systems (RIS) should have a package that allows peer feedback on reports in a time-efficient fashion, and new RIS/PACS must include an integrated quality assurance (QA) module or provide a facility for electronic integration to a bespoke system, according to new guidance from the U.K. Royal College of Radiologists (RCR).

"Peer feedback should be supported within the clinical governance framework of the department and become part of the radiologists' routine clinical practice; and as such should be recognized in job plans," the authors noted. "A single system of categorizing discrepancies for the purposes of shared learning which is clearly defined and understood should be used."

The authors recommend that all departments aim to implement systematic review of 5% of reports within the next four years, and the RCR intends to audit this in 2015 to assess progress. A QA level of more than 5% may be required for some retrospective audits analyzing specific parameters, e.g., where accurate discordance rates are needed for a specific subgroup of examinations, such as on-call CT.

Peer feedback should be carried out through a number of processes in a structured framework integrated with multidisciplinary team meetings, discrepancy meetings, and clinical audit, they continued. The validity of radiologist scoring of discrepant reports has been increasingly called into question, but categorization of reports can help to highlight common pitfalls, systematic issues that need addressing, up-skilling needs, etc., according to the authors of the new document (Ref No: BFCR[14]10).

"Peer feedback, as part of the QA program, should be used by radiological service providers to monitor and evaluate their own performance, with the aim of improving quality and patient safety," they wrote. "Until the information from such a system is mature, benchmarking between departments is not advocated as there will be a number of confounding variables that could render the process of little benefit and could potentially produce erroneous conclusions."

Furthermore, the QA program should be integrated into the provider organizations' clinical governance process, with particular reference to serious untoward incident reporting.

Abuse of peer feedback, such as using it to bully colleagues, must be avoided; when carried out in a supportive learning environment, peer feedback can improve patient care and contribute to the evidence base for providers and users of the service, noted Dr. Pete Cavanagh, the RCR's vice president for clinical radiology, in his foreword to the document.

"Radiologists, like all doctors, need to demonstrate objective assessment of, and reflection on, their practice," he wrote. "Peer feedback can be used as one of the tools to provide this evidence, but also, more importantly, as a result of an individual's review of discrepancies and adverse events, peer feedback has the potential to change professional practice and, in some instances, job plans."

Cavanagh believes the report sets standards for peer feedback, but accepts this is an evolving process and the guidelines will require reviews and updates. He suggests reading the document in conjunction with two other RCR publications: "Standards for learning from discrepancies meetings," and "Cancer multidisciplinary team meeting -- standards for clinical radiologists."

The 18-page document on the peer feedback guidelines can be downloaded for free from the RCR website.
