ORIGINAL ARTICLE

Year: 2017 | Volume: 8 | Issue: 1 | Page: 13
Performance of a web-based method for generating synoptic reports
Megan A Renshaw1, Scott A Renshaw2, Mercy Mena-Allauca3, Patricia P Carrion4, Xiaorong Mei4, Arniris Narciandi4, Edwin W Gould5, Andrew A Renshaw6
1 Google, NY, USA
2 Northwestern University, Evanston, IL, USA
3 Department of Cancer Services, Baptist Hospital and Baptist Health of South Florida Healthcare System, Miami, FL, USA
4 Department of Information Technology, Baptist Hospital and Baptist Health of South Florida Healthcare System, Miami, FL, USA
5 Department of Pathology, Baptist Hospital and Baptist Health of South Florida Healthcare System, Miami, FL, USA
6 Department of Pathology, Baptist Hospital and Baptist Health of South Florida Healthcare System; Department of Pathology, Baptist Hospital, Miami, FL, USA
Correspondence Address:
Andrew A Renshaw, Department of Pathology, Baptist Hospital, 8900 N Kendall Dr, Miami, FL 33176, USA
Source of Support: None; Conflict of Interest: None
DOI: 10.4103/jpi.jpi_91_16
Context: The College of American Pathologists (CAP) requires synoptic reporting of all tumor excisions.

Objective: To compare the performance of different methods of generating synoptic reports.

Methods: Completeness, amendment rates, rates of timely ordering of ancillary studies (KRAS in T4/N1 colon carcinoma), and structured data file extraction were compared for four different synoptic report-generating methods.

Results: Use of the printed tumor protocols directly from the CAP website had the lowest completeness (84%) and highest amendment (1.8%) rates. Reformatting these protocols was associated with higher completeness (94%, P < 0.001) and reduced amendment (1%, P = 0.20) rates. Extraction into a structured data file was successful 93% of the time. Word-based macros improved completeness (98% vs. 94%, P < 0.001) but not amendment rates (1.5%). KRAS was ordered before sign-out 89% of the time. In contrast, a web-based product with a reminder flag when items were missing, an embedded flag for data extraction, and a reminder to order KRAS when appropriate resulted in improved completeness (100%, P = 0.005), amendment rates (0.3%, P = 0.03), KRAS ordering before sign-out (100%, P = 0.23), and structured data extraction (100%, P < 0.001), without reducing the speed (P = 0.34) or accuracy (P = 1.00) of data extraction by the reader.

Conclusion: Completeness, amendment rates, ancillary test ordering rates, and data extraction rates vary significantly with the method used to construct the synoptic report. A web-based method compares favorably with all other methods examined and does not reduce reader usability.
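The Results describe the web-based product's key mechanisms: a reminder flag when required checklist items are missing, an embedded flag to support structured data extraction, and a conditional prompt to order KRAS. The sketch below is purely illustrative and is not the authors' implementation; the field names, the extraction marker, and the simplified KRAS trigger rule are assumptions made for demonstration.

```python
# Illustrative sketch only; field names, the extraction marker, and the KRAS
# trigger rule are assumptions, not details taken from the article.

REQUIRED_ITEMS = [          # hypothetical subset of CAP colon checklist items
    "Procedure", "Tumor Site", "Histologic Type", "Histologic Grade",
    "Tumor Extent (pT)", "Regional Lymph Nodes (pN)", "Margins",
]

EXTRACTION_FLAG = "##SYNOPTIC##"   # assumed embedded marker for downstream parsing


def missing_items(report: dict) -> list:
    """Return checklist items that are absent or blank (the 'reminder flag')."""
    return [item for item in REQUIRED_ITEMS if not report.get(item, "").strip()]


def needs_kras_reminder(report: dict) -> bool:
    """Assumed rule: prompt KRAS ordering for pT4 or node-positive colon carcinoma."""
    return (report.get("Tumor Extent (pT)", "").startswith("pT4")
            or report.get("Regional Lymph Nodes (pN)", "pN0") not in ("", "pN0"))


def render_synoptic(report: dict) -> str:
    """Emit a delimited, line-based synoptic block suitable for structured extraction."""
    missing = missing_items(report)
    if missing:
        # Block sign-out until every required item is completed.
        raise ValueError(f"Incomplete synoptic report; missing: {missing}")
    lines = [EXTRACTION_FLAG]
    lines += [f"{item}: {report[item]}" for item in REQUIRED_ITEMS]
    lines.append(EXTRACTION_FLAG)
    return "\n".join(lines)
```

In this sketch, the completeness check runs before sign-out, the delimiting marker lets a downstream parser locate and extract the synoptic block as structured data, and the KRAS rule only raises a reminder; ordering itself would remain a pathologist decision.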