Journal of Pathology Informatics
RESEARCH ARTICLE
J Pathol Inform 2019,  10:18

Improving medical students' understanding of pediatric diseases through an innovative and tailored web-based digital pathology program with Philips Pathology Tutor (formerly PathXL)


1 University of Pittsburgh School of Medicine, Pittsburgh, PA, USA
2 UPMC Pathology Informatics, PA, USA
3 Department of Pathology, University of Pittsburgh School of Medicine, UPMC Presbyterian Hospital, Pittsburgh, PA, USA
4 Department of Pathology, University of Pittsburgh School of Medicine, UPMC Children's Hospital of Pittsburgh, Pittsburgh, PA, USA

Date of Submission: 02-Mar-2019
Date of Acceptance: 09-May-2019
Date of Web Publication: 18-Jun-2019

Correspondence Address:
Dr. Jennifer L Picarsic
Department of Pathology, UPMC Children's Hospital of Pittsburgh, 4401 Penn Ave., B260 Main Hospital, Pittsburgh, PA 15224
USA

Source of Support: None, Conflict of Interest: None


DOI: 10.4103/jpi.jpi_15_19

   Abstract 


Background: Online “e-modules” integrated into medical education may enhance traditional learning. Medical students use e-modules during clinical rotations, but these often lack histopathology correlates of diseases, and minimal time is devoted to pathology teaching. To address this gap, we created pediatric pathology case-based e-modules to complement the clinical pediatric curriculum and enhance students' understanding of pediatric diseases. Methods: Philips Tutor is an interactive web-based program in which pediatric pathology e-modules were created with pre-/post-test questions. Each e-module contains a clinical vignette, virtual microscopy, and links to additional resources. Topics were selected based on established learning objectives for pediatric clinical rotations. Pre- and post-tests were administered at the beginning and end of each rotation. The test group had access to the e-modules; the control group did not. Both groups completed the pre- and post-tests, and the posttest was followed by a feedback survey. Results: Overall, 7% (9/123) of the control group and 8% (13/164) of the test group completed both tests and were included in the analysis. The test group improved their posttest scores by about one point on a 5-point scale (P = 0.01); the control group did not (P = 1.00). Students responded that the test questions were helpful in assessing their knowledge of pediatric pathology (90%) and reported relative ease of use with the technology (80%). Conclusions: Students responded favorably to the new technology but cited time constraints as a significant barrier to study participation. Access to the e-modules was associated with improved posttest scores compared to the control group, but the pilot data were limited by the small sample size. Incorporating pediatric case-based e-modules covering anatomic and clinical pathology topics into the clinical medical education curriculum may heighten students' understanding of important diseases. Our model may serve as a pilot for other medical education platforms.

Keywords: Medical education, online modules, pediatric pathology, Philips Tutor, virtual microscopy


How to cite this article:
Chen CP, Clifford BM, O'Leary MJ, Hartman DJ, Picarsic JL. Improving medical students' understanding of pediatric diseases through an innovative and tailored web-based digital pathology program with Philips Pathology Tutor (formerly PathXL). J Pathol Inform 2019;10:18

How to cite this URL:
Chen CP, Clifford BM, O'Leary MJ, Hartman DJ, Picarsic JL. Improving medical students' understanding of pediatric diseases through an innovative and tailored web-based digital pathology program with Philips Pathology Tutor (formerly PathXL). J Pathol Inform [serial online] 2019 [cited 2019 Nov 15];10:18. Available from: http://www.jpathinformatics.org/text.asp?2019/10/1/18/260626




   Introduction


Online modules covering a range of topics have been increasingly integrated into medical education to supplement lectures and optimize learning. They are efficient and effective because they support self-directed learning and present information in a different modality.[1] Case-based online modules are particularly effective for medical students because they are engaging, promote active learning, apply information to clinical scenarios, and reinforce important concepts.[2] Multiple studies have demonstrated that online modules are effective at teaching various clinical topics to medical students and residents and are associated with improved knowledge.[3],[4],[5],[6],[7] However, there are limited studies evaluating the use of online modules to teach pathology to medical students,[7],[8] and none, to our knowledge, specifically addresses pediatric pathology during clinical rotations.

At the University of Pittsburgh School of Medicine (UPSOM), medical students have access to interactive virtual patient cases that reinforce the Council on Medical Student Education in Pediatrics (COMSEP) learning objectives[9] on core pediatric topics during their inpatient and outpatient pediatric clerkships. However, these cases often lack information about the histopathology of diseases, even though the understanding of disease rests on a thorough awareness of the underlying pathology. In addition, minimal time is devoted to pathology during the clinical clerkship rotations, further contributing to a knowledge gap in this area. To address this gap, we used Tutor, an online virtual pathology program, to create online histopathology case-based modules based on the COMSEP learning objectives. The goal of this pilot study was to evaluate, using pretest and posttest assessments, whether exposure to interactive virtual histopathology case-based modules incorporating gross images and whole slide digital images (WSI) would increase medical students' knowledge of key pediatric pathology concepts, as compared to a control group without access to these e-modules. Our hypothesis was that medical students who had access to the Tutor modules would have higher posttest scores than those without access to them.


   Methods


This nonrandomized prospective study was conducted at the UPSOM during the pediatric clerkships from 2017 to 2018. It was approved by the UPSOM Research on Medical Students Review Committee and University of Pittsburgh Institutional Review Board (IRB number PRO16100447).

Participants

UPSOM medical students have two pediatric clerkships: a 4-week pediatric inpatient medicine clerkship and an 8-week combined ambulatory medicine and pediatrics clerkship. Third-year and fourth-year medical students were asked to participate voluntarily in this study after they received a live introduction to the study during their orientations for the pediatric clinical rotations/clerkships. However, students did not have real-time access to the technology during the live introduction. An E-mail with information about the study and instructions on how to participate was sent to medical students after the live introduction. A University of Pittsburgh Medical Center (UPMC) administrator (BC, MO) sent a second E-mail with unique deidentified usernames and passwords and a link to access the tests and modules. Medical students were informed that their participation was voluntary and that test results were confidential and would not affect their pediatric clerkship grades. It was estimated to take 15–30 min to complete all parts of the study. Medical students were divided into two groups: a control group and a test group. The control group did not have access to the Tutor modules, whereas the test group had access to the Tutor modules after completing the pretest. Both groups were required to complete the pretest and posttest for inclusion in the study. To ensure academic equity and prevent one medical school class (i.e., same academic year) from having an academic advantage over another, the control and test groups were not randomized but rather drew medical students from different academic years: the control group comprised medical students rotating from January to April 2017 and May to August 2018, and the test group comprised medical students rotating from May 2017 to August 2018. In addition, both groups consisted of medical students at different points in their clinical years to limit the effect of the timing of the pediatric clerkships on test scores.

Technology

Tutor (Philips Pathology, Amsterdam, Netherlands), formerly PathXL, is an internet-based virtual microscopy program licensed through the UPSOM Department of Pathology, Division of Informatics. It is a user-friendly, secure, password-protected program behind the UPMC firewall. A username and password are required to log into the website and access the program at https://pathxl.upmc.com. It is free for medical students to use, and there are no programs to download. Tutor is accessible on both PC and Macintosh desktops, laptops, and tablets but is not enabled for smartphone devices.

Module design

Five online pediatric pathology case-based modules were created in Tutor. High-resolution WSIs were scanned and uploaded to Tutor at ×20 magnification. Each module, created by two of the authors (JP, CC), contained a clinical vignette with a deidentified history, macroscopic (gross) images, and digital histopathology slides. Detailed descriptions of the macroscopic pathology images and the digital histopathology slides stained with hematoxylin and eosin (H and E) were provided, along with explanations of differential diagnoses and links to relevant articles for additional reading. Certain cases also had additional pertinent stains to enhance insight into the diagnosis of the disease, each with slide-guided annotations. These included WSI of histochemical stains (i.e., elastic trichrome and periodic acid-Schiff [PAS] stains) as well as uploaded JPEG images of immunofluorescence and electron microscopy (EM) for certain cases. By virtue of the general case-based approach, many of the cases also contained elements of laboratory testing, highlighting aspects of clinical pathology integrated with anatomic pathology [Supplemental Text]. Users navigated the WSI by scrolling, with the ability to magnify images up to ×20, as if they were using a virtual microscope. Thumbnail images maintained the orientation of the slides, and slide-specific annotations were created to direct users to areas of interest. [Figure 1] and [Figure 2] show WSI and macroscopic (gross) images for one of the cases (lungs from the cystic fibrosis case, with bronchiectasis, removed at the time of transplantation).
Figure 1: Screenshots of whole slide digital images for a cystic fibrosis case with annotations highlighting areas of interest for one of the e-modules with Tutor. (a) Digital hematoxylin and eosin stain. (b) Digital elastic trichrome stain

Figure 2: Screenshot of macroscopic (gross) images of lungs for a cystic fibrosis case with annotation and descriptions provided in the e-modules with Tutor



Module topics

The module topics were selected based on an established list of pediatric topics and learning objectives set by the COMSEP[9] and the UPSOM pediatric clerkship directors. The topics complemented the online case-based modules of Aquifer Pediatrics (Lebanon, New Hampshire, USA)[10] that UPSOM medical students are required to complete during the pediatric clerkships. Aquifer Pediatrics has several interactive virtual patient cases that correlate with the COMSEP clerkship curriculum learning objectives to develop clinical reasoning skills.[9],[10] The selected topics and their differential diagnoses were developed into five Tutor modules by a board-certified pediatric pathologist (JP) and a UPSOM medical student (CC). Medical students were expected to know these topics and learning objectives after finishing their pediatric clerkships; the selected cases were therefore ones that students may not have had direct clinical exposure to during their rotations.

Module case design

The cases presented in the modules were based on five deidentified patient cases evaluated by the Department of Pathology at UPMC Children's Hospital of Pittsburgh. They included the following: lung foreign body (diagnosis: aspiration with endobronchial lung biopsy), vasculitis (diagnosis: Henoch–Schönlein purpura with skin biopsy with IgA immunofluorescent [IF] image and ancillary renal biopsy), toxic ingestion (diagnosis: acetaminophen overdose with liver transplant), failure to thrive with lung disease (diagnosis: cystic fibrosis with lung transplant), and glomerulopathies (diagnosis: minimal change disease with renal biopsy and EM image). Additional details of the cases are provided herein.

Case 1 was a pediatric patient with delayed gastric emptying who aspirated a vegetable bezoar; a gross picture was provided in the module. WSI of an endobronchial lung biopsy demonstrating a foreign body giant cell reaction, along with WSI and descriptions of PAS, elastic trichrome, and cytokeratin 7 stains, were also provided.

Case 2 was a pediatric patient who presented with purpura, proteinuria (urine protein-to-creatinine ratio of 1.9), and microscopic hematuria and was diagnosed with Henoch–Schönlein purpura; a skin biopsy WSI and accompanying IgA IF image were provided. Clinical pictures of the purpura localized to the finger and ankle were included in the module, along with ancillary WSI of PAS, Jones (silver), and Masson trichrome stains and EM of the kidney, with descriptions of each to highlight the nephropathy changes in such a case.

Case 3 was a pediatric patient who presented with abdominal pain and vomiting and was found to have significantly elevated liver enzymes (AST/ALT), bilirubin, and alkaline phosphatase after acetaminophen overdose/toxicity; the patient developed fulminant liver failure requiring a liver transplant. A macroscopic image (liver at transplant), WSI of the native necrotic liver, and WSI with periodic acid–Schiff–diastase (PAS-D) and reticulin stains were provided in the module.

Case 4 was a pediatric patient who presented with chronic cough, recurrent pulmonary infections, and nasal polyps and had been clinically diagnosed with cystic fibrosis as an infant with meconium ileus and a positive sweat test. Gross images of the native lung resection at the time of transplantation and WSI demonstrating bronchiectasis with mucus plugs were provided in the module.

Case 5 was a pediatric patient who presented with periorbital edema, proteinuria (>3.5 g/24 h), low albumin, and hyperlipidemia and was diagnosed with minimal change disease on renal biopsy. The WSI of the kidney biopsy with PAS and Jones (silver) stains showed no significant changes, while the EM image revealed diffuse podocyte foot process effacement. A normal renal EM image was also provided for comparison.

The e-modules' discussion sections covered the pathology-specific differential diagnoses associated with each of these cases. Ancillary reading materials were also provided for the cases.

Pretest/posttest

A multiple-choice examination with five questions was created and administered using Tutor. The questions were based on the five online histopathology case-based modules and their differential diagnoses. Each question included a clinical presentation associated with a WSI. The digital slides also included annotations to direct students to areas of interest, and selected cases had an ancillary image (i.e., IF or EM). The tests were used to evaluate medical students' pediatric pathology knowledge before and after their pediatric clinical rotations (with at least a 3–4-week “washout” period between tests) for both the test group (i.e., with exposure to the Tutor e-modules) and the control group (i.e., no exposure to the Tutor e-modules). The pretest was open at the beginning of the clerkship for 1 week, and the posttest was open at the end of the clerkship for 1 week. The pretest and posttest questions were identical. Correct answers were not provided after the pretest. After completing the posttest, students who selected an incorrect answer were given explanations of both the incorrect and correct answers, while students who selected the correct answer were given only the correct answer explanation. In addition, a more detailed overview of the disease entity, including relevant pathophysiology with clinical and anatomic pathology details, differential diagnosis, and references for further study, was also provided [Supplemental Text].

Medical student surveys

An exit-interview survey with seven questions was created and administered using Tutor after completion of the posttest so that medical students could evaluate the quality of the test questions, the cases, and the Tutor program on a 5-point Likert scale[11] with the following options: strongly disagree, disagree, neither disagree nor agree, agree, and strongly agree. Both groups had access to the survey after the posttest.

After completion of the entire study, an additional exit survey was created and administered using SurveyMonkey (San Mateo, CA, USA) to further examine the participation rate. This final exit survey helped capture medical students who had not completed the posttest, because the exit-interview survey was linked to completion of the posttest, which could have contributed to questionnaire fatigue. Both surveys were anonymous and voluntary.

Statistical analysis

Data were analyzed using GraphPad Prism 8 (San Diego, CA, USA). Results were presented as mean and median test scores (0–5 total points) and mean percentages of correct answers (0%–100%). A paired t-test was used to compare pretest and posttest scores within each group, and an unpaired t-test was used to compare pretest and posttest scores between groups. Fisher's exact test was used to analyze the number of medical students who performed worse or the same/better on the posttest compared to the pretest. Exit-interview survey data were presented as mean, standard deviation, and median. An unpaired t-test was used to compare mean survey ratings between groups. The frequency of ratings on the survey was combined into two categories: strongly disagree/disagree and neutral/agree/strongly agree. Fisher's exact test was used to compare the frequencies of these two categories between groups.
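Although the analysis was performed in GraphPad Prism, the same tests can be sketched in Python with SciPy, as a minimal illustration of the workflow described above. The pretest/posttest score lists below are hypothetical placeholders, not the study data; only the 2×2 table uses the worse versus same/better counts implied by [Table 2] (3 of 9 control, 1 of 13 test).

from scipy import stats

# Hypothetical example scores (0-5 scale); placeholders, NOT the study data
test_pre = [3, 4, 3, 4, 3, 4, 3, 4, 3, 4, 3, 3, 4]
test_post = [4, 5, 4, 5, 4, 5, 4, 4, 4, 5, 4, 4, 4]
control_post = [4, 3, 3, 4, 2, 4, 3, 4, 3]

# Paired t-test: pretest vs. posttest within the same group of students
t_within, p_within = stats.ttest_rel(test_pre, test_post)

# Unpaired t-test: posttest scores compared between control and test groups
t_between, p_between = stats.ttest_ind(control_post, test_post)

# Fisher's exact test on worse vs. same/better posttest performance,
# rows = [control, test], columns = [worse, same/better] (counts implied by Table 2)
odds_ratio, p_fisher = stats.fisher_exact([[3, 6], [1, 12]])

print(f"paired P={p_within:.2f}, unpaired P={p_between:.2f}, Fisher P={p_fisher:.2f}")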


   Results


Pretest/posttest results

There were 123 eligible medical students in the control group and 164 eligible medical students in the test group. Twenty-six (21%) medical students in the control group and 35 (21%) medical students in the test group completed the pretest. Of those completing the pretest, 9 (35%) in the control group and 13 (37%) in the test group also completed the posttest [Figure 3]. Overall, 7% (9/123) in the control group and 8% (13/164) in the test group completed both tests and were included in the analysis.
Figure 3: Study flow diagram of the participants



The pilot results showed that the test group had slightly higher posttest scores (mean: 4.31/5) compared to the control group (mean: 3.38/5), and the test group improved their pretest-to-posttest score by about one more correct answer (P = 0.01) [Table 1] and [Figure 4], while no significant difference was seen in the control group (P = 1.00) [Table 1] and [Figure 4]. Furthermore, 33% of the control group performed worse on the posttest compared to the pretest, while only 8% of the test group performed worse on the posttest (P = 0.26) [Table 2].
Table 1: Comparison of mean pretest and posttest scores for control and test groups

Figure 4: Comparison of pretest and posttest scores for control and test groups as mean percentage of correct answers with standard deviation. *The test group's mean percentage of correct answers on the posttest (86%) was significantly higher than on the pretest (68%) (P = 0.01, t-test). ns: The control group's mean percentages of correct answers on the pretest (76%) and posttest (76%) were not significantly different (P = 1.00, t-test)

Table 2: Number of medical students who performed worse or same/better on the posttest as compared to the pretest for control and test groups



Survey results

Of those enrolled in the study, 8 of 9 (89%) in the control group and 12 of 13 (92%) in the test group completed the exit-interview survey. One medical student in the control group skipped the second question. There were no significant differences in the mean survey ratings on a 5-point Likert scale between groups [Supplemental Table 1].

The survey ratings were combined into two categories: opposed/negative (i.e., strongly disagree/disagree) and neutral/positive (i.e., neutral/agree/strongly agree) [Table 3]. The majority responded neutrally/positively that the test questions improved their understanding of pediatric diseases (75%), that the test questions were helpful in assessing their knowledge of pediatric pathology (90%), and that the Tutor program was relatively easy to use (80%). The majority also felt that the quizzes complemented some of the pediatric diseases covered in the required online clinical modules (74%) and that the quizzes were/will be helpful when studying for the pediatric shelf examination (70%). However, there were no statistically significant differences between the test and control groups in how they rated these questions [Supplemental Table 1] and [Supplemental Table 2]. One exception was the slightly more favorable response given by the test group when rating whether the pre-/post-tests improved their understanding of pediatric diseases (P = 0.05) [Supplemental Table 1]. Medical students reported that they mostly used review books (65%), question banks (100%), and websites (50%) to study for the pediatric shelf examination. In contrast, fewer medical students reported using textbooks (10%), syllabi (10%), the Tutor tests (5%), and other online modules (20%) as supplemental study aids.
Table 3: Survey ratings presented as frequency of ratings, mean, standard deviation, and median of 5-point Likert scale



Of the eight medical students who completed the final exit survey, four had participated in the study and four had not. Of the four who did not participate, three reported that they did not do so because they perceived it would take too much time and because there were too many required assignments during the pediatric clerkships. In addition, three reported that they would have participated in the study if they had been provided dedicated time during the clerkship.


   Discussion


This pilot study demonstrates that tailor-made online “e-modules” with digital pediatric pathology slides, macroscopic images, and a related clinical case-based design can serve as a useful ancillary educational tool for exposing medical students to pediatric pathology during their clinical rotations. However, during a busy clinical rotation, either dedicated time or a mandatory requirement must be put in place for students to engage with such technology. Our pilot data, which included a control group, suggest that such online learning could help improve medical students' understanding of key pediatric diseases with relative ease of use. However, while these pilot data are encouraging, we interpret them cautiously given the very low participation rate (<10%) and the high attrition rate among students who started the pretest but did not finish the posttest. Thus, while the data suggest that access to the e-modules may enhance specific pathology-centered knowledge, the primary outcome was based on test scores in Tutor, and we did not have access to medical students' pediatric National Board of Medical Examiners shelf examination scores, making it difficult to infer whether improved pathology knowledge affected end-of-rotation examination scores.

A majority of medical students (≥70%) reported neutral or favorable ratings on the ease of use of Tutor and its ability to help improve their understanding of, and assess their knowledge in, pediatric disease and pathology, while also complementing their clinical teaching and other study aids for pediatric diseases. The test group's rating of whether the pre-/post-tests improved their understanding of pediatric diseases was slightly higher than the control group's and approached statistical significance, suggesting that exposure to the e-modules may have increased their favorability toward, and confidence in answering, the posttest questions. Of note, 70% of medical students thought that the Tutor tests would be helpful when studying for the pediatric shelf examination, but only 5% actually reported using them. Although the sample size was small due to a low participation rate, these data suggest that the Tutor modules could be an effective supplemental educational tool during the pediatric clinical rotation, if dedicated time to engage with the new technology is provided to students. Furthermore, the results also suggest that students feel favorably toward the new pediatric pathology educational technology, but regular engagement requires a supportive environment with dedicated time investment and established expectations set forth by educational leadership.

The strengths of our study are the inclusion of a pretest to establish baseline knowledge and a control group against which to compare the effect of exposure to the Tutor modules in the test group. Of note, the baseline pretest scores were not significantly different between groups, suggesting that both groups had a similar level of baseline knowledge. While the improved posttest scores in the test group are mainly attributed to exposure to the Tutor modules, we recognize that a multitude of other factors could be confounding, especially given the low number of participants and the few questions asked in the study. Nevertheless, the small but significant difference between the pre-/post-test results of the control and test groups is encouraging. We assumed that the test group reviewed all five e-modules; however, we were unable to track students' actual progress through and completion of all five e-modules, given the voluntary nature of participation and the inability to track this more granular detail on a case-by-case basis (other than posttest completion), which is a limitation of our study design and technology. Other programs[12] have the ability to track user date and time of access, which can help gauge student interactions with the technology. Since the tests were online and not proctored, we cannot be certain whether the test group was more inclined than the control group to use external resources to help answer the questions, even though they were advised not to. The timing of the pediatric clinical rotations could also have affected test scores because medical students may learn about or be exposed to the diseases covered in the modules during other clerkships or electives. To circumvent this, we closely aligned the control and test groups to include medical students at both the beginning and end of their clinical years. Finally, there was a risk of pretest sensitization because the pretest and posttest had identical questions, but recall bias was reduced by providing answers only after the posttest, along with a washout period of at least 3–4 weeks between the pre- and post-tests.

The main limitation of our study was the small sample size due to a low participation rate, with fewer than 10% of eligible medical students completing both tests. Given the limited data available, the results may have been skewed, making it difficult to draw definitive conclusions about the efficacy of the e-modules as a learning tool. In addition to the overall low participation in both the control and test groups, there was a high attrition rate, as only about one-third of those who started the study (i.e., completed the pretest) also completed the posttest (35% control group; 37% test group). Low participation rates due to time constraints have been reported in a previous study on pediatric emergency medicine online modules for residents[13] and are similar to what we experienced in our series. The barriers to participation included voluntary participation; Tutor only being accessible through a desktop computer, laptop, or tablet (not smartphones); and no dedicated time allotted during the clinical clerkship orientation (i.e., for pretest completion) or during the rotation (i.e., for module and posttest completion). Lack of familiarity with the program when first engaging with it may also have been a barrier to initial participation. To help circumvent this, a live demonstration was provided by a pediatric pathologist (JP) during orientation, but limited time prevented formal instruction with medical students logged into their individual profiles. Based on the final exit survey, 75% of medical students reported that they did not participate due to lack of time, which supported what we had anticipated and heard through anecdotal grievances. During the pediatric clerkships, medical students are currently required to complete several online modules in addition to their clinical responsibilities and studying for shelf examinations, making them less inclined to voluntarily participate in an additional activity, especially one without proven impact on their learning. Student suggestions in the survey to overcome the barriers to participation included making participation mandatory, incorporating it into the curriculum with dedicated time during the clerkship, and including it in the clerkship grade or offering extra credit. There was also an overall favorable response regarding the ease of use of the Tutor technology. Furthermore, while there was an overall low participation rate in the study (8%) for both groups, there was higher initial engagement (21%) with the technology in both groups during the pretest. The overall high attrition rate in completing the study could have been reduced with any of these student suggestions. Taken together, these findings may indicate that the failure to finish the study was related more to a lack of dedicated time in the rotation than to a lack of interest in the new technology.

To address the low participation rate in the future, we recommend incorporating the online histopathology case-based modules into the pediatric clerkship curriculum, with at least 15–20 min of dedicated time during orientation and at the end of the rotation devoted to pretest and posttest completion. We also recommend 1 h of dedicated, proctored engagement with the modules, in which all medical students can access and review the modules and ask questions, building the awareness and familiarity needed to further engage with the modules on their own time as a structured study aid.

Our pilot study expands on similar studies that demonstrated the benefits of online modules and virtual microscopy for medical education.[7],[8],[12],[14] The advantages of online resources include convenience, accessibility, interactivity, and individualized learning. A meta-analysis showed that internet-based learning is associated with positive outcomes compared to no intervention and is as effective as traditional teaching methods.[14] Virtual microscopy has been used to teach histology, pathology, and cytology in medical education and pathology training.[15] In addition, virtual microscopy has been shown to be as effective as conventional microscopy for learning pathology.[12] A study conducted at UPSOM evaluated the Virtual Pathology Instructor, a patient simulation tool built with vpSim (UPSOM Laboratory for Educational Technology, Pittsburgh, PA, USA), to teach pathology to medical students during the hematology course in the 2nd-year medical student curriculum.[7] Medical students had significant improvement in test scores for two of the three cases, indicating that patient cases with virtual pathology images are an effective tool for teaching pathology in medical education.[7] One study that assessed multiple pathology teaching methods found that online modules integrated with virtual microscopy and clinical cases were very useful for learning and were positively received by medical students.[8] However, neither study had a concurrent control group as in our study.

There are limited studies on the integration of virtual microscopy with online histopathology case-based modules to teach pathology to medical students during clinical clerkships. Furthermore, few studies included a control group with no educational intervention, and none focused on pediatric disease concepts, which are heavily nonneoplastic in nature. Our study contributes additional data to support the use of online modules and virtual microscopy as a supplemental pathology resource in the clinical clerkship curriculum. We also provide a case-based question example as future educational material for the online community [Supplemental Text].

In summary, Tutor is an educational tool for teaching pathology that incorporates novel, tailor-made questions, macroscopic images, and WSI (“virtual microscopy”) into interactive, self-directed, and structured e-modules. In addition, integrating clinical vignettes and laboratory data into these modules may help reinforce medical students' understanding of the role of “pathologists” in patient care, extending from anatomic to clinical pathology. Tutor and similar programs can be used to teach pathology to medical students efficiently, as minimal time is dedicated to pathology education, especially during clinical clerkships; however, a small time investment allotted to medical students for proctored group exploration of these new “study tools” would be beneficial in reducing barriers to engagement.


   Conclusions


Using innovative online technology with WSI and tailored pediatric pathology modules (i.e., Philips Tutor), and comparing control and test groups, we have demonstrated that medical students may improve their knowledge of pediatric pathology. Medical students responded favorably to the new technology in both content and ease of use and recommended strategies that would help boost their engagement with the technology in the future. Therefore, incorporating histopathology case-based virtual microscopy e-modules into the medical school curriculum as a supplemental resource may enhance medical students' knowledge of important, selected diseases. However, engagement requires a supportive environment, with dedicated time investment and established expectations set forth by educational leadership, for students to actively engage with such technology. Our pediatric pathology model in the pediatric clinical rotations may serve as a pilot for its use in other clinical clerkships, medical school courses, and graduate medical education.

Acknowledgment

The authors would like to thank Ms. Marlynn Haigh (pediatric medical education coordinator), the Pediatric Inpatient Medicine Clerkship Directors, and the Combined Ambulatory Medicine and Pediatric Clerkship Directors, including Dr. Aimee Biller, at the UPMC Children's Hospital of Pittsburgh and University of Pittsburgh School of Medicine, for their organizational assistance in the study.

Financial support and sponsorship

Nil.

Conflicts of interest

There are no conflicts of interest.

 
   References

1. Khasawneh R, Simonsen K, Snowden J, Higgins J, Beck G. The effectiveness of e-learning in pediatric medical student education. Med Educ Online 2016;21:29516.
2. Mounsey A, Reid A. A randomized controlled trial of two different types of web-based instructional methods: One with case-based scenarios and one without. Med Teach 2012;34:e654-8.
3. Craddock MF, Blondin HM, Youssef MJ, Tollefson MM, Hill LF, Hanson JL, et al. Online education improves pediatric residents' understanding of atopic dermatitis. Pediatr Dermatol 2018;35:64-9.
4. Chang TP, Pham PK, Sobolewski B, Doughty CB, Jamal N, Kwan KY, et al. Pediatric emergency medicine asynchronous e-learning: A multicenter randomized controlled Solomon four-group study. Acad Emerg Med 2014;21:912-9.
5. Guarner J, Burd EM, Kraft CS, Armstrong WS, Lenorr K, Spicer JO, et al. Evaluation of an online program to teach microbiology to internal medicine residents. J Clin Microbiol 2015;53:278-81.
6. Morgulis Y, Kumar RK, Lindeman R, Velan GM. Impact on learning of an e-learning module on leukaemia: A randomised controlled trial. BMC Med Educ 2012;12:36.
7. Craig FE, McGee JB, Mahoney JF, Roth CG. The virtual pathology instructor: A medical student teaching tool developed using patient simulator software. Hum Pathol 2014;45:1985-94.
8. Gopalan V, Kasem K, Pillai S, Olveda D, Ariana A, Leung M, et al. Evaluation of multidisciplinary strategies and traditional approaches in teaching pathology in medical students. Pathol Int 2018;68:459-466.
9. Curriculum Competencies and Objectives; 2019. Available from: https://www.comsep.org/curriculum-competencies-and-objectives/. [Last accessed on 2019 Jan 21].
10. Aquifer Pediatrics; 2019. Available from: https://www.aquifer.org/courses/aquifer-pediatrics/. [Last accessed on 2019 Jan 21].
11. Likert R. A technique for the measurement of attitudes. Arch Psychol 1932;22:55.
12. Ordi O, Bombí JA, Martínez A, Ramírez J, Alòs L, Saco A, et al. Virtual microscopy in the undergraduate teaching of pathology. J Pathol Inform 2015;6:1.
13. Little-Wienert K, Hsu D, Torrey S, Lemke D, Patel B, Turner T, et al. Pediatric emergency medicine online curriculum improves resident knowledge scores, but will they use it? Pediatr Emerg Care 2017;33:713-7.
14. Cook DA, Levinson AJ, Garside S, Dupras DM, Erwin PJ, Montori VM, et al. Internet-based learning in the health professions: A meta-analysis. JAMA 2008;300:1181-96.
15. Pantanowitz L, Szymas J, Yagi Y, Wilbur D. Whole slide imaging for educational purposes. J Pathol Inform 2012;3:46.

