Journal of Pathology Informatics
ORIGINAL ARTICLE
J Pathol Inform 2017;8:10

RecutClub.com: An open source, whole slide image-based pathology education system


Department of Pathology and Genomic Medicine, Houston Methodist Hospital, Weill Cornell Medical College of Cornell University, Houston, TX 77030, USA

Date of Submission: 23-Sep-2016
Date of Acceptance: 18-Jan-2017
Date of Web Publication: 10-Mar-2017

Correspondence Address:
S Wesley Long
Department of Pathology and Genomic Medicine, Houston Methodist Hospital, 6565 Fannin Street, R6.116, Houston, TX 77030
USA

Source of Support: None, Conflict of Interest: None


DOI: 10.4103/jpi.jpi_72_16

Abstract

Background: Our institution's pathology unknown conferences provide educational cases for our residents. However, the cases have not been previously available digitally, have not been collated for postconference review, and were not accessible to a wider audience. Our objective was to create an inexpensive whole slide image (WSI) education suite to address these limitations and improve the education of pathology trainees. Materials and Methods: We surveyed residents regarding their preference between four unique WSI systems. We then scanned weekly unknown conference cases and study set cases and uploaded them to our custom-built WSI viewer located at RecutClub.com. We measured site utilization and conference participation. Results: Residents preferred our OpenLayers WSI implementation to Ventana Virtuoso, Google Maps API, and OpenSlide. Over 16 months, we uploaded 1366 cases from 77 conferences and ten study sets, occupying 793.5 GB of cloud storage. Based on resident evaluations, the interface was easy to use and demonstrated minimal latency. Residents are able to review cases from home and from their mobile devices. Worldwide, 955 unique IP addresses from 52 countries have viewed cases on our site. Conclusions: We implemented a low-cost, publicly available repository of WSI for resident education. Our trainees are very satisfied with the freedom to preview either the glass slides or WSI and review the WSI postconference. Both local users and worldwide users actively and repeatedly view cases in our study set.

Keywords: Digital pathology, pathology education, whole slide image


How to cite this article:
Christensen PA, Lee NE, Thrall MJ, Powell SZ, Chevez-Barrios P, Long S W. RecutClub.com: An open source, whole slide image-based pathology education system. J Pathol Inform 2017;8:10

How to cite this URL:
Christensen PA, Lee NE, Thrall MJ, Powell SZ, Chevez-Barrios P, Long S W. RecutClub.com: An open source, whole slide image-based pathology education system. J Pathol Inform [serial online] 2017 [cited 2017 May 25];8:10. Available from: http://www.jpathinformatics.org/text.asp?2017/8/1/10/201918


Introduction


A classic method of assessing pathology trainees' knowledge and skill is oral interrogation at an “unknown” conference. Such a conference is built around a set of case histories and glass slides stored at a central location, which trainees preview to create a list of potential diagnoses. A moderator subsequently presents either static images of these slides or reviews them in real time, and each trainee offers a differential and favored diagnosis for each case. Once the final diagnosis is revealed and the conference has ended, the moderator leaves with the presentation and the glass slides, the trainees leave with their differentials, and the institution has lost the educational content of the discussion.

The current methodology for unknown conferences has no easy mechanism for anonymously collecting residents' differential diagnoses, no inherent mechanism for storing and collating the cases presented, and creates a lecturing model with little preconference learner input. In addition, this model cannot assess individual microscopic examination processes, such as what each user previewed, where they focused, and how long they took. The cases cannot be physically previewed outside the hosting institution. Finally, the approach is not conducive to convenient postconference case review by individual trainees, thereby minimizing the chance of long-term retention. Our hypothesis was that if we created an inexpensive whole slide image (WSI) repository of educational cases, our residents would use it as a tool for preconference preview and postconference review.

Herein, we describe our experience iteratively building a low-cost system to address these unknown conference deficiencies.


Subjects and Methods


This study was approved by our Institutional Review Board (IRB #Pro00013561).

For a 4-week trial period, we e-mailed a voluntary and anonymous Google Forms survey to residents requesting that they report their experience with four different WSI viewing systems. Each survey contained case histories, fields for entering differential diagnoses, and links to the WSI viewing system (Ventana Virtuoso,[1] OpenSlide,[2] Google Maps viewer,[3] or OpenLayers[4] viewer) used for that week's conference. The survey concluded with questions regarding that week's WSI viewing system, formulated as a five-point Likert scale questionnaire (1 = strongly disagree, 2 = disagree, 3 = neutral, 4 = agree, 5 = strongly agree) [Table 1].
Table 1: Survey questions used during the evaluation period



The Ventana Virtuoso viewer consists of a vendor-installed server and web-based portal requiring login credentials to access the cases. For the OpenSlide viewer, we placed the Java Runtime Environment (JRE), the OpenSlide application, and the WSI files on a shared network drive, so that trainees could access these resources over the network rather than installing the JRE or copying the WSI files onto their own workstations. The Google Maps viewer was a custom-built viewer based on the Google Maps API, configured to access static pretiled JPEG images stored on a local intranet server. Apart from being completely self-contained, the OpenLayers viewer functioned exactly like the Google Maps viewer. For all viewers, trainees previewed the cases both at the hospital and from home while connected to the institutional virtual private network (VPN).

After preliminary results demonstrated that the trainees preferred the OpenLayers viewer [Figure 1], we further developed it using available open source software [Table 2]. To easily support mobile devices and eliminate the need to log in through our institution's VPN, we migrated the system onto a cloud-based hosting service.
Figure 1: Mean user ratings for the virtual slide viewing systems. Maps web viewer combines the Google Maps and OpenLayers responses

Table 2: Services utilized during development



We scanned weekly unknown conference slides and slides from curated educational study sets. Due to feedback about the lack of nuclear detail at ×20, we predominantly scanned cases at ×40. We built image conversion software to upload 1024 × 1024 pixel JPEG image tiles onto cloud storage and stored case metadata in a database. The trainee was then able to interpret the WSI through the OpenLayers web interface [Figure 2]. The image conversion software can read TIFF, JPG, JP2, BIF (Ventana), SVS (Aperio), SCN (Leica), and NDPI (Hamamatsu) files[5] and runs on a 64-bit version of Windows 7 with a Xeon E3-1241 v3 processor, 16 GB of RAM, and a 1 TB hard drive.
Figure 2: Workflow from glass slide to website

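Conceptually, the conversion step opens each WSI, walks the full-resolution image in 1024 × 1024 pixel windows, and writes each window out as a JPEG tile. The sketch below shows this in Python using the openslide-python and Pillow libraries; the authors' conversion software is custom, so the single-level loop, x/y directory layout, and function name here are illustrative assumptions rather than their actual implementation.

```python
# Minimal single-level tiler: cut a WSI into 1024x1024 JPEG tiles.
# A sketch only; the directory layout (x/y.jpg) and quality setting
# are assumptions, not the authors' actual implementation.
import os
import openslide

TILE = 1024

def tile_slide(slide_path: str, out_dir: str, quality: int = 70) -> None:
    slide = openslide.OpenSlide(slide_path)  # reads SVS, SCN, NDPI, BIF, ...
    width, height = slide.dimensions         # full-resolution (level 0) size
    for x in range(0, width, TILE):
        for y in range(0, height, TILE):
            # read_region returns an RGBA PIL image in level-0 coordinates
            region = slide.read_region((x, y), 0, (TILE, TILE)).convert("RGB")
            col_dir = os.path.join(out_dir, str(x // TILE))
            os.makedirs(col_dir, exist_ok=True)
            region.save(os.path.join(col_dir, f"{y // TILE}.jpg"),
                        "JPEG", quality=quality)
    slide.close()
```

A maps-style viewer needs tiles at every zoom level, not just level 0; openslide-python's openslide.deepzoom.DeepZoomGenerator can enumerate the full pyramid in the same fashion.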


For the next 12 unknown conferences, we used LimeSurvey to record submitted differential diagnoses, the time taken to proceed through each case, the total time to complete the survey, and an optional question on the user's training level (PGY1-4, fellow). We calculated statistics and P values using Microsoft Excel's built-in functions and two-sample t-tests.
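For readers reproducing the analysis outside of Excel, the comparison of preparation times is a standard two-sample t-test; below is a minimal sketch in Python with SciPy, in which the minute values are hypothetical placeholders for illustration, not the study data.

```python
# Two-sample (Welch's) t-test comparing preparation times, standing in
# for the Excel analysis; the values below are hypothetical placeholders.
from scipy import stats

pgy1_minutes = [62, 48, 57, 51, 58, 50]    # hypothetical PGY1 prep times
pgy2_4_minutes = [40, 33, 38, 35, 42, 34]  # hypothetical PGY2-4 prep times

t_stat, p_value = stats.ttest_ind(pgy1_minutes, pgy2_4_minutes, equal_var=False)
print(f"t = {t_stat:.2f}, P = {p_value:.4f}")
```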

To attract worldwide interest and share our educational resources with pathologists outside our institution, we shared the web address (http://www.RecutClub.com/) on a Facebook pathology interest group and on Twitter.


Results


Initial whole slide image system evaluation results

A majority of residents surveyed during the 4-week pilot preferred the maps-style web viewers (Google Maps or OpenLayers) to Ventana Virtuoso or OpenSlide [Figure 1]. The maps-style web viewers earned the highest average survey scores for ease of use (4.8 vs. 3.7 and 1.7) and latency (4.8 vs. 4.0 and 0.6). Free-text comments touted their speed and ease of use: “Amazing with Google-Earth-like speed,” “This was very easy and impressive,” and “This version of digital slides is the best out of all options we have tried over the last few weeks.”

Preconference survey results

For each conference, we invited 20 residents and 16 fellows to submit a preconference differential diagnosis. Excluding clinical pathology-only residents and fellows, as well as occasional trainees on away rotations, we anticipated 20 responses per conference. There were 106 survey submissions from 16 conferences, with a per-conference mean of 7 submissions (standard deviation [SD] = 3). First-year residents (PGY1) submitted 42 responses, second- through fourth-years (PGY2-4) submitted 36 responses, fellows (PGY5 and above) submitted 7 responses, and 21 responses did not include a training level designation. First-years were significantly more likely to submit a response than second- through fourth-years (P = 0.000013). Based on the time spent filling out the survey, first-years spent 54 min on average preparing for each conference while second- through fourth-years spent 37 min on average; however, the difference was not statistically significant (P = 0.12).

Technical results

We compared twenty scanned slides in both the proprietary BigTIFF (BIF) and JPEG 2000 (JP2) file formats and found the JP2 files to be 25% the size of comparable BIF files; we therefore chose the vendor-neutral JP2 format for its compact size. The JP2 image files range in size from 6 MB (skin biopsies with one level scanned at ×20) to 1.8 GB (large tissue sections scanned at ×40). Three pathologists assessed image quality at two JPEG compression settings and could not detect a noticeable difference between maximum (100) and high (70) quality; we used high-quality compression, which reduced storage and bandwidth requirements by 85%.
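The compression comparison is straightforward to reproduce on any tile with Pillow; in the sketch below, the input file name is a placeholder, and the exact reduction will vary with image content.

```python
# Compare the encoded size of one tile at maximum (100) vs. high (70)
# JPEG quality; the input file name is a placeholder.
import io
from PIL import Image

img = Image.open("sample_tile.png").convert("RGB")

sizes = {}
for quality in (100, 70):
    buf = io.BytesIO()
    img.save(buf, "JPEG", quality=quality)
    sizes[quality] = buf.tell()

print(f"q100: {sizes[100]:,} bytes, q70: {sizes[70]:,} bytes, "
      f"reduction: {1 - sizes[70] / sizes[100]:.0%}")
```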

Over the course of 470 days, we uploaded 1366 cases from 77 conferences and 10 study sets [Figure 3]. These cases included 1500 scanned slides (1471 H&E, 21 IHC, and 8 special stains) requiring 7,865,054 JPEG tiles (5243 tiles/slide) that occupy 793.5 GB of cloud storage (542 MB/slide).
Figure 3: Category distribution of scanned cases



The system was actively transmitting data to users viewing the WSI for 294,571 s (0.7% of the time). During these times of active transmission, the average network throughput was 325 KB/s (SD = 517 KB/s); the peak network throughput was 9.3 MB/s. The cloud CPU credit balance almost never dipped below 99%.

For the last two conferences (13 slides; 229 MB average JP2 file size), image conversion took an average of 49 s and image upload 185 s per slide, for a total processing time of about 4 min/slide. The largest of the randomly selected image files took nearly 30 min to process and upload.

Website access and social media results

A total of 1914 unique IP addresses from 68 countries accessed the website from 247 unique browser-device combinations. Of these, 959 (50.1%) did not view any cases. These consisted of visitors to the homepage only (39%), web crawlers (33%), and other bots (28%).

A total of 955 unique IP addresses from 52 countries viewed cases in the website from 218 unique browser-device combinations [Figure 4]. Of those, 464 (48.6%) viewed three or more cases (average 21 cases; SD = 44 cases; range 3–580 cases). Sixteen IP addresses viewed over 100 cases. Twelve of these originated from our city (Houston) in the USA; three originated from other states in the USA; one originated from Europe. A total of 429 IP addresses (45%) viewed cases before the conference date while 635 (66%) reviewed cases after they were presented in conference. Users accessed the system at all times of the day, with trough times in the early morning central standard time [Figure 5].
Figure 4: Heatmap of all locations that have accessed cases in our repository (52 countries; 955 unique IP addresses; 218 unique browser-device combinations). IP address geolocation data acquired using http://www.ipinfo.io

Figure 5: Total number of site visitors by time of day (central standard time)



Over a period of 86 days, an average of one tweet every 5 days was posted highlighting a case in the RecutClub repository. On average, there were 529 impressions (times a user is served a tweet in a timeline or search result) per post. An average of 2% of Twitter users who were served a tweet clicked on the link directing them to the case.

Conference participation results

Fifty-six percent of the time, trainees submitted preconference survey responses on the Sunday before the Monday morning conference. We did not quantify overall satisfaction with the conference; however, residents anecdotally reported tremendous satisfaction with the WSI viewing system. In particular, they preferred the ability to review cases from home rather than at the hospital and enjoyed the flexibility of previewing cases from their mobile devices.


Discussion


Whole slide image system selection discussion

The Digital Pathology Association curates an extensive list of available WSI repositories [Table 3],[6] which at the time of this writing listed 24 websites using 11 different WSI viewing systems. However, some challenges exist in utilizing these resources. First, 3/24 (13%) of the listings linked to inaccessible websites due to either broken links or internal server errors. Second, of the remaining 21 sites, only 8/21 (38%) were mobile-ready, meaning they could easily provide content to iOS and Android mobile devices as well as desktops. In one survey of medical residents, 99% of participants were mobile phone users.[7] Given the rapid advances in smartphone screen resolution and quality, as well as their ubiquitous use by trainees, mobile-ready WSI resources are not a luxury but a necessity.[8] Third, only one website (Pathobin[9]) supported public upload capabilities, and even this functionality is limited.
Table 3: Comparison of whole slide viewing systems listed by the Digital Pathology Association



Pathobin strives to host WSI obtained through low-resource image acquisition by semiautomated stitching of manually captured still images. This is an elegant solution for small biopsies, but the process of manually capturing still images using a microscope-mounted camera is time-consuming and resource-intensive for large tissue sections. In addition, a limit currently exists on individual case upload size (100 MB). An unlimited number of cases and file sizes can be uploaded if a hard drive with images and associated case data is sent to the site administrator, Shane Battye (personal correspondence, February 17, 2015).

The Google Maps API, which New York University uses for accessing their WSI, stitches tiled images into a snappy and seamless user interface for navigating through the WSI.[10] Google Maps is compatible with low- and high-performance computers and works very well on a variety of mobile platforms. In addition, the coding necessary to develop and deploy a Google Maps-driven WSI interface is relatively simple; many online communities offer assistance to those with very limited computer science backgrounds, and the source code is freely distributable and downloadable. One key drawback of this service is a lack of self-sufficiency: the Google Maps API does not reside on the WSI web server but is provided and monitored centrally by Google. Although Google is unlikely to suddenly shut down all of its services, any website built on the API is dependent on Google, and Google has discontinued services in the past.[11]

During our initial 4-week WSI system evaluation period, we reviewed four potential platforms to serve as our WSI viewer: Ventana, OpenSlide, Google Maps, and OpenLayers. Ventana provides both an application (Image Viewer) and a web-based system (Virtuoso) for viewing WSI. Image Viewer would require installation on every computer needing access. Virtuoso was designed as a surgical sign-out assistant rather than an education tool and is only available via our institution's VPN. In addition, we found that neither viewer was customizable, exposed granular data on viewing patterns, or supported mobile devices. The OpenSlide viewer also does not support mobile devices and did not perform well in our survey; we suspect this is partly because it was installed on a shared network drive and all images were loaded over the network, resulting in marked delays in image refresh after pan or zoom actions. We chose not to use Google Maps because our system would have been reliant on the constant availability of Google's servers, and because the Google Maps API feature set is more limited than that of OpenLayers.

We therefore selected and configured OpenLayers, an open source package of web-based mapping tools originally developed for cartographers, as the interface for viewing WSI. Previously, Brochhausen et al.[12] developed a similar WSI viewing system based on an earlier version of OpenLayers (2.12). OpenLayers 3 natively supports a wide range of interactive features and can be downloaded and installed on a web server with no external dependencies. Desired features supported by OpenLayers include multiple image layers at the same zoom level, heat maps, image rotation, screen capture, annotations, and compatibility with mobile devices. Any modern browser, including those on mobile devices, can use the OpenLayers viewer without additional plug-ins.

Regardless of the viewer utilized, many users commented on the poor nuclear detail rendered by ×20 scans. Efforts to increase nuclear detail by scanning slides at ×40 were anecdotally successful; later surveys did not investigate the effects of this change. Although recent literature suggests digital pathology is equivalent to glass slide microscopy in terms of final diagnoses,[13],[14],[15] a majority of our residents (91%) did not believe virtual microscopy was equivalent to light microscopy for identifying key diagnostic features.

Technical discussion

Using cloud computing and storage along with open source code, we created a low-cost WSI educational suite. We chose to archive our WSI in the JP2 file format because of its compact file size and vendor neutrality. We found that a JPEG compression quality of 70 is acceptable for educational purposes, achieving an 85% file size reduction compared to maximum-quality compression. The storage (542 MB/slide on average) and bandwidth (325 KB/s on average) requirements to operate our system are low.

Using static tiled images instead of dynamically generated pyramidal images has advantages. Static files can be stored on inexpensive cloud storage separate from the server rather than on server-mounted drives. Tile load times are low and zoom transitions are smooth because all image processing is performed during case installation. Our attempts at supporting dynamically generated tiles from JP2 WSI files resulted in inferior performance with increased image load times. Compressing images during processing also significantly reduces storage space and network bandwidth requirements without an appreciable decrease in image quality.
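Because the tiles are plain static files, installing a case reduces to copying them into object storage under a predictable key scheme. The sketch below assumes an S3-compatible store accessed via boto3; the bucket name and case/zoom/x/y key layout are illustrative assumptions, as the paper does not specify them.

```python
# Upload pretiled JPEGs to S3-compatible object storage with boto3.
# Bucket name and key scheme are assumptions for illustration.
import os
import boto3

s3 = boto3.client("s3")

def upload_case(case_id: str, tile_root: str, bucket: str = "example-tiles") -> None:
    """Upload every tile under tile_root as <case_id>/<zoom>/<x>/<y>.jpg."""
    for dirpath, _dirs, files in os.walk(tile_root):
        for name in files:
            local = os.path.join(dirpath, name)
            rel = os.path.relpath(local, tile_root).replace(os.sep, "/")
            s3.upload_file(local, bucket, f"{case_id}/{rel}",
                           ExtraArgs={"ContentType": "image/jpeg",
                                      "CacheControl": "max-age=31536000"})
```

A long cache lifetime is safe in this design because installed tiles never change, which further reduces bandwidth for repeat viewers.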

Adoption of survey system and concerns about user anonymity and privacy

We noticed limited use of our survey system, and the number of users who provided a differential diagnosis was lower than expected. On average, only 35% of trainees submitted a survey per conference, with the majority of submissions coming during the first 2 months of implementation. In the last eight conferences, only 4-5 users offered differential diagnoses. Our first-year residents consistently submitted the majority of responses, which may reflect a greater willingness to be incorrect in front of their peers.

The surveys were completely anonymous: none of the faculty had access to the raw data, and the web administrators could not infer a user's identity from IP address or other workstation characteristics. Despite this, many trainees were hesitant to submit differential diagnoses electronically. We promptly aggregated survey responses and forwarded them to the presenter; however, submissions often arrived too late for the presenter to tailor the conference to the differentials provided. Although our system was founded on voluntary participation, one could request or require trainees to return differentials earlier to give presenters more time to prepare a response or otherwise increase use of the system for education.

Visualizing the microscopic examination

Previous studies demonstrated that WSI is as effective as, and perceived as more efficient than, learning from glass slides and textbooks.[16] In addition, these tools can easily reach a large number of practitioners.[17] There are now multiple examples of virtual microscopy in medical and veterinary schools and residency programs across the country.[18],[19],[20],[21] Virtual microscopy is also used for continuing medical education, licensure/board examinations, and teaching.[22],[23] Compared with text and static images, WSI offers enhanced value to the learner.[24],[25] Evaluations of eye movements among residents in radiology and pathology have offered preliminary data suggesting that users with more training are able to hone in on diagnostically relevant areas of an image; these data can be used to refine a trainee's search strategy and image analysis skills.[26],[27]

In our system, the server records the image tiles requested by the OpenLayers interface as a user examines the WSI, an approach similar to that of Walkowski et al.[28] The web server logs the IP address, device, browser, timestamp, case, and x-y-zoom coordinates of each tile requested during a viewing session. Viewing patterns can then be studied by creating a visualization (such as a heat map) of which tiles the trainee looked at, how long they looked at a certain area, and how closely they zoomed in.
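From such a log, building a heat map reduces to counting requests per tile. Below is a minimal sketch in Python; the CSV column names are assumptions standing in for the actual server log schema.

```python
# Aggregate tile requests into per-tile counts for one case; rendering
# the counts as a color overlay yields heatmaps like those in Figure 6.
# The log columns (case, zoom, x, y) are assumed, not the real schema.
import csv
from collections import Counter

def heatmap_counts(log_path: str, case_id: str) -> Counter:
    """Count requests per (zoom, x, y) tile for a single case."""
    counts: Counter = Counter()
    with open(log_path, newline="") as f:
        for row in csv.DictReader(f):
            if row["case"] == case_id:
                counts[(int(row["zoom"]), int(row["x"]), int(row["y"]))] += 1
    return counts
```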

We are able to generate individual and group viewing patterns from large numbers of users through a simple, low-cost web interface, without expensive, finely tuned, and calibrated equipment [Figure 6]. While investigators are just beginning to study pathologists' WSI viewing behavior and draw conclusions from visualized microscopic examinations,[29] we noticed three general viewing patterns in our trainees. Some residents reviewed the tissue broadly at low power, then zoomed in once on a region of interest (panels B, D, and E). Others panned larger portions of tissue at medium power, then zoomed in several times (A, C, G, K, L). A third group examined significant portions of tissue at high power (F, H, I, J). Some users focused on the middle of the tissue while others were more careful to examine the capsule of this thyroid lesion (cribriform-morular variant of papillary thyroid carcinoma).
Figure 6: Viewing patterns from 12 individual IP addresses. Red indicates higher magnification viewing; green-yellow indicates lower magnification viewing



One limitation of the heatmaps is that time and field-of-view order are not represented visually, and these dimensions may provide further clues to the skill of the trainee. Another limitation of our data is that trainee level was not linked to viewing pattern, so we are unable to investigate the differences between novice and expert histologic examinations. A subsequent study could require trainees to report their PGY level to overcome this limitation. More work is needed to investigate whether heatmaps and other novel viewing-pattern visualizations may provide clues to improve education in surgical pathology.


Conclusions


During the past 16 months, we have created a low-cost repository of WSI for unknown conferences and resident education. Coupled with the repository is an anonymous survey system for collecting preconference differential diagnoses, designed to nurture an interactive environment during didactic sessions. This approach allows presenters to see how many individuals previewed the slides and to address the differential diagnoses provided by previewing residents and practicing pathologists. Our residents are very satisfied with the freedom to preview either the glass slides or the WSI. We have enabled convenient postconference case review for self-study.

Effective learning requires repeated review. As highlighted by Dunlosky et al.,[30] one common learning technique is practice testing, in which flashcards, multiple choice questions, and other formats are used to improve comprehension. In their review, spaced practice was superior to massed practice, also known as cramming. Evidence from Cepeda et al.[31] showed that memory performance is best when the lag between study sessions is 10%-20% of the desired retention interval; for example, material to be retained for a year is best reviewed roughly every 5-10 weeks. Reviewing information at spaced intervals is critical to knowledge retention. RecutClub serves as an archive of educational cases accessible from any computer or mobile device with an internet connection, and the majority of cases accessed on our system are reviewed after presentation in conference.

As demonstrated in [Figure 4], users from all over the world, as well as local users, are actively and repeatedly viewing cases in our study set. We host the system at http://www.RecutClub.com and encourage the use of the slides hosted on this public resource.

Future work

Along with developing novel ways to visualize a WSI examination, future work includes expanding the case selection, creating review modules, and adding image annotations. Our system solely utilizes WSI, but static gross and radiologic images are crucial for establishing and confirming differential diagnoses, and we are expanding to include these in the educational suite.

Acknowledgment

We would like to thank Dr. Ramon Sanchez and Dr. Brandon Driver for contributing to the operating costs of RecutClub.com.

Financial support and sponsorship

This work was supported by the Houston Methodist Hospital Department of Pathology and Genomic Medicine. Cloud services funded by Paul Christensen, MD, and Nathan Lee, MD, with contributions from R. Sanchez, MD, of DermoPath Laboratory and Brandon Driver, MD, Houston Methodist Hospital.

Conflicts of interest

There are no conflicts of interest.

 
References

1. Ventana Medical Systems, Inc. Ventana Virtuoso. Available from: http://www.ventana.com/product/page?view=virtuoso. [Last accessed on 2016 Jun 27].
2. OpenSlide. Available from: http://www.openslide.org/. [Last accessed on 2016 Jun 27].
3. Google. Google Maps JavaScript API. Available from: https://www.developers.google.com/maps/documentation/javascript/. [Last accessed on 2016 Jun 27].
4. OpenLayers. OpenLayers 3. Available from: http://www.openlayers.org/. [Last accessed on 2016 Jun 27].
5. Goode A, Gilbert B, Harkes J, Jukic D, Satyanarayanan M. OpenSlide: A vendor-neutral software foundation for digital pathology. J Pathol Inform 2013;4:27.
6. Digital Pathology Association. Whole Slide Imaging Repository; 2016. Available from: https://www.digitalpathologyassociation.org/whole-slide-imaging-repository. [Last accessed on 2016 Jul 11].
7. Jamal A, Temsah MH, Khan SA, Al-Eyadhy A, Koppel C, Chiang MF. Mobile phone use among medical residents: A cross-sectional multicenter survey in Saudi Arabia. JMIR Mhealth Uhealth 2016;4:e61.
8. Google. Mobile Friendly Websites. Available from: https://www.developers.google.com/webmasters/mobile-sites/. [Last accessed on 2016 Jun 27].
9. Pathobin. Pathobin; 2013. Available from: https://www.pathobin.com/. [Last accessed on 2016 Jun 27].
10. Triola MM, Holloway WJ. Enhanced virtual microscopy for collaborative education. BMC Med Educ 2011;11:4.
11. Google. Official Google Blog; 2013. Available from: https://www.googleblog.blogspot.com/2013/03/a-second-spring-of-cleaning.html. [Last accessed on 2016 Jun 27].
12. Brochhausen C, Winther HB, Hundt C, Schmitt VH, Schömer E, Kirkpatrick CJ. A virtual microscope for academic medical education: The Pate project. Interact J Med Res 2015;4:e11.
13. Snead DR, Tsang YW, Meskiri A, Kimani PK, Crossman R, Rajpoot NM, et al. Validation of digital pathology imaging for primary histopathological diagnosis. Histopathology 2016;68:1063-72.
14. Saco A, Ramírez J, Rakislova N, Mira A, Ordi J. Validation of whole-slide imaging for histolopathogical diagnosis: Current state. Pathobiology 2016;83:89-98.
15. Goacher E, Randell R, Williams B, Treanor D. The diagnostic concordance of whole slide imaging and light microscopy: A systematic review. Arch Pathol Lab Med 2017;141:151-61.
16. Van Es SL, Kumar RK, Pryor WM, Salisbury EL, Velan GM. Cytopathology whole slide images and adaptive tutorials for postgraduate pathology trainees: A randomized crossover trial. Hum Pathol 2015;46:1297-305.
17. Hang JF, Liang WY, Hsu CY, Lai CR. Integrating a web-based whole-slide imaging system and online questionnaires in a national cytopathology peer comparison educational program in Taiwan. Acta Cytol 2015;59:278-83.
18. Foster K. Medical education in the digital age: Digital whole slide imaging as an e-learning tool. J Pathol Inform 2010;1. pii: 14.
19. Leifer Z. The use of virtual microscopy and a wiki in pathology education: Tracking student use, involvement, and response. J Pathol Inform 2015;6:30.
20. Saco A, Bombi JA, Garcia A, Ramírez J, Ordi J. Current status of whole-slide imaging in education. Pathobiology 2016;83:79-88.
21. Brown PJ, Fews D, Bell NJ. Teaching veterinary histopathology: A comparison of microscopy and digital slides. J Vet Med Educ 2016;43:13-20.
22. Onega T, Reisch LM, Frederick PD, Geller BM, Nelson HD, Lott JP, et al. Use of digital whole slide imaging in dermatopathology. J Digit Imaging 2016;29:243-53.
23. Mukherjee MS, Donnelly AD, DeAgano VJ, Lyden ER, Radio SJ. Utilization of virtual microscopy in cytotechnology educational programs in the United States. J Pathol Inform 2016;7:8.
24. Yin F, Han G, Bui MM, Gibbs J, Martin I, Sundharkrishnan L, et al. Educational value of digital whole slides accompanying published online pathology journal articles: A multi-institutional study. Arch Pathol Lab Med 2016;140:694-7.
25. Reder NP, Glasser D, Dintzis SM, Rendi MH, Garcia RL, Henriksen JC, et al. NDER: A novel web application using annotated whole slide images for rapid improvements in human pattern recognition. J Pathol Inform 2016;7:31.
26. Brunyé TT, Carney PA, Allison KH, Shapiro LG, Weaver DL, Elmore JG. Eye movements as an index of pathologist visual expertise: A pilot study. PLoS One 2014;9:e103447.
27. Kok EM, de Bruin AB, Robben SG, van Merriënboer JJ. Looking in the same manner but seeing it differently: Bottom-up and expertise effects in radiology. Appl Cogn Psychol 2012;26:854-62.
28. Walkowski S, Lundin M, Szymas J, Lundin J. Exploring viewing behavior data from whole slide images to predict correctness of students' answers during practical exams in oral pathology. J Pathol Inform 2015;6:28.
29. Mercan E, Aksoy S, Shapiro LG, Weaver DL, Brunyé TT, Elmore JG. Localization of diagnostically relevant regions of interest in whole slide images: A comparative study. J Digit Imaging 2016;29:496-506.
30. Dunlosky J, Rawson KA, Marsh EJ, Nathan MJ, Willingham DT. Improving students' learning with effective learning techniques: Promising directions from cognitive and educational psychology. Psychol Sci Public Interest 2013;14:4-58.
31. Cepeda NJ, Vul E, Rohrer D, Wixted JT, Pashler H. Spacing effects in learning: A temporal ridgeline of optimal retention. Psychol Sci 2008;19:1095-102.

