Journal of Pathology Informatics




 
RESEARCH ARTICLE
J Pathol Inform 2014,  5:34

Web-based pathology practice examination usage


Edward C Klatt
Department of Biomedical Sciences, Mercer University School of Medicine, Savannah, GA, USA

Date of Submission: 27-May-2014
Date of Acceptance: 24-Jun-2014
Date of Web Publication: 30-Sep-2014

Correspondence Address:
Edward C Klatt
Department of Biomedical Sciences, Mercer University School of Medicine, Savannah, GA
USA

Source of Support: None, Conflict of Interest: None


DOI: 10.4103/2153-3539.141987

   Abstract 

Context: General and subject-specific practice examinations for students in the health sciences studying pathology were placed onto a free public internet web site entitled WebPath and were accessed four clicks from the home web site menu.

Subjects and Methods: Multiple choice questions were coded into .html files with JavaScript functions for web browser viewing in a timed format. A Perl programming language script with common gateway interface for web page forms scored examinations and placed results into a log file on an internet computer server. The four general review examinations of 30 questions each could be completed in up to 30 min. The 17 subject-specific examinations of 10 questions each, with accompanying images, could be completed in up to 15 min each. The results of scores and user educational field of study from log files were compiled from June 2006 to January 2014.

Results: The four general review examinations had 31,639 accesses with completion of all questions, for a completion rate of 54% and an average score of 75%. A score of 100% was achieved by 7% of users, ≥90% by 21%, and ≥50% by 95% of users. In top-to-bottom web page menu order, review examination usage was 44%, 24%, 17%, and 15% of all accessions. The 17 subject-specific examinations had 103,028 completions, with a completion rate of 73% and an average score of 74%. A score of 100% was achieved by 20% of users overall, ≥90% by 37%, and ≥50% by 90% of users. The first three menu items on the web page accounted for 12.6%, 10.0%, and 8.2% of all completions, and the bottom three accounted for no more than 2.2% each.

Conclusions: Completion rates were higher for the shorter 10-question subject examinations. Users identifying themselves as MD/DO scored higher than other users, averaging 75%. Usage was higher for examinations at the top of the web page menu. The scores achieved suggest that a cohort of serious users fully completing the examinations had sufficient preparation to use them to support their pathology education.

Keywords: Computer assisted learning, e-learning, examination, internet, medical education, pathology, web site


How to cite this article:
Klatt EC. Web-based pathology practice examination usage. J Pathol Inform 2014;5:34



   Introduction


The World Wide Web has become an excellent resource for distribution of educational materials in an electronic format for independent study with computer-aided instruction. Students in the health sciences can use their own electronic devices with web browsers to access a growing body of web-based educational content. [1] Though web-based educational materials may be popular with students, as indicated on evaluations, are online resources used in the manner for which they were intended, and do they yield results suggesting that students are achieving mastery of the subject matter they have studied? [2]

Students' evaluations and feedback sessions in the preclinical years at the author's medical school have called for greater availability of practice examinations to prepare for "real" examinations, including both those in the pathology discipline taught at this school as well as the United States Medical Licensing Examination (USMLE) Step I examination. In particular, students have shown the greatest interest in access to practice examinations that incorporate imaging. This study was performed to analyze the usage and results of web-based pathology practice examinations that were developed to assist students using formative assessments of their knowledge. These practice examinations were developed to provide a freely available resource for review and self-assessment in major subject areas of pathology, but not to be so extensive or comprehensive that a large amount of student time would be required to complete them.


   Subjects and methods


A web site with public internet access, entitled WebPath, was developed by the author starting in 1994, with no direct funding throughout the site's history. The home page is shown in [Figure 1]. This site provides content for pathology education, primarily using images with text descriptions arranged by subject, along with examination questions. The site received an average of one million web page "hits" per month at its inception, and access continued to increase through the 1990s until average monthly usage ranged from 6.5 to 8.5 million hits/month; usage has remained stable at that level over the past 15 years. Usage paralleled the academic calendar of Northern Hemisphere medical schools, with more "hits" in Fall and Spring. Web site analysis revealed that over 95% of accesses originated from dynamic internet protocol (IP) addresses of commercial internet service providers, so the location of users could not be assessed. On average, over half of users reached WebPath via referral from internet search engines. The menu [Figure 2] for the examinations is accessed at the following URL: http://library.med.utah.edu/WebPath/EXAM/EXAMIDX.html. The login window for the timed examinations is shown in [Figure 3].
Figure 1: Web path home page, with the link to the examinations highlighted by the red arrow

Figure 2: Examination web page main menu, with links to the nontimed general pathology and organ system pathology banks, and links to the timed review quizzes

Figure 3: Login popup window showing the data fields collected anonymously, before the start of each timed examination



The author developed examination questions for the web site that were representative of content for chapters in the Robbins and Cotran Pathologic Basis of Disease textbook and that followed the USMLE content outline. [3] All examination questions were written in USMLE style and had five foils per question. [4] These patient-vignette-style questions integrated anatomical pathology with laboratory and physical examination findings [Figure 4] and [Figure 5]. The examination templates were developed using JavaScript for client-side HTML documents accessible via a web browser.

A bank of 431 questions in nine topic areas for general pathology, emphasizing mechanisms of disease, and a bank of 509 questions in 11 topic areas covering organ system pathology were placed onto the WebPath site for user-chosen access in a nontimed format, as shown in [Table 1]. These questions provided feedback for each answer chosen by the user, along with a continuously compiled examination score maintained using JavaScript with cookies. However, the scores were generated client side on the user's computer and not stored on the web server. The layout of this untimed format is shown in [Figure 4].
Figure 4: Sample nontimed examination format from the hematopathology examination, with the question stem in the lower frame, the feedback for the answer in the upper frame, and the cumulative score in the right frame

Figure 5: Sample timed examination format from the pulmonary subject examination. The stem of this question provides clues to the diagnosis of restrictive lung disease, and the image shows the gross pathology of honeycomb lung, so foil E is the best choice

Table 1: Examination content


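The client-side score keeping with JavaScript and cookies described above can be sketched as follows. This is an illustrative reconstruction, not the site's actual code: the cookie name `webpath_score`, its "correct/answered" layout, and all function names are assumptions.

```javascript
// Minimal sketch of client-side cumulative scoring for the untimed
// question banks, kept in a browser cookie. Names and cookie layout
// are assumptions, not taken from the WebPath source.

// Serialize the running tally as "correct/answered".
function scoreCookieValue(correct, answered) {
  return correct + "/" + answered;
}

// Parse a previously stored cookie value back into a tally object.
function parseScoreCookie(value) {
  var parts = value.split("/");
  return { correct: parseInt(parts[0], 10), answered: parseInt(parts[1], 10) };
}

// Called from each foil's onclick handler; updates the cookie and
// returns the cumulative percentage score shown in the score frame.
function recordAnswer(isCorrect) {
  var tally = { correct: 0, answered: 0 };
  var match = document.cookie.match(/(?:^|; )webpath_score=([^;]*)/);
  if (match) tally = parseScoreCookie(match[1]);
  tally.answered += 1;
  if (isCorrect) tally.correct += 1;
  document.cookie = "webpath_score=" + scoreCookieValue(tally.correct, tally.answered);
  return Math.round(100 * tally.correct / tally.answered);
}
```

Because the tally lives only in the browser cookie, nothing reaches the server, consistent with the article's note that untimed scores were not logged.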


The author developed a smaller bank of timed examinations to mimic a real testing environment. There were 17 short subject examinations of 10 questions each, with images. Each of the four multi-subject review examinations had 30 questions, including six questions with images. The topic areas for the timed examinations, in order of their display on the web pages, are shown in [Table 1].

After clicking on a menu item to open a timed examination, a popup window appeared on the user's screen, shown in [Figure 3], with login fields that recorded, via radio buttons, the user's health science field of study, along with a free text field for voluntary, self-reported entry of the user's higher educational affiliation. No E-mail address, IP address, or other identifier was collected, so the data remained anonymous and Institutional Review Board exempt. After submitting this form, the examination appeared on screen and began with the timer activated. Users had 15 min to complete each 10-question subject examination and 30 min to complete each of the 30-question review examinations. A sample timed-format question is shown in [Figure 5].
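The timer behavior described above can be sketched in client-side JavaScript. The time limits (15 min for 10-question exams, 30 min for 30-question exams) come from the article; the function names and the use of `setTimeout` are assumptions about the implementation.

```javascript
// Sketch of the timed-exam countdown. Time limits follow the article;
// everything else here is an illustrative assumption.

// 10-question subject exams get 15 min; 30-question review exams get 30 min.
function examTimeLimitMs(questionCount) {
  return (questionCount <= 10 ? 15 : 30) * 60 * 1000;
}

// Started when the login form is submitted; when the timer fires, the
// exam is submitted for scoring with whatever answers are marked.
function startExamTimer(questionCount, submitExam) {
  return setTimeout(submitExam, examTimeLimitMs(questionCount));
}
```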

Following user submission and exit of an examination, scoring occurred on the server side using a common gateway interface program written as a Perl 5 script. Upon completion of the exam, the web server received the information via the HTML form POST method and then passed the data to the Perl script. The captured data elements, written in comma-delimited sequence to an ASCII text file, included: the user's field of study, the user's academic affiliation, the total score, and the chosen answer for each question. After a user submitted an examination for scoring, or the exam timed out, only those questions marked were scored. The user was then automatically directed to a web page that provided the overall percentage score for the examination taken, along with a key of correct answers for that examination. As in a real examination, no feedback was provided regarding the answers to the questions in the timed format.

The Perl script generated a cumulative log file that recorded the individual logins to each examination, one per line of the text file. The log files for each examination accumulated from June 2006, following migration of the web site to a new web server and revision of the questions, until the log files were downloaded in January 2014. During that period, the timed questions were not modified. The ASCII-text-formatted log files were downloaded to the author's computer and loaded into Microsoft Excel (2011) spreadsheet software (Microsoft Corporation, Redmond, Washington) for each examination. The log files of anonymous user data were analyzed for the total number of logins, the number of logins with all questions completed, and scores by question and by health science field of study for each of the 21 examinations.
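The log analysis performed in the spreadsheet (total logins, completed logins, average score of completed attempts) can be reproduced with a short script like the following. The record layout matches the Methods description; the parsing details and function name are assumptions.

```javascript
// Sketch of the log-file analysis: count total logins, count logins with
// every question answered, and average the scores of completed attempts.
// Record layout per line: field,affiliation,score,answer1,answer2,...
function analyzeLog(lines, questionCount) {
  var total = 0, completed = 0, scoreSum = 0;
  for (var i = 0; i < lines.length; i++) {
    var f = lines[i].split(",");
    if (f.length < 3 + questionCount) continue; // skip malformed records
    total++;
    var answers = f.slice(3, 3 + questionCount);
    if (answers.every(function (a) { return a !== ""; })) {
      completed++;
      scoreSum += parseFloat(f[2]);
    }
  }
  return {
    total: total,
    completed: completed,
    completionRate: completed / total,
    averageScore: completed ? scoreSum / completed : null
  };
}
```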


   Results


The four general review examinations [Table 2] had 31,639 accesses with completion of all questions, for an overall completion rate of 53.7% and an average score of 75.2%, with a standard deviation of 3.2. Review examination 3, with the average farthest from the mean, was not a significant outlier (P > 0.05) by Grubbs' test to detect an outlier. In top-to-bottom web page menu order, review examination usage was 43.9%, 23.7%, 17.5%, and 14.9% of all accessions [Table 2].
Table 2: General review examination usage


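The Grubbs' test used throughout the Results computes G = max|xᵢ − x̄|/s and compares it against a tabulated critical value. A minimal sketch follows; the scores in the test are hypothetical illustrations, not the per-examination values from [Table 2].

```javascript
// Sketch of Grubbs' test for a single outlier: G = max|x - mean| / s,
// compared against a tabulated two-sided critical value at alpha = 0.05.
function grubbsG(xs) {
  var n = xs.length;
  var mean = xs.reduce(function (a, b) { return a + b; }, 0) / n;
  var ss = xs.reduce(function (a, b) { return a + (b - mean) * (b - mean); }, 0);
  var s = Math.sqrt(ss / (n - 1)); // sample standard deviation
  var maxDev = Math.max.apply(null, xs.map(function (x) { return Math.abs(x - mean); }));
  return maxDev / s;
}

// Standard tabulated critical values (alpha = 0.05, two-sided) for the
// sample sizes in this study: 4 review exams and 17 subject exams.
var GRUBBS_CRIT = { 4: 1.481, 17: 2.475 };

function isOutlier(xs) {
  return grubbsG(xs) > GRUBBS_CRIT[xs.length];
}
```

With four hypothetical review-examination averages of 75, 77, 71, and 78, G ≈ 1.37, below the n = 4 critical value of 1.481, so the farthest value would not be flagged as an outlier, consistent with the article's finding.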


The 17 subject examinations [Table 3] had 103,028 completions, with an overall completion rate of 72.6% and an average score of 74.0%, with a standard deviation of 4.2. The pulmonary subject examination, with the average farthest from the mean, was not a significant outlier (P > 0.05) by Grubbs' test to detect an outlier. The first three menu items on the subject examination web page accounted for 12.6%, 10.0%, and 8.2% of all completions, and the bottom three menu items accounted for no more than 2% each [Table 3].
Table 3: Subject specific examination usage




Analysis of scores by user's field of study [Table 4] showed that the highest overall scores were obtained by MD and DO users: 75.3% for the four general review examinations and 74.7% for the 17 subject examinations. Users in other health sciences had lower overall averages, ranging from 54.6% to 75.4% for the general review examinations and from 50.9% to 73.7% for the subject examinations. None of the field-of-study categories was a significant outlier (P > 0.05) by Grubbs' test to detect an outlier.

For the four general review examinations, a score of 100% was achieved by 7% of all users, 21% of users achieved a score ≥90%, and 95% of users achieved ≥50% [Table 4]. For the 17 subject examinations, a score of 100% was achieved by 20% of users overall, 37% of users achieved a score ≥90%, and 90% achieved a score ≥50% [Table 5].
Table 4: Examination scores by user's field of study


Table 5: Review and subject examination scores by level of achievement





   Discussion


Web-based curricular tools have become popular for teaching students since the early 1990s. Web-based formative examinations (giving a score) have been shown to positively influence student performance in curricular assessments that count for a grade. [5],[6] Online assessments assist in curricular support for students at multiple institutional venues. [7] Web-based instruction can also promote approaches to deep learning (seeking to understand underlying concepts) that are more effective for students than surface learning (focused on passing the next examination). [8] Web-based examinations can support distance learning and education beyond a single institution, and can share expertise on a national scale. [9],[10]

The WebPath site was designed to provide a variety of pathologic material for review by health science students, but it cannot replace a standard curriculum. It serves as an example of what can be done to supplement pathology education with no direct funding for development and no direct cost to users. The total time required for completion of the WebPath question banks, estimating 1-1.5 min/question, is up to 23 h for the untimed questions and up to 6 h for the timed questions. Time available is a limitation, both for web site authors and for users. There are now numerous question banks available on the internet, both free and by subscription. Novice students are challenged to select material that may be of value to their education. The usefulness of educational web sites can be evaluated with a published rating method. [11]

The web page menu for this study that leads to the login screen for the examinations is located four clicks, including a popup login window [Figure 3], from the main menu [Figure 1] of the WebPath web site, tending to discourage casual surfing. There is no way from log analysis to determine how many persons with minimal or no prior education in pathology logged in and took the exams, but of total users, only 4.3% for the review examinations and 6.1% for the subject examinations picked "other" as a field-of-study category, and their scores were well within the range of other users who indicated a specific field of study. However, the login did not distinguish students from persons at higher levels of training, such as residents or practicing professionals.

The value of anonymous data for research, such as the findings presented here, comes from the large amount of aggregate data that can be collected from users anywhere on earth, over a long period of time, and at no cost to users. The major limitation is that the data cannot be verified for individual users by location, course of study, scores obtained, or conditions of usage. [12] Educational research studies recruiting student volunteers for more rigorous conditions of assessment and analysis generally obtain smaller subject numbers. National organizations providing qualifying examinations for certification can generate extensive statistical analyses, but generally do not publish their findings; results are released only to users and their educational institutions, and the examinations are costly to users. In this study, limiting the data analysis to users completing the examinations served to increase the validity of results. Reliability was increased through aggregation of data over more than 7 years with an unchanging set of 290 questions.

Does posting web-based educational tools ensure usage sufficient to justify the time needed to develop them? In this study, the completion rates indicated that the shorter the exercise, the more likely the user was to complete it. Even for these short exams, the completion rate was only 72.6% overall. The general review exams came closer to mimicking a "real" timed exam, but their completion rates were even lower, 53.7% overall, or little more than half of accessions.

What may have influenced the completion rates for these examinations? Users without sufficient preparation may have realized the difficulty and abandoned an examination. Users may have been accessing the examinations only to assess their potential content and possible educational value, planning to return later. Though completion rates were below 100% in this study, the scores achieved on these web-based examinations suggest that some users completing the examinations had sufficient preparation to make use of them to support their pathology education. A high score of ≥90% was achieved by 21% of users of the review examinations and 37% of users of the subject examinations [Table 5]. Conversely, the number of users with a score <50% suggests that few users were just randomly marking answers [Table 5]. Given that a correct answer key was returned to users following submission of the examination for scoring, it is possible that some users retook an examination one or more times to achieve a higher score, which would skew the average higher. However, these examinations are not "high-stakes" for a grade, so there is no tangible reward for obtaining a higher score. If users are striving for a higher score, then that contributes to their education and fulfills a goal of the web site.

The variation in scores obtained per examination is primarily due to the mix of questions on these relatively short examinations. One or two more difficult questions on a 10-question examination will reduce the average score achieved [Table 3]. However, none of the examinations was a statistically significant outlier. The most difficult of the 290 total questions, answered correctly by only 38.8% of users overall, was in the cardiovascular subject examination: it included an image showing a dense collagenous scar of remote myocardial infarction and asked for the most likely current correlate, which was congestive heart failure, but 37.4% of users picked the foil for troponin elevation, reflective of an acute coronary syndrome. However, the foils were nonhomogeneous. The author has used the information from log file analysis to revise the most difficult questions.

In this study, usage was higher for examinations at the top of the web page menu. To some extent this was a self-fulfilling prophecy, since the order of examinations in the menu, at least for the subject examinations, was arranged by the author according to past experience of subject popularity in a prior study of this web site. [13] Similar internet user behavior has been observed even for the web site of the College of American Pathologists (CAP), where a web page for complex cancer protocols, presumably of interest only to pathologists, recorded the highest number of "hits" for the adrenal protocol, even though this is an obscure site for malignancies (personal communication). The protocols were arranged alphabetically on that CAP web page, with adrenal first.

Based upon user behavior observed in this study, developers of web pages should place items of greatest interest or impact at the top of a menu. Designers of computer-aided instruction should bear in mind that the longer the exercise, the fewer users will complete it in the manner intended. Web page authors can evaluate whether developing more content is worth the additional effort, given a declining rate of usage for items toward the bottom of a menu. Providing more exercises of a similar nature does not guarantee equivalent usage at the web site.

Would whole-slide imaging (WSI) lend itself to this kind of examination format? High-stakes timed examinations for students generally provide additional time, such as 1.5 min/question with imaging, and less for text-only questions. However, use of WSI would require even more time, and reduce the total number of questions for examinations limited to a specific time period. Students in health science educational programs have a heavy curricular load, and pathology is just one of many subject areas that require testing.

Guided instruction with a small number of labeled pathologic images fits within the time available for group learning exercises, such as laboratory sessions, in most curricula. A laboratory session with an instructor manipulating WSI may be more efficient for instruction. Novice students are used to viewing video screens and are more likely to tolerate such a medium than a multiheaded microscope driven by a pathologist moving the slide at so rapid a rate that motion sickness develops in some students. However, there is no practical way to test on WSI with large class sizes, and so students can safely ignore it. Though WSI has been applied at the medical student level, it is more effectively suited to training programs in pathology, such as residencies and fellowships. [14] WSI has been evaluated for testing pathology residents taking specialty board examinations administered by the American Board of Pathology. [15]


   Conclusion


This study showed that web-based examination completion rates were higher for shorter examinations in a single subject area. Usage was higher for the shorter 10-question examinations and for examinations listed closer to the top of the web page menu. Users identifying themselves as MD/DO scored higher than other users, averaging 75%, but this was only slightly higher than the overall average for all users, regardless of their voluntarily stated health science field of study. Scores ≥90% were achieved by 21% of users for review examinations and by 37% of users for subject examinations, suggesting that there were serious users completing the examinations who had sufficient preparation to make use of them to support their pathology education.

Lessons learned:

  • Examinations of shorter length have higher completion rates
  • Usage is highest for items nearer the top of a web page menu
  • Developing more examinations does not guarantee their equivalent usage
  • A cohort of users had scores ≥90%, validating the effort in web site development.



   Acknowledgement


The author wishes to thank Stephen Mossbarger, web programmer/analyst, Eccles Health Sciences Library, University of Utah, for his assistance with maintaining the WebPath web site.

 
   References

1. Marshall R, Cartwright N, Mattick K. Teaching and learning pathology: A critical review of the English literature. Med Educ 2004;38:302-13.
2. Szymas J, Lundin M. Five years of experience teaching pathology to dental students using the WebMicroscope. Diagn Pathol 2011;6 Suppl 1:S13.
3. National Board of Medical Examiners. USMLE Step 1 Content Outline 2013-2014. Available from: http://www.usmle.org/pdfs/step-1/2013midMay2014_Step1.pdf. [Last accessed on 2014 Jul 22].
4. National Board of Medical Examiners. Item Writing Manual, 3rd ed. Available from: http://www.nbme.org/PDF/ItemWriting_2003/2003IWGwhole.pdf. [Last accessed on 2014 Jul 22].
5. Velan GM, Jones P, McNeil HP, Kumar RK. Integrated online formative assessments in the biomedical sciences for medical students: Benefits for learning. BMC Med Educ 2008;8:52.
6. Olson BL, McDonald JL. Influence of online formative assessment upon student learning in biomedical science courses. J Dent Educ 2004;68:656-9.
7. Hammoud MM, Barclay ML. Development of a Web-based question database for students' self-assessment. Acad Med 2002;77:925.
8. Masiello I, Ramberg R, Lonka K. Learning in a web-based system in medical education. Med Teach 2005;27:561-3.
9. Lewis PJ, Chen JY, Lin DJ, McNulty NJ. Radiology ExamWeb: Development and implementation of a national web-based examination system for medical students in radiology. Acad Radiol 2013;20:290-6.
10. Stergiou N, Georgoulakis G, Margari N, Aninos D, Stamataki M, Stergiou E, et al. Using a web-based system for the continuous distance education in cytopathology. Int J Med Inform 2009;78:827-38.
11. Wong G, Greenhalgh T, Pawson R. Internet-based medical education: A realist review of what works, for whom and in what circumstances. BMC Med Educ 2010;10:12.
12. Eysenbach G, Wyatt J. Using the Internet for surveys and health research. J Med Internet Res 2002;4:E13.
13. Klatt EC, Dennis SE. Web-based pathology education. Arch Pathol Lab Med 1998;122:475-9.
14. Fung KM, Hassell LA, Talbert ML, Wiechmann AF, Chaser BE, Ramey J. Whole slide images and digital media in pathology education, testing, and practice: The Oklahoma experience. Anal Cell Pathol (Amst) 2012;35:37-40.
15. Hamilton PW, Wang Y, McCullough SJ. Virtual microscopy and digital pathology in training and education. APMIS 2012;120:305-15.





 

 