Journal of Pathology Informatics
SYMPOSIUM - 2ND NORDIC SYMPOSIUM ON DIGITAL PATHOLOGY
J Pathol Inform 2015,  6:7

A comparative study of input devices for digital slide navigation


Jesper Molin1, Claes Lundström2, Morten Fjeld3
1 Centre for Medical Image Science and Visualization, Linköping University; Chalmers University of Technology, Gothenburg, Sweden
2 Centre for Medical Image Science and Visualization, Linköping University, Linköping, Sweden
3 Chalmers University of Technology, Gothenburg, Sweden

Date of Submission25-Nov-2014
Date of Acceptance25-Nov-2014
Date of Web Publication24-Feb-2015

Correspondence Address:
Jesper Molin
Centre for Medical Image Science and Visualization, Linköping University; Chalmers University of Technology, Gothenburg
Sweden

Source of Support: None, Conflict of Interest: None


DOI: 10.4103/2153-3539.151894

Abstract

This paper describes work presented at the Nordic Symposium on Digital Pathology 2014, Linköping, Sweden. The lack of quick and seamless integration between input devices and the navigation of digital slides remains a key barrier for many pathologists to "go digital." To better understand this integration, three different input device implementations were compared in terms of time to diagnose, perceived workload, and users' preferences. Six pathologists reviewed in total nine cases with a computer mouse, a 6 degrees-of-freedom (6DOF) navigator, and a touchpad. The participants perceived significantly less workload (P < 0.05) with the computer mouse and the 6DOF navigator than with the touchpad, while no effect of the input device on the time to diagnose was observed. Five out of six pathologists preferred the 6DOF navigator, while the touchpad was the least preferred device. While digital slide navigation is often designed to mimic microscope interaction, the results of this study demonstrate that, in order to minimize workload, there is reason to let the digital interaction move beyond the familiar microscope tradition.

Keywords: Digital pathology, multi-scale navigation, usability, workload


How to cite this article:
Molin J, Lundström C, Fjeld M. A comparative study of input devices for digital slide navigation. J Pathol Inform 2015;6:7

How to cite this URL:
Molin J, Lundström C, Fjeld M. A comparative study of input devices for digital slide navigation. J Pathol Inform [serial online] 2015 [cited 2019 Dec 15];6:7. Available from: http://www.jpathinformatics.org/text.asp?2015/6/1/7/151894


   Background


This paper describes work presented at the Nordic Symposium on Digital Pathology 2014, Linköping, Sweden.

The digitization of pathology departments continues to develop. Initial use consisted of somewhat sporadic activities such as telepathology, education, or quality assurance. [1] The systems have continued to mature, and whole slide imaging is becoming a routine tool for primary diagnostics. [2],[3] From a usability perspective, sporadic use requires systems that are self-instructive and easy to use. Initial image viewers were, therefore, typically controlled using a computer mouse and a graphical user interface with symbolic icons familiar from traditional microscopy, such as different lenses and virtual imitations of glass trays. To ease the transition even further, it has been proposed that devices that mimic the microscope stage could be used to navigate digital slides. [4] While such a design strategy might lower the initial threshold when going from analog to digital microscopy, it is an approach that also comes with numerous problems: the consistency of the digital interface is broken, the digital design space becomes restricted, and design problems from the analog space are translated into the digital space. [5] Still, initial digital viewers of computed tomography images made use of this strategy. At first, the images were reviewed side by side, mimicking the behavior of film alternators. Later, another method was developed in which the images were reviewed using a stack metaphor, which was better suited for computer workstations. [6] The fact that the stack metaphor is prevalent today shows that familiarity with traditional interaction methods loses importance over time.

This view was confirmed in a recent questionnaire-based study covering current practices and perceptions of digital pathology in the United States. The most common uses of the digital technology were teaching and tumor boards. Current viewers were perceived as being easy to learn but too slow for routine work. [7] In another study, among users starting to work routinely, faster navigation with more suitable input devices was at the top of the wish list. [3] Against this background, this study deliberately avoided microscope metaphors where they were suspected to be inappropriate, in order to investigate whether that could improve satisfaction with the navigation of digital slides. We tailored the functionality of three existing alternative input devices, each working with our prototype workstation application, with the aim of comparing the devices' navigation performance and usability for routine digital pathology review.

Design Process

As a starting point for our work, a systematic analysis was performed of how current viewing systems work in relation to the navigational needs of the pathologist. When presenting the outcome, a few terms describing the nature of the interaction will be used as follows. The input space of an interaction is the extent of the user movement, in this case of hands and fingers. The output space is the extent of the field of view movement in the viewing system, in this case panning distance and zoom depth. Two types of interaction control are position control, meaning that a change in position on the input side has a direct (possibly amplified) effect in the output space, and rate control where an input offset is instead connected to the speed of movement in the output space, the typical function of a joystick or steering pin.

Changes of the magnification level (zooming) and the position of the slide (panning) can be characterized in the following way for the microscope and digital viewing, respectively:

  1. In the microscope, the magnification is changed in fixed steps around the center of the field of view. In digital slide navigation there are further possibilities, such as smooth transitions between magnification levels and centering the zooming on the pointer position.
  2. The input space for a microscope is relatively small compared to that of a typical computer mouse, regardless of whether the movement is done by stage knobs or fingers on the glass slide. A computer mouse can use the full size of the desk space or mouse mat where it rests. Then again, other computer input devices can have a smaller input space, comparable to that of the microscope.
  3. The output space for panning is dependent on the magnification used. It is large for high magnification levels, that is, there are long panning distances, but small for low magnification levels. The output space is similar for the microscope and digital viewing, but is in the digital case dependent on the monitor characteristics.
  4. With a microscope, the output is connected to the input by position control. For the interaction in digital viewing, it is a design choice whether to use position or rate control (see the sketch after this list).
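
To make the distinction between the two control types concrete, a minimal sketch is given below. It is purely illustrative: the function names and the gain values are assumptions, not part of the study.

# Minimal sketch of the two control types (illustrative gain values).

def position_control(pan, input_delta, gain=4.0):
    # A change of position on the input side maps directly, possibly
    # amplified, to a change of position in the output space.
    dx, dy = input_delta
    return (pan[0] + gain * dx, pan[1] + gain * dy)

def rate_control(pan, input_offset, dt, speed=800.0):
    # A sustained input offset (e.g. a deflected joystick) maps to a
    # panning velocity; the view keeps moving while the offset is held.
    ox, oy = input_offset
    return (pan[0] + speed * ox * dt, pan[1] + speed * oy * dt)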


Thus, the interaction with a microscope is limited by physical constraints that are difficult to modify. Conversely, the interaction in a digital workstation can be designed much more freely if old metaphors are left behind. For example, by increasing the size of the display, it is possible to make the output space smaller, which decreases the amount of user input needed. This has been shown to decrease the time to diagnose for diagnostic tasks that take advantage of the smaller output space such as the initial global impression when a slide is first opened. [8]

In this study, we focused on creating interaction solutions for three of the listed aspects of slide navigation: (1) How the magnification is changed, (2) how the size of the input space is handled, and (3) how the input is connected to the output. The approach taken was to adapt already existing input devices to suit the work of pathologists. We developed prototype implementations for the following devices: A standard 2-button computer mouse with a clickable scroll wheel (Premium Optical Wheel Mouse, Logitech), a touchpad (Intuos CTH-480, Wacom), and a 6 degrees-of-freedom (6DOF) navigator (Space Mouse Pro, 3DConnexion). As with most digital technology, the function of these devices can be modified in software, which we took advantage of since initial testing showed poor performance with their default functionality. The functionality of the different devices was developed as follows:

Computer Mouse

The functionality for the computer mouse was derived from current practice in map navigation and other zoomable user interfaces. An overview with a draggable square representing the viewport was used to increase spatial awareness, since this feature is popular for navigation in existing digital viewers. [3] A lock mode was implemented to deal with the usability issues that arise when the magnification difference between the overview and the main view is too large [9] and to enable continuous panning. The lock mode, initiated by a mouse wheel click, consisted of hiding the mouse pointer and translating the mouse movements directly into panning movements with an amplification factor of 4. The chosen factor is a result of a trade-off between effectively dealing with the limited input space of the computer mouse and avoiding too much jitter from amplified hand vibrations. Zooming was achieved using the scroll wheel of the mouse, which initiated a change in magnification centered around the mouse pointer. This is contrary to the microscope, but in line with what is recommended for digital map navigation. [10]
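
A minimal sketch of this mouse behavior is given below. It is not the authors' implementation: the viewport representation, the zoom step per scroll notch, and all names are assumptions, while the amplification factor of 4 and the pointer-centered zoom follow the description above.

# Sketch of lock-mode panning and pointer-centered zooming (hypothetical names).

PAN_GAIN = 4.0      # amplification factor described in the text
ZOOM_STEP = 1.25    # magnification change per scroll-wheel notch (assumed value)

class Viewport:
    def __init__(self, x=0.0, y=0.0, scale=1.0):
        # (x, y): center of the field of view in slide coordinates
        # scale: screen pixels per slide unit (magnification)
        self.x, self.y, self.scale = x, y, scale

    def pan_locked(self, mouse_dx, mouse_dy):
        # Lock mode: pointer hidden, mouse motion translated directly to
        # panning, amplified to compensate for the limited input space.
        self.x += PAN_GAIN * mouse_dx / self.scale
        self.y += PAN_GAIN * mouse_dy / self.scale

    def zoom_at_pointer(self, wheel_steps, pointer_x, pointer_y):
        # Zoom centered on the pointer (slide coordinates), so the structure
        # under the cursor stays under the cursor.
        new_scale = self.scale * (ZOOM_STEP ** wheel_steps)
        self.x = pointer_x + (self.x - pointer_x) * self.scale / new_scale
        self.y = pointer_y + (self.y - pointer_y) * self.scale / new_scale
        self.scale = new_scale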

Touchpad

The size of the input space of the touchpad is even smaller than that of the computer mouse. In order to avoid extensive clutching (lifting the finger at the border of the touchpad and starting over from the opposite side), [11] rate control was used instead of position control. This makes sense for pathology since the most common task is to slowly pan over large distances, which would otherwise require intensive work with the hand. On the other hand, rate control makes it harder to execute quick movements, such as switching between multiple sections on the same slide. To better support this need, pan jumping functionality was implemented. By double-clicking and keeping the finger down, a mode is activated in which the main view immediately zooms out and temporarily switches to position control. Since zooming out decreases the size of the output space, it is then possible to pan to the desired location. By releasing the finger, the main view immediately zooms in again to the initial zoom level but at the new position.
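
A minimal sketch of this rate control and pan jumping logic is shown below, reusing the Viewport sketch from the computer mouse example. The event names, the class itself, and the speed and overview-scale values are assumptions rather than the study's implementation.

# Sketch of touchpad rate control with a pan-jump mode (hypothetical API).

class TouchpadNavigator:
    def __init__(self, viewport, pan_speed=1500.0, overview_scale=0.05):
        self.vp = viewport                  # Viewport from the previous sketch
        self.pan_speed = pan_speed          # assumed rate-control gain (pixels/s)
        self.overview_scale = overview_scale
        self.jump_mode = False
        self.saved_scale = None

    def on_touch_move(self, offset_x, offset_y, dt):
        if self.jump_mode:
            # Temporary position control while zoomed out.
            self.vp.x += offset_x / self.vp.scale
            self.vp.y += offset_y / self.vp.scale
        else:
            # Default: rate control, the finger offset sets a panning velocity.
            self.vp.x += self.pan_speed * offset_x * dt / self.vp.scale
            self.vp.y += self.pan_speed * offset_y * dt / self.vp.scale

    def on_double_click_hold(self):
        # Pan jumping: zoom out and switch temporarily to position control.
        self.jump_mode = True
        self.saved_scale = self.vp.scale
        self.vp.scale = self.overview_scale

    def on_release(self):
        if self.jump_mode:
            # Zoom back in to the original level at the new position.
            self.vp.scale = self.saved_scale
            self.jump_mode = False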

To enable quick switching between digital slides within a case, we implemented functionality where swiping the finger from the lower edge into the active area of the touchpad activates a slide selection mode. By moving the finger left and right, the pathologist can then switch between the available digital slides.

6 Degrees-of-Freedom Navigator

The input space of the 6DOF navigator is even smaller than that of the touchpad, so rate control is necessary. Initial user testing showed, however, that normal rate control made it hard to switch between sections within the same slide. With the 6DOF navigator, it was difficult to implement the same pan jumping functionality as for the touchpad due to the lack of ways to initiate mode switches. Instead, a nonlinear rate control was implemented. For small input movements, linear control was maintained. For larger input movements, typically at rates above the limit where motion blur makes it hard to see the image, the rate was squared, allowing quick movements in the output space.

Another issue detected from initial user testing was that zooming by dragging the puck of the 6DOF navigator outward was perceived as strenuous. As a remedy, zooming by rotating the puck was implemented.
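
The sketch below illustrates such a nonlinear rate mapping together with rotation-based zooming. The threshold and gain values are illustrative assumptions, and the normalization that keeps the curve continuous at the threshold is added here for readability; only the idea of squaring the rate above the motion-blur limit and of zooming by rotation comes from the text.

# Sketch of nonlinear rate control and rotation-to-zoom (illustrative values).

RATE_LIMIT = 0.4    # normalized deflection above which motion blur sets in (assumed)
PAN_GAIN = 2000.0   # panning speed in pixels/s at full linear deflection (assumed)
ZOOM_GAIN = 1.5     # zoom factor per second at full puck rotation (assumed)

def pan_rate(deflection):
    # Linear rate control for small deflections; squared above the limit,
    # enabling quick movements across the output space.
    magnitude = abs(deflection)
    if magnitude <= RATE_LIMIT:
        rate = magnitude
    else:
        rate = magnitude ** 2 / RATE_LIMIT  # continuous at the threshold
    return PAN_GAIN * rate * (1.0 if deflection >= 0 else -1.0)

def zoom_by_rotation(scale, rotation, dt):
    # Zooming by rotating the puck instead of pulling it outward.
    return scale * (ZOOM_GAIN ** (rotation * dt))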

The functionality of all input devices is summarized in [Figure 1]. To put the devices into the context of other possible input devices, the size of the input space, the control order, and the amplification are provided in [Table 1].
Figure 1: A summary of the functionality of the three devices used in the study

Table 1: Rough classification of different input devices for pathology




   Methods


This study was approved by the local institutional review board (2013/195-31).

The three different input device implementations were evaluated in a within-group balanced study design. Six pathologists were recruited by e-mail request from the pathology departments at Gävle Hospital, Linköping University Hospital, and Sahlgrenska University Hospital; two from each hospital. Of these six participants, two were consultant pathologists, three were last-year trainees, and one was a second-year trainee.

The cases were diagnosed using the prototype workstation software. To review the digital slides, a Dell UltraSharp U2713HM 27", 4-megapixel display was used. The user interface provided a main view, a navigation overview, and clickable thumbnails of the slides belonging to the open case.

Task

With each input device, the participants diagnosed a set of three cases of different tissue types. Each case was reviewed, and the participant was asked to state a diagnosis or to make a diagnostic plan, e.g., to order more stains or ask an expert. The participants were informed that the duration of the review was recorded, in order to put a small amount of time pressure on them, but they were also instructed that the accuracy of the diagnostic statement was more important than the speed of the review. A total of three case sets was used, and each set consisted of similar but different cases: A breast core needle biopsy (1 slide), a skin excision (3 slides), and a prostate core needle biopsy (6 slides). The cases were selected so that subspecialty training would not be required to review them; the purpose of this selection was to avoid large differences in strategy and time to diagnose while keeping the study within one session. With six participants, we chose to fully counterbalance the order in which the input devices were used while making sure that each case set was used with each input device the same number of times. The setup is given in [Table 2].
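
One way such a fully counterbalanced assignment can be generated is sketched below (the assignment actually used is the one in [Table 2]; the device and case set labels here are placeholders). All six orderings of the three devices are used, one per participant, and with the case sets always presented in the same order, each device meets each case set exactly twice.

# Sketch of a fully counterbalanced device/case set assignment for 6 participants.
from itertools import permutations

devices = ["computer mouse", "6DOF navigator", "touchpad"]
case_sets = ["case set 1", "case set 2", "case set 3"]   # placeholder labels

for participant, device_order in enumerate(permutations(devices), start=1):
    # Each device appears in each position twice across the 3! = 6 orders, so
    # pairing positions with a fixed case set order balances device vs. set.
    trials = list(zip(device_order, case_sets))
    print(f"Participant {participant}: {trials}")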
Table 2: Counterbalancing strategy used in the study



Before each trial, the participants received training in best practices using the devices, trying out different navigation techniques and being given time to familiarize themselves with the devices. The participants were asked to practice until they felt comfortable using each device, and then to start diagnosing a test case. This training process typically lasted for 10 min.

A trial was started by opening the case and starting a clock. The participant decided when he or she was done by saying so out loud, and the clock was stopped. This was followed by the participant stating the diagnosis out loud. The accuracy of the diagnosis was not analyzed. The total time per case was recorded, and a normalized time per case was derived (the time spent expressed as a fraction of the average time spent on that case by all pathologists) in order to compensate for normal variations in case difficulty. [12]
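
The normalization amounts to dividing each recorded time by the mean time all pathologists spent on the same case, as in the sketch below (the numbers are made up for illustration):

# Sketch of the per-case time normalization (made-up example values).

times = {  # seconds per case, keyed by (pathologist, case)
    ("P1", "breast"): 240.0, ("P2", "breast"): 300.0, ("P3", "breast"): 180.0,
    ("P1", "skin"): 420.0, ("P2", "skin"): 360.0, ("P3", "skin"): 480.0,
}

# Mean time per case over all pathologists.
per_case = {}
for (_, case), t in times.items():
    per_case.setdefault(case, []).append(t)
case_means = {case: sum(ts) / len(ts) for case, ts in per_case.items()}

# Normalized time: fraction of the average time spent on that case.
normalized = {key: t / case_means[key[1]] for key, t in times.items()}
print(normalized)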

After each trial, the participant filled in a NASA-TLX task survey [13] translated into Swedish. This survey is an established method for quantifying perceived workload and has been used in thousands of studies. [14] This was followed by a short semi-structured interview based on six quality factors for navigation in large spaces, derived from the work of Bowman et al. [15] and Stellmach and Dachselt: [16] speed, accuracy, spatial awareness, ease of use, information gathering, and well-being. After all trials had been completed, a NASA-TLX weighting factor survey was filled in, and the participants were asked to rank the input devices in order of preference and to explain why. Statistical analysis was performed with the SciPy stats module (http://www.scipy.org, v0.14.0) using a significance level of P < 0.05.
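
For reference, an analysis of this kind can be run with scipy.stats roughly as sketched below. The weighted TLX values are placeholders, and the use of independent-sample t-tests is inferred from the "three independent groups" ANOVA described in the Results; this is an assumption, not a reproduction of the original analysis.

# Sketch of the statistical analysis using scipy.stats (placeholder data).
from scipy import stats

tlx_mouse = [30.1, 25.4, 40.2, 28.0, 33.5, 22.7]      # made-up weighted TLX scores
tlx_6dof = [28.3, 24.9, 35.0, 27.1, 30.2, 21.5]
tlx_touchpad = [45.6, 50.2, 55.1, 48.3, 52.0, 47.4]

# One-way ANOVA across the three device conditions.
f_stat, p_anova = stats.f_oneway(tlx_mouse, tlx_6dof, tlx_touchpad)
print("ANOVA:", f_stat, p_anova)

# Pairwise two-sided t-tests with Bonferroni correction (3 comparisons).
pairs = [("mouse vs touchpad", tlx_mouse, tlx_touchpad),
         ("6DOF vs touchpad", tlx_6dof, tlx_touchpad),
         ("mouse vs 6DOF", tlx_mouse, tlx_6dof)]
for name, a, b in pairs:
    t_stat, p_value = stats.ttest_ind(a, b)
    print(name, "corrected P =", min(p_value * len(pairs), 1.0))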


   Results


Overall, the participants reused the same diagnostic strategy for the same type of case across the different conditions. One obvious outlier was removed from the data: One pathologist used a very careful review strategy for the first case but then explicitly stated that a faster strategy would be used from then on, and the time spent on the remaining two conditions was indeed less than half that of the first. With this outlier removed, no significant effect of the order and case set used on the time to diagnose was found (F(2,34) = 1.07, P > 0.05).

Using a one-way ANOVA with three independent groups, a significant effect of input device on the weighted NASA-TLX index was found (F(2,10) = 3.76, P < 0.05). Using Bonferroni-corrected two-sided t-tests, both the 6DOF navigator and the computer mouse had a significantly lower perceived workload than the touchpad (P < 0.05). The weighted NASA-TLX index values are reported in [Figure 2], and the different subscales in [Figure 3].
Figure 2: Difference in NASA task load index between the three different conditions using three different input devices

Figure 3: NASA task load index factors. CM: Computer mouse, 6D: 6 degree-of-freedom navigator, TP: Touchpad



There was no significant effect of input device on total time per case (F(2,34) = 0.06, ns) and no significant effect of input device on normalized time per case (F(2,34) = 0.52, ns). No effect of order within a trial on the normalized time was found for any of the devices: computer mouse (F(2,10) = 0.67, ns), touchpad (F(2,10) = 0.09, ns), and 6DOF navigator (F(2,10) = 0.09, ns). The pathologists spent a median of 68.0 s per slide, with an interquartile range of 48.0-108 s. The median and interquartile range of slide times and normalized times are reported in [Table 3]. Statements based on the quality factors were summarized and are reported in [Table 4]. Note that these data are anecdotal in nature, so it is hard to weigh the importance of the different statements.
Table 3: Time spent per case using different input devices

Table 4: Summary of participants' statements per quality factor and device



Five out of the six pathologists ranked the 6DOF navigator as their preferred navigation device. Most thought that the computer mouse and the 6DOF navigator were equally good in this trial, but preferred the 6DOF navigator because it was expected to be more comfortable to use long-term.


   Conclusions


The results of this study indicate that there is more to workstation design than application features and diagnostic speed. The touchpad condition is a good example of this: its individual features were liked, and the speed of diagnosis was comparable to that of the other devices. At the same time, it was the least preferred device for five out of the six pathologists, and its perceived workload for diagnosing cases was significantly higher than for the other devices.

In related studies, within laparoscopic surgery, [17] a laboratory vigilance task, [18] and a two-dimensional navigation tracking task, [19] perceived workload has been associated with a decrease in different measures of performance, which is in contrast to the findings of this study, where no effect on time performance was found. However, diagnostic performance was not measured in this study. An ideal input device should enable easy navigation so that the pathologist's effort can be put into the review of the case; a low perceived workload is therefore something to strive for.

The participants only practiced for around 10 min, which might bias the results in favor of the computer mouse, since it should have been the most familiar device. However, no improvement in time performance could be observed over the course of the trials; hence, 10 min of practice appears to have been sufficient to remove any large learning effects on time performance. For the perceived workload measure, it was not possible to control for a possible learning effect. However, the NASA-TLX questionnaires were filled in at the end of each trial and should, therefore, reflect the performance after around 30 min of practice. This can be compared to the results of a study of perceived workload in a laboratory tracking task with acceleration control, in which most of the learning effect on perceived workload had disappeared after 10 min of practice, although it continued to decline slowly. [20] While the workload of the touchpad can therefore be assumed to continue to decrease with more practice, there is currently no rationale indicating that it would reach a lower workload than the other devices.

The study also highlights the difference in the perception of a system across different time scales. On an ultra-short time scale, like a typical sales demonstration, the distinct features are the most important. [21] On a short time scale, like this study, immediate simplicity and comfort are valued, whereas in a long-term perspective ergonomic issues come into play. This was expressed by a few pathologists who stated that while the computer mouse worked well in the test situation, it would be worse to use every day.

Measuring time performance might be problematic since participants risk interpreting the study situation differently and thus perform the task under nonuniform time pressure. It is, therefore, important to explicitly state how the participant should relate to time. The instruction used in this study resulted in consistent goal-directed behavior between conditions. At the same time, the participants did not perceive a high temporal demand which might otherwise cause unwanted interaction with other measurements.

Overall, it can be concluded that the 6DOF navigator examined here outperformed the two alternative devices; it was preferred by five out of six pathologists. Future studies could investigate other devices, such as trackballs or stage knob imitations, and should especially focus on the effects of long-term use and on ensuring that a device provides sufficient functionality when more advanced tasks, such as making annotations and measurements, are included in the scope.

 
   References

1. Pantanowitz L, Valenstein PN, Evans AJ, Kaplan KJ, Pfeifer JD, Wilbur DC, et al. Review of the current state of whole slide imaging in pathology. J Pathol Inform 2011;2:36.
2. Stathonikos N, Veta M, Huisman A, van Diest PJ. Going fully digital: Perspective of a Dutch academic pathology lab. J Pathol Inform 2013;4:15.
3. Thorstenson S, Molin J, Lundström C. Implementation of large-scale routine diagnostics using whole slide imaging in Sweden: Digital pathology experiences 2006-2013. J Pathol Inform 2014;5:14.
4. Al-Janabi S, Huisman A, Vink A, Leguit RJ, Offerhaus GJ, ten Kate FJ, et al. Whole slide images for primary diagnostics of gastrointestinal tract pathology: A feasibility study. Hum Pathol 2012;43:702-7.
5. Sharp H, Rogers Y, Preece J. Understanding and conceptualizing interaction. In: Interaction Design: Beyond Human-Computer Interaction. 2nd ed. 2007. p. 61-3.
6. Beard DV, Molina PL, Muller KE, Denelsbeck KM, Hemminger BM, Perry JR, et al. Interpretation time of serial chest CT examinations with stacked-metaphor workstation versus film alternator. Radiology 1995;197:753-8.
7. Onega T, Weaver D, Geller B, Oster N, Tosteson AN, Carney PA, et al. Digitized whole slides for breast pathology interpretation: Current practices and perceptions. J Digit Imaging 2014;27:642-8.
8. Randell R, Ambepitiya T, Mello-Thoms C, Ruddle RA, Brettle D, Thomas RG, et al. Effect of display resolution on time to diagnosis with virtual pathology slides in a systematic search task. J Digit Imaging 2014.
9. Hornbæk K, Bederson B, Plaisant C. Navigation patterns and usability of zoomable user interfaces with and without an overview. ACM Trans Comput Hum Interact 2002;9:362-89.
10. May J, Gamble T. Collocating interface objects: Zooming into maps. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems; 2014. p. 2085-94.
11. Wozniak P, Fjeld M, Zhao S. Limiting trial and error: Introducing a systematic approach to designing clutching. In: Proceedings of the Second International Symposium of Chinese CHI; 2014. p. 35-9.
12. Randell R, Ruddle RA, Mello-Thoms C, Thomas RG, Quirke P, Treanor D. Virtual reality microscope versus conventional microscope regarding time to diagnosis: An experimental study. Histopathology 2013;62:351-8.
13. Hart SG, Staveland LE. Development of NASA-TLX (Task Load Index): Results of empirical and theoretical research. Adv Psychol 1988;52:139-83.
14. Hart SG. NASA-Task Load Index (NASA-TLX); 20 years later. In: Proceedings of the Human Factors and Ergonomics Society Annual Meeting; 2006. p. 904-8.
15. Bowman DA, Koller D, Hodges LF. Travel in immersive virtual environments: An evaluation of viewpoint motion control techniques. In: Proceedings of the Virtual Reality Annual International Symposium; 1997. p. 45-52.
16. Stellmach S, Dachselt R. Investigating gaze-supported multimodal pan and zoom. In: Proceedings of the Symposium on Eye Tracking Research and Applications (ETRA '12); 2012. p. 357-60.
17. Yurko YY, Scerbo MW, Prabhu AS, Acker CE, Stefanidis D. Higher mental workload is associated with poorer laparoscopic performance as measured by the NASA-TLX tool. Simul Healthc 2010;5:267-71.
18. Warm JS, Parasuraman R, Matthews G. Vigilance requires hard mental work and is stressful. Hum Factors 2008;50:433-41.
19. Hancock PA. Effects of control order, augmented feedback, input device and practice on tracking performance and perceived workload. Ergonomics 1996;39:1146-62.
20. Hancock PA, Robinson MA, Chu AL, Hansen DR, Vercruyssen M, Grose E, et al. The effects of practice on tracking and subjective workload. Proc Hum Factors Ergon Soc Annu Meet 1989;33:1310-4.
21. Jorritsma W, Cnossen F, van Ooijen PM. Merits of usability testing for PACS selection. Int J Med Inform 2014;83:27-36.

