Session Interaction & Evaluation

Don’t miss the complete programme of VCBM 2018!

Thursday 20th September, 11:30-13:15. Room Andalucia I

Interaction Techniques for Immersive CT Colonography: A Professional Assessment
MICCAI invited talk

Daniel Lopes, Daniel Medeiros, Soraia Paulo, Pedro Borges, Vitor Nunes, Vasco Mascarenhas, Marcos Veiga, Joaquim Jorge

CT Colonography (CTC) is considered the leading imaging technique for colorectal cancer (CRC) screening. However, conventional CTC systems rely on clumsy 2D input devices and stationary flat displays that make it hard to perceive the colon structure in 3D. To visualize such anatomically complex data, the immersion and freedom of movement afforded by Virtual Reality (VR) systems promise to help clinicians improve 3D reading, thus enabling more expeditious diagnoses. To this end, we propose iCOLONIC, a set of interaction techniques that use VR to perform CTC reading. iCOLONIC combines immersive fly-through navigation with positional tracking, multi-scale representations, and mini-maps to guide radiologists and surgeons while navigating through the colon. Contrary to stationary VR solutions, iCOLONIC allows users to walk freely within a workspace to analyze both local and global 3D features. To assess whether our non-stationary VR approach can help clinicians improve 3D colon reading and 3D perception, we conducted a user study with three senior radiologists, three senior general surgeons, and one neuroradiology intern. Results from formal evaluation sessions demonstrate iCOLONIC's usability and feasibility, as the proposed interaction techniques were seen to improve spatial awareness and promote more fluent navigation. In addition, participants reported that our approach shows great potential to speed up the screening process.
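
The fly-through navigation described in the abstract can be thought of as anchoring the virtual camera to a point on the colon centerline while room-scale head tracking moves the user around that anchor. The snippet below is a minimal, hypothetical sketch of this idea only, not the authors' implementation; the centerline, offsets, and names are illustrative.

```python
import numpy as np

def flythrough_pose(centerline: np.ndarray, t: float):
    """Camera position and view direction at parameter t in [0, 1] along a
    polyline centerline of shape (N, 3)."""
    seg = np.diff(centerline, axis=0)
    seg_len = np.linalg.norm(seg, axis=1)
    arc = np.concatenate([[0.0], np.cumsum(seg_len)])
    s = np.clip(t, 0.0, 1.0) * arc[-1]                  # target arc length
    i = min(np.searchsorted(arc, s, side="right") - 1, len(seg_len) - 1)
    u = (s - arc[i]) / seg_len[i]                       # position within segment i
    return centerline[i] + u * seg[i], seg[i] / seg_len[i]

# Non-stationary VR: the tracked head offset from positional tracking is added
# on top of the fly-through anchor, so the user can walk around it locally.
centerline = np.array([[0, 0, 0], [10, 0, 0], [15, 5, 0], [15, 10, 3]], dtype=float)
anchor, view_dir = flythrough_pose(centerline, t=0.4)
head_offset = np.array([0.10, 0.00, 1.70])              # metres, from the HMD tracker
camera_position = anchor + head_offset
```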

ICG-based Augmented-Reality System for Sentinel Lymph Node Biopsy
Short Paper

Matthias Noll, Werner Noa-Rudolph, Stefan Wesarg, Michael Kraly, Ingo Stoffels, Joachim Klode, Cédric Spass, and Gerrit Spass

In this paper, we introduce a novel augmented-reality (AR) system for sentinel lymph node (SLN) biopsy. The AR system consists of a cubic recording device with integrated stereo near-infrared (NIR) and stereo color cameras, a head-mounted AR display (HMD) that visualizes the SLN information directly in the physician's view, and a controlling software application. Labeling of the SLN is achieved with the fluorescent dye indocyanine green (ICG). The dye accumulates in the SLN, where it is excited to fluorescence by applying infrared light. The fluorescence is recorded from two directions by the NIR stereo cameras using appropriate filters. Using the known rigid camera geometry, an ICG depth map can be generated from the camera images, creating a live 3D representation of the SLN. This representation is then superimposed onto the physician's field of view by applying a series of coordinate system transformations that are determined in four separate system calibration steps. To compensate for head motion, the recording system is continuously tracked by a single camera on the HMD using fiducial markers. Because no additional monitors are required when using the system, the physician's attention is kept solely on the operation site, making the intervention faster and safer for the patient.
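
As a rough illustration of the geometric pipeline sketched in the abstract (stereo disparity from the calibrated NIR pair, reprojection to an ICG depth map, and a chain of calibrated transforms into the HMD view), here is a hedged Python/OpenCV sketch. It is not the authors' implementation; the calibration values, images, and transform names are placeholders.

```python
import cv2
import numpy as np

# Placeholder calibration and images (in the real system these come from the
# stereo calibration of the NIR cameras and from the live NIR video streams).
image_size = (640, 480)
K1 = K2 = np.array([[800.0, 0.0, 320.0], [0.0, 800.0, 240.0], [0.0, 0.0, 1.0]])
D1 = D2 = np.zeros(5)
R, T = np.eye(3), np.array([[-0.05], [0.0], [0.0]])      # 5 cm stereo baseline
nir_left = np.random.randint(0, 255, (480, 640), dtype=np.uint8)
nir_right = np.random.randint(0, 255, (480, 640), dtype=np.uint8)

# Rectify, match, and reproject: disparity -> depth via the known rigid geometry.
R1, R2, P1, P2, Q, _, _ = cv2.stereoRectify(K1, D1, K2, D2, image_size, R, T)
m1 = cv2.initUndistortRectifyMap(K1, D1, R1, P1, image_size, cv2.CV_32FC1)
m2 = cv2.initUndistortRectifyMap(K2, D2, R2, P2, image_size, cv2.CV_32FC1)
left = cv2.remap(nir_left, *m1, cv2.INTER_LINEAR)
right = cv2.remap(nir_right, *m2, cv2.INTER_LINEAR)
disparity = cv2.StereoSGBM_create(0, 64, 7).compute(left, right).astype(np.float32) / 16.0
points_cam = cv2.reprojectImageTo3D(disparity, Q)        # ICG depth map, NIR camera frame

# Chain of calibrated transforms (all identity here as placeholders):
# NIR camera -> recording cube -> fiducial marker -> HMD tracking camera -> display.
T_display_from_cam = np.eye(4) @ np.eye(4) @ np.eye(4) @ np.eye(4)
pts = points_cam.reshape(-1, 3)
pts_h = np.c_[pts, np.ones(len(pts))]
points_display = (T_display_from_cam @ pts_h.T).T[:, :3]  # SLN geometry in the HMD frame
```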

A Prototype Holographic Augmented Reality Interface for Image-Guided Prostate Cancer Interventions
Short Paper

Cristina Morales Mojica, Jose Velazco, Nikhil Navkar, Shidin Balakrishnan, Julien Abinahed, Walid El Ansari, Khalid Al-Rumaihi, Adham Darweesh, Abdulla Al Ansari, Mohamed Gharib, Mansour Karkoub, Ernst Leiss, Ioannis Seimenis, Nikolaos Tsekos

Motivated by the potential of holographic augmented reality (AR) to offer an immersive 3D appreciation of morphology and anatomy, the purpose of this work is to develop and assess an interface for image-based planning of prostate interventions with a head-mounted display (HMD). The computational system is a data and command pipeline that links a magnetic resonance imaging (MRI) scanner and its data to the operator, and includes modules dedicated to image processing and segmentation, structure rendering, trajectory planning, and spatial co-registration. The interface was developed with the Unity3D Engine (C#) and deployed and tested on a HoloLens HMD. For ergonomics in the surgical suite, the system was endowed with hands-free interactive manipulation of images and the holographic scene via hand gestures and voice commands. The system was tested in silico using MRI and ultrasound datasets of prostate phantoms. The holographic AR scene rendered by the HoloLens HMD was subjectively found superior to desktop-based volume or 3D rendering with regard to structure detection, appreciation of spatial relationships, planning of access paths, and manual co-registration of MRI and ultrasound. By inspecting the virtual trajectory superimposed on the rendered structures and MR images, the operator observes collisions of the needle path with vital structures (e.g., the urethra) and adjusts accordingly. Holographic AR interfacing with a wireless HMD endowed with hands-free gesture and voice control is a promising technology; studies are needed to systematically assess the clinical merit of such systems and the required functionality.
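
The trajectory-planning step described above essentially tests a planned needle path against segmented critical structures. A minimal, hypothetical sketch of such a check in image space is given below; the paper's interface is implemented in Unity3D/C#, so this Python fragment, its function name, and the toy mask are illustrative only.

```python
import numpy as np

def trajectory_hits_structure(entry_mm, target_mm, mask, voxel_size_mm, n_samples=200):
    """Return True if the straight path from entry to target (in mm, assumed aligned
    with the mask axes for simplicity) crosses any labelled voxel of the structure."""
    samples = np.linspace(entry_mm, target_mm, n_samples)       # (n, 3) points in mm
    idx = np.round(samples / voxel_size_mm).astype(int)         # voxel indices
    inside = np.all((idx >= 0) & (idx < np.array(mask.shape)), axis=1)
    return bool(mask[tuple(idx[inside].T)].any())

# Toy example: a 3-voxel-thick "urethra" in the middle of a 100^3 label volume.
mask = np.zeros((100, 100, 100), dtype=bool)
mask[48:51, 48:51, :] = True
collides = trajectory_hits_structure(np.array([10.0, 10.0, 10.0]),
                                     np.array([90.0, 90.0, 90.0]),
                                     mask, voxel_size_mm=np.array([1.0, 1.0, 1.0]))
print("needle path intersects critical structure:", collides)
```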

Visual Analytics in Histopathology Diagnostics: A Protocol-Based Approach
Full Paper

Alberto Corvò, Michel A. Westenberg, Marc A. van Driel, and Jarke J. van Wijk

Computer-Aided Diagnosis (CAD) systems supporting the diagnostic process are widespread in radiology; digital pathology, however, still lags behind in the introduction of such solutions. Several studies have investigated pathologists' behavior, but only a few have aimed to improve the diagnostic and reporting process with novel applications. In this work, we designed and implemented a first protocol-based CAD viewer supported by visual analytics. The system targets the optimization of the diagnostic workflow in breast cancer diagnosis by means of three image analysis features that belong to the standard grading system (Nottingham Histologic Grade). A pathologist's routine was tracked during the examination of breast cancer tissue slides, and the diagnostic traces were analyzed from a qualitative perspective. Accordingly, a set of generic requirements was elicited to define the design and implementation of the CAD viewer. A first qualitative evaluation conducted with five pathologists shows that the interface supports the diagnostic workflow and reduces manual effort. We present promising evidence of the usefulness of our CAD viewer and opportunities for its extension and integration into clinical practice. In conclusion, the findings demonstrate that it is feasible to optimize the Nottingham grading workflow and, more generally, histological diagnosis by integrating computational pathology data with visual analytics techniques.
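
For readers unfamiliar with the grading system referenced above: the Nottingham Histologic Grade combines three component scores (tubule formation, nuclear pleomorphism, mitotic count), each rated 1-3, into an overall grade. The snippet below illustrates only this standard scoring scheme, not the authors' CAD viewer.

```python
def nottingham_grade(tubule_formation: int, nuclear_pleomorphism: int, mitotic_count: int) -> int:
    """Combine the three component scores (each 1-3) into the overall grade (1-3)."""
    for score in (tubule_formation, nuclear_pleomorphism, mitotic_count):
        if score not in (1, 2, 3):
            raise ValueError("each component score must be 1, 2 or 3")
    total = tubule_formation + nuclear_pleomorphism + mitotic_count   # ranges from 3 to 9
    if total <= 5:
        return 1   # grade 1: well differentiated
    if total <= 7:
        return 2   # grade 2: moderately differentiated
    return 3       # grade 3: poorly differentiated

print(nottingham_grade(2, 3, 2))   # -> 2
```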

Automatic Generation of Web-Based User Studies to Evaluate Depth Perception in Vascular Surface Visualizations
Full Paper

Monique Meuschke, Noeska Smit, Nils Lichtenberg, Bernhard Preim, Kai Lawonn

User studies are often required in biomedical visualization application papers in order to provide evidence for the utility of the presented approach. An important aspect is how well depth information can be perceived, as depth encoding is essential for an understandable representation of complex data. Unfortunately, in practice there is often little time available to perform such studies, and setting up and conducting user studies can be labor-intensive. In addition, it can be challenging to recruit enough participants to support the contribution claims of the paper. In this paper, we propose a system that allows biomedical visualization researchers to quickly generate perceptual task-based user studies for novel surface visualizations and to run the resulting experiment via a web interface. This approach reduces the effort of setting up the user studies themselves, while its web-based nature can help researchers attract more participants to their study. We demonstrate our system using the specific application of depth judgment tasks to evaluate vascular surface visualizations, since this area has attracted considerable recent interest. However, the system is also generally applicable to conducting task-based user studies in biomedical visualization.
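
The core of such a system is the automatic generation of randomized, task-based trials that a web front end can present to remote participants. The sketch below is a hypothetical illustration of that idea for a two-alternative depth-judgment task; the technique names, file names, and JSON schema are assumptions, not the authors' format.

```python
import json
import random

techniques = ["phong", "pseudo_chromadepth", "fog_shading"]    # rendering styles under study
stimuli = [f"vessel_tree_{i:02d}.png" for i in range(5)]       # pre-rendered stimuli with two markers

def generate_trials(participant_id: str, repetitions: int = 2, seed: int = 42) -> dict:
    """Build a shuffled list of depth-judgment trials for one participant."""
    rng = random.Random(seed)
    trials = [
        {"technique": t, "stimulus": s, "repetition": r,
         # In a real setup the correct answer (which marker is closer to the
         # viewer) comes from the rendering metadata; here it is a placeholder.
         "correct_answer": rng.choice(["A", "B"])}
        for t in techniques for s in stimuli for r in range(repetitions)
    ]
    rng.shuffle(trials)                                        # randomized presentation order
    return {"participant": participant_id, "task": "depth_judgment", "trials": trials}

with open("study_config.json", "w") as f:
    json.dump(generate_trials("p01"), f, indent=2)
```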

A Critical Analysis of the Evaluation Practice in Medical Visualization
Full Paper

Bernhard Preim, Timo Ropinski, Petra Isenberg

Medical visualization aims at directly supporting physicians in diagnosis and treatment planning, students and residents in medical education, and medical physicists as well as other medical researchers in answering specific research questions. To assess whether individual medical visualization techniques or entire medical visualization systems are useful in this respect, empirical evaluations involving participants from the target user group are indispensable. The human-computer interaction field has developed a wide range of evaluation instruments, and the information visualization community has more recently adapted and refined these instruments for evaluating (information) visualization systems. Medical visualization, however, often lags behind and should pay more attention to evaluation, in particular to evaluations in realistic settings that assess how visualization techniques contribute to cognitive activities, such as deciding on a surgical strategy or other complex treatment decisions. In this vein, evaluations performed over a longer period are promising, since they make it possible to investigate how techniques are adapted in practice. In this paper, we discuss the evaluation practice in medical visualization based on selected examples and contrast these evaluations with the broad range of existing empirical evaluation techniques. We would like to emphasize that this paper is not a general call for evaluation in medical visualization; rather, it argues that each situation must be assessed individually and that evaluations, when they are carried out, should be done more carefully.
