Haptics for Tumour Detection During Minimally Invasive Surgery


"In offering an unrivalled functionality and integrating easily with third party technologies, National Instruments products helped us meet our project demands with space left for future system development."

- Earle Jamieson, University of Leeds

The Challenge:
Developing a system to measure and dynamically simulate the forces perceived by a surgeon during palpation (examination through touch) in robot-assisted surgery.

The Solution:
Using NI CompactDAQ hardware to create an automated physical palpation system that measures the response forces of silicone tissue models with embedded artificial tumours; using the built-in graphics and interface tools of NI LabVIEW software to create a virtual surgical haptic system; and integrating measured data from the physical testing system into the haptic system so that users can interact with simulated diseased human tissue through touch.

James Chandler - University of Leeds
Matthew Dickson - University of Leeds
Earle Jamieson - University of Leeds
Thomas Mueller - University of Leeds
Thomas Reid - University of Leeds

Each year, more than 10 million people worldwide are diagnosed with cancer. More than one in three people develop some form of cancer in their lifetime and approximately one in four of all deaths are caused by it. Cancer commonly manifests as hard, abnormal masses (tumours) embedded within softer tissue (organs). In the case of malignant tumours, early detection and accurate removal increase the patient’s likelihood of survival. In recent years, we have seen surgical procedures transfer from traditional open surgery to minimally invasive surgery (MIS), and more recently, to robot-assisted laparoscopic surgery. These advances have shown significant benefits over open surgery, but the lack of direct physical contact has resulted in the loss of haptic (force and touch) feedback, which is required for assessing tissue features through palpation.

At The University of Leeds in the UK, we developed a simulation system that delivers haptic feedback to a user during a virtual MIS palpation exercise. Potential applications for the system include surgical training and further development into a master/slave palpation device. The long-term goal is to overcome this drawback of new surgical technologies by restoring palpation, improving both tumour detection and resection accuracy. To achieve this, we required hardware I/O, third-party hardware interfacing, virtual graphics, and custom data handling and processing. Other systems use a combination of embedded hardware and a range of programming environments, but we realised that we could achieve all of this functionality using just LabVIEW and NI CompactDAQ, giving inherent compatibility between the various project functions.

System Concept

To simulate the palpation of human tissue, LabVIEW was used to create a virtual environment that presents the user with a probe and tissue sample within a patient’s abdomen. A haptic device provides haptic interaction with the virtual environment. LabVIEW was also used to control a custom-built physical testing environment in which silicone tissue models were palpated with a force-sensing probe. The physical tests were primarily performed to validate the data obtained from a finite element analysis (FEA); establishing communication between the physical testing environment and the haptic device also gave us an opportunity to explore the system’s remote palpation capabilities. The response forces provided to the user in the LabVIEW virtual environment were determined using FEA.

The Physical Measurement System

To measure response forces from silicone tissue models during palpation, we developed a tri-axial Cartesian robotic system capable of moving an instrumented palpation probe relative to the tissue models. Using LabVIEW and NI CompactDAQ we were able to go from concept to solution in a matter of weeks. The system produces response surfaces of tissue models by recording force measurements during palpation at specified in-plane positions.

The NI CompactDAQ offered a quick and elegant method of sending signals to our motor controllers and allowed us to record position and force measurements. We programmed the system to run autonomously using a LabVIEW state machine architecture, so we could adjust parameters such as indentation depth and palpation resolution directly from the front panel.
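The scan described above can be pictured as a simple raster loop: move the probe to each in-plane grid position, indent to a set depth, and record the response force, building up a response surface that peaks over the stiffer embedded tumour. The following is a minimal Python sketch of that idea with a synthetic phantom model; all function names, stiffness values, and the Gaussian tumour model are illustrative, not taken from the original LabVIEW code.

```python
import math

def tissue_response(x, y, depth, k_soft=0.8, k_tumour=3.0,
                    cx=5.0, cy=5.0, sigma=1.5):
    """Synthetic silicone-phantom model (illustrative): a Hookean response
    whose stiffness rises near an embedded 'tumour' centred at (cx, cy)."""
    proximity = math.exp(-((x - cx) ** 2 + (y - cy) ** 2) / (2 * sigma ** 2))
    k = k_soft + (k_tumour - k_soft) * proximity
    return k * depth  # response force, arbitrary units

def palpation_scan(nx, ny, pitch, depth):
    """Raster-scan the probe over an nx-by-ny grid of in-plane positions,
    recording the response force at each point (the 'response surface')."""
    return [[tissue_response(i * pitch, j * pitch, depth)
             for i in range(nx)]
            for j in range(ny)]

# An 11 x 11 scan at 1 mm pitch and 2 mm indentation depth
surface = palpation_scan(nx=11, ny=11, pitch=1.0, depth=2.0)
```

In the real system the two loop indices correspond to axis moves of the Cartesian robot, and `tissue_response` is replaced by an actual force reading from the instrumented probe.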

The Haptic Surgical System

To simulate the visual and haptic aspects of palpation during surgery, we created a bespoke DLL to interface with the haptic device (PHANToM Omni, SensAble Technologies). This allows two-way communication between LabVIEW and the OpenHaptics API to perform functions such as measuring the device end-effector position and programmatically applying force through the device. The Call Library Function Node exports and imports data to and from the DLL to set up the required parameters for the system. This means developers can access the device’s functions and build ready-made subVIs to create flexible haptic scenes quickly and easily, without needing to work with the low-level device functions.

Force is generated by sending predetermined forcing variables to the DLL from LabVIEW. These are then applied dynamically using a Gaussian function to generate a force in a haptic control loop that operates at a 1 kHz frequency. A stiffness function (based on Hooke’s Law, F = kx) is then used to adjust the force as a function of the indentation depth. This results in high-fidelity haptic feedback with smooth force rendering during tissue interaction. The LabVIEW 3D Toolkit was used to create the visual scene, which includes a deformable tissue surface under manipulation by a robotic probe. A height array is programmatically updated depending on the position of the end effector to deliver representative visual deformation of the surface. Objects in the final visualisation are imported from virtual reality modeling language (VRML) CAD geometry files to increase the quality of the rendered scene. Coupling the user’s sense of touch with visual feedback in this way mimics real-world physical interaction.
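One tick of the force model described above can be sketched as a Hookean term, F = kx, modulated by a Gaussian of the probe's lateral distance from the tumour centre, so the felt stiffness rises smoothly as the probe passes over it. This Python sketch shows the shape of the computation only; the function name, gain, and Gaussian width are assumptions, and the real loop runs in the DLL at 1 kHz.

```python
import math

def haptic_force(indentation, lateral_dist, k=2.0, amplitude=1.0, sigma=0.01):
    """Illustrative per-tick force: Hookean stiffness (F = k * x) scaled up
    by a Gaussian bump near the tumour centre. Units are arbitrary."""
    if indentation <= 0.0:        # probe not in contact: render no force
        return 0.0
    bump = amplitude * math.exp(-lateral_dist ** 2 / (2 * sigma ** 2))
    return k * indentation * (1.0 + bump)
```

Because both the Hookean term and the Gaussian are continuous, the commanded force changes smoothly between 1 kHz ticks, which is what gives the interaction its stable, high-fidelity feel.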

To test the final system and assess how well users could detect tumours within the virtual tissues, we carried out a human factors study. This was automated within the code, allowing randomised tissue surfaces to be loaded automatically and other variables to be controlled programmatically, altogether improving the validity of our statistical results. LabVIEW made it easy to implement and customise our trials, allowing robust data handling and postprocessing.
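The randomisation step in such a study typically means giving each participant an independently shuffled presentation order of the tissue surfaces, so order effects average out across the cohort. A minimal Python sketch of that scheduling idea follows; the names and structure are illustrative, not the original trial code.

```python
import random

def randomised_trials(surface_ids, participants, seed=42):
    """Build a per-participant presentation schedule: each participant sees
    every virtual tissue surface once, in an independently shuffled order.
    A fixed seed makes the schedule reproducible for later analysis."""
    rng = random.Random(seed)
    schedule = {}
    for p in participants:
        order = list(surface_ids)
        rng.shuffle(order)
        schedule[p] = order
    return schedule

plan = randomised_trials(["A", "B", "C", "D"], ["P1", "P2", "P3"])
```

Driving the trial loader from such a schedule is what lets the surfaces be "loaded automatically" while keeping the presentation order statistically unbiased.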

Advantages of the NI Solution

NI hardware and LabVIEW gave us a solution that exceeded all the expectations of our project. By offering unrivalled functionality and easy integration with third-party technologies, National Instruments products helped us meet our project demands with space left for future system development. In addition, National Instruments has an excellent support network, from their dedicated support personnel to the extensive website; it was always possible to get the advice we needed.

Author Information:
James Chandler
University of Leeds
University of Leeds, Woodhouse Lane
Leeds LS2 9JT
United Kingdom

