Developing a Robotic Manipulator for Cancer Therapy Using Graphical System Design
NI LabVIEW is used to control the robotic arm responsible for extremely precise photodynamic therapy application for cancer patients.
"Using LabVIEW, we get a smoother move resulting in less abrupt transitions, which both saves time and improves performance."
- Assad Kallassy, Lebanese University - Second Branch
The Challenge:
Developing an automated robotic manipulator for performing photodynamic therapy (PDT) on cancer patients.
The Solution:
Using graphical system design to design a robot capable of precise movement and highly accurate placement of PDT therapy.
Author(s):
Assad Kallassy - Lebanese University - Second Branch
Houssam Bitar - Lebanese University - Second Branch
Georges Issa - Lebanese University - Second Branch
When treating cancer, oncologists select from a number of techniques depending on the type and stage of the tumor in question. The most common techniques used today are photodynamic therapy, surgery, radiation therapy, chemotherapy, hormone therapy, and immunotherapy.
PDT is a special form of phototherapy, a term comprising all treatments that use light to induce beneficial reactions in a patient’s body. PDT is a new technique capable of destroying unwanted tissue while sparing normal tissue.
During PDT treatment, a drug called a photosensitizer is administered to the patient by injection. The photosensitizer alone is harmless and has no effect on either healthy or abnormal tissue. However, when light emitted by a laser is directed at the tissue containing the drug, the drug is activated and the tissue is rapidly destroyed precisely where the light has been directed. This technique allows for focused targeting of the abnormal tissue with careful application of the light beam, which translates into more effective treatment.
At the Lebanese University, we have developed an automated robotic mechanical manipulator whose primary function consists of skimming along the patient’s skin while performing the PDT technique. The robot moves the laser heads over the affected area of the patient’s body in certain geometrical designs, such as circular or elliptical shapes, so that the tumor can be destroyed.
Achieving a geometrical shape over a patient’s body requires five movements:
- Three translations whose functions are defined as follows:
  - X and Y position the treating laser heads horizontally over the treatment area
  - Z provides the vertical control of the treating laser heads
- Two rotations:
  - θ and Φ orient the laser heads relative to the patient’s skin
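The circular and elliptical scan patterns mentioned above can be decomposed into per-axis setpoints for the X and Y translations. The following is a minimal sketch of that idea; the function name and parameters are illustrative, not part of the actual LabVIEW implementation:

```python
import math

def elliptical_path(cx, cy, rx, ry, n_points=360):
    """Generate X/Y setpoints tracing an ellipse centered at (cx, cy).

    A circle is the special case rx == ry. A real controller would
    stream setpoints like these to the X and Y stepper axes while Z,
    theta, and phi hold the laser heads at the desired height and
    orientation.
    """
    points = []
    for i in range(n_points):
        t = 2.0 * math.pi * i / n_points
        points.append((cx + rx * math.cos(t), cy + ry * math.sin(t)))
    return points

# A circular scan of radius 5 centered at the origin starts at (5.0, 0.0)
path = elliptical_path(0.0, 0.0, 5.0, 5.0)
```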
To achieve these five movements, five corresponding stepper motors are driven by command signals that the command system generates and the electrical circuits deliver to the motor drivers.
National Instruments LabVIEW directly controls four stepper motors (X, Y, θ, and Φ); a Microchip Technology PICmicro microcontroller controls the fifth motor (Z). The NI PCI-7334 motion controller uses a dual-processor architecture in which a central processing unit (CPU) and a digital signal processor (DSP) form the backbone of the motion controller.
At the motion driver software level, the PCI-7334 uses commands coded in NI LabVIEW along with configuration settings from the Measurement & Automation Explorer (MAX) as roadmaps to generate command signals to move the motors. In the MAX configuration, we use the CW/CCW pulse stepper output configuration; the first output produces CW pulses when moving clockwise, while the second output produces CCW pulses when moving counterclockwise.
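The CW/CCW output mode described above maps a signed step count onto two separate pulse lines. A minimal sketch of that mapping, with a hypothetical function name:

```python
def cw_ccw_pulses(step_delta):
    """Map a signed step count to the two-line CW/CCW stepper scheme.

    In this output mode the controller pulses one line for clockwise
    motion and a second line for counterclockwise motion; the idle
    line stays quiet. Returns (cw_pulses, ccw_pulses).
    """
    if step_delta >= 0:
        return step_delta, 0
    return 0, -step_delta

# Moving +200 steps pulses only the CW line; -50 steps only the CCW line
print(cw_ccw_pulses(200))   # (200, 0)
print(cw_ccw_pulses(-50))   # (0, 50)
```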
In the head of our robot, eight optical on/off sensors detect any object that appears within one centimeter in front of them, revealing the distance between the head and the surface directly below it.
To protect the motion system from physical damage and to detect trajectory limits, each axis uses two physical limit switches, forward and reverse. All the sensors, limit switches, and motor drivers are connected directly to the PCI-7334 through an NI UMI-7764 motion interface that enables pin-level connectivity.
The motor drivers and limit switches of the X, Y, θ, and Φ axes are connected to the four motion I/O terminal blocks of the UMI-7764. To keep all five axes moving synchronously, the first and third UMI-7764 breakpoints are connected as inputs to the microcontroller that drives the fifth (Z-axis) motor. Four of our sensors are connected to the analog input terminal block of the UMI-7764; the others are connected to the trigger/breakpoint terminal block. A joystick makes the system more user-friendly: at any time, the parameters can be modified, the system can be halted, and the position of the head can be adjusted. An SH68-C68-S cable connects the motion controller to the UMI-7764.
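The forward and reverse limit switches on each axis inhibit motion past the ends of travel. A minimal sketch of that per-axis safety logic, with illustrative names and units:

```python
def safe_step(position, step, fwd_limit, rev_limit):
    """Clamp a commanded move when a limit switch is tripped.

    fwd_limit and rev_limit model the two physical switches on each
    axis: when the forward switch is active, motion in the positive
    direction is inhibited, and likewise for reverse.
    """
    if step > 0 and fwd_limit:
        return position  # forward travel blocked by the forward switch
    if step < 0 and rev_limit:
        return position  # reverse travel blocked by the reverse switch
    return position + step

# Normal motion proceeds; a tripped switch freezes the axis
print(safe_step(100, 10, False, False))  # 110
print(safe_step(100, 10, True, False))   # 100
```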
We began developing the software by simplifying our robot configuration into 2D applications and by simulating movement using LabVIEW. Then we extrapolated the same reasoning to a 3D problem and simulated the movement following the same process adopted in the simple 2D application. The software that runs the real robot is the 3D simulation itself, transformed into a program that reads from real sensors and runs real motors.
The main task of these programs is first to read each sensor’s status (on/off) and then to define the movements of the robot head.
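The read-then-move cycle can be sketched as follows, using the eight on/off proximity sensors to hold a safe standoff along the Z axis. All names, the step size, and the back-off policy are assumptions for illustration, not the authors' actual algorithm:

```python
def head_control_step(sensor_states, z_position, z_step=1):
    """One cycle of the read-then-move logic.

    sensor_states is a list of on/off readings from the eight optical
    proximity sensors (True = object within one centimeter). If any
    sensor fires, the head retreats along Z; otherwise it descends
    toward the skin.
    """
    if any(sensor_states):
        return z_position + z_step   # too close: back the head off
    return z_position - z_step       # clear: approach the surface

# All sensors clear: descend one step; any sensor tripped: retreat
print(head_control_step([False] * 8, 10))              # 9
print(head_control_step([False, True] + [False] * 6, 10))  # 11
```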
Benefits of NI Products
Unlike text-based programming languages, LabVIEW uses graphical icons instead of lines of text to create applications, which made software development significantly easier. Furthermore, LabVIEW includes extensive libraries of multipurpose subVIs, such as the FlexMotion VIs, which we used heavily in our software.
The PCI-7334 motion controller offers the performance and determinism needed to solve the most complex motion applications, handling command execution, host synchronization, I/O reaction, and system supervision. Using LabVIEW, we get a smoother move resulting in less abrupt transitions, which both saves time and improves performance.