Teleoperated Rescue Robots: Using Haptic Feedback to Save Lives


"The University of Washington and BluHaptics worked closely with NI to develop the telerobotic portion of the Smart Emergency Response System for the SmartAmerica Challenge. NI’s real-time hardware, along with built-in robotics VIs, gave us an ideal platform for quickly prototyping the complex telerobotic control of the robotic system’s end effectors. NI technology also provides an ideal layer for cyber-physical system hardware and software implementation."

- Howard Jay Chizeck, University of Washington Department of Electrical Engineering

The Challenge:
Developing a precisely controlled and monitored teleoperated mobile robot that can turn a gas valve, simulating a disaster-response task, such as shutting off a gas leak caused by an earthquake, and motivating the use of teleoperated robotic devices in disaster response.

The Solution:
Using the NI cRIO-9024 real-time controller, NI LabVIEW Robotics Module, and youBot API to achieve quick, effective, and precise control of the KUKA youBot while simultaneously providing the operator with real-time robot status and incorporating haptic virtual fixtures into the operator interface to enhance operator performance.

Howard Jay Chizeck - University of Washington Department of Electrical Engineering
Kevin Huang - University of Washington Department of Electrical Engineering

The BioRobotics Lab in the University of Washington’s Department of Electrical Engineering works in the field of teleoperated robotic devices and explores the utility of different modes and types of teleoperation across various applications. One such application is search and rescue, in which a telerobotic surrogate can take the place of a human responder and remove the risk to human life. Full autonomy, however, is often insufficient for dexterous and challenging robotic tasks. Teleoperation augments the robot’s control with human experience and expertise, combining human awareness and semantic understanding with the precision, scalability, and repeatability of machines.

The fundamental architecture of teleoperation includes two high-level components: the operator space and the remote environment. In the operator space, a human user manipulates a master robotic device. Commands from the operator space, typically derived from the configuration of the master device, are sent to the remote device, which interacts with the remote environment. Information about the remote environment is, in turn, displayed to the operator. In life-threatening and delicate situations such as disaster response, it is imperative to provide the best possible operator interface for controlling the remote device to optimize safety and operator performance. This encompasses providing sensor and robot state information as well as using feedback and assistive channels to improve performance. The particular mode of teleoperation of interest here is bilateral teleoperation with haptic feedback, in which an operator provides commands to the remote robot and receives haptic feedback through the master device (Figure 1). The haptic feedback relayed to the operator is based on the remote device’s status or configuration.
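
As a rough illustration, one cycle of this bilateral loop can be sketched in a few lines of Python. The spring-like feedback law and gain are illustrative assumptions, not the system’s actual controller:

```python
# Minimal sketch of one bilateral teleoperation cycle. The feedback law
# below (a spring force on the tracking error) is an illustrative
# assumption, not the actual control law used in the system.

def teleop_cycle(master_pos, remote_pos, stiffness=50.0):
    """One cycle: the command is the master device's position; the haptic
    feedback is a spring force proportional to the master/remote error."""
    command = master_pos                                  # operator-space command
    error = [m - r for m, r in zip(master_pos, remote_pos)]
    feedback_force = [-stiffness * e for e in error]      # felt by the operator
    return command, feedback_force

# The operator feels a force pulling the master toward the lagging remote arm.
cmd, force = teleop_cycle([0.10, 0.00, 0.05], [0.08, 0.00, 0.05])
```

The force resists operator motions that outrun the remote device, which is one common way a remote configuration is reflected back through the master device.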

The Remote Device

We used the KUKA youBot and CompactRIO as the major hardware platforms on the remote device. The youBot hardware setup includes an omnidirectional base and a five degree of freedom (DOF) robotic manipulator with a gripper. To obtain information about the remote environment, we mounted a commercial PrimeSense Carmine RGB-Depth (RGB-D) camera on the youBot base using an aluminum framing structure. This device acts as the primary sensor and provides geometric information of a volume encompassing the desired task space. We used an Asus AC router to achieve wireless communication between the remote youBot and the local master console station.

Operator Master Console

At the master console station, the user receives visual feedback in the form of a voxelized 3D map of the remote environment based on the RGB-D depth information (Figure 2). The raw RGB stream is also available. During operation, it is critical to log robot configuration and state. In this implementation, we use the LabVIEW Robotics Module, which records and visually presents the youBot configuration and motor encoder readings to the user in real time.
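
The voxelization step itself is conceptually simple. A hedged Python sketch, assuming a metric 3D point cloud already extracted from the depth stream (not the actual LabVIEW implementation), might bin points like this:

```python
# Illustrative sketch of binning RGB-D depth points into an occupancy
# voxel map for operator display. Not the actual LabVIEW implementation.

def voxelize(points, voxel_size=0.05):
    """Map each 3D point (meters) to an occupied voxel index; nearby points
    collapse into the same voxel, giving a compact view of the scene."""
    occupied = set()
    for x, y, z in points:
        occupied.add((int(x // voxel_size),
                      int(y // voxel_size),
                      int(z // voxel_size)))
    return occupied

cloud = [(0.51, 0.02, 1.22), (0.52, 0.03, 1.23), (0.91, 0.12, 1.52)]
voxels = voxelize(cloud)   # first two points share one voxel -> 2 voxels
```

Rendering each occupied voxel as a small cube yields the kind of 3D map the operator sees, at a fraction of the bandwidth of the raw point cloud.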

To control the robot, the operator uses a Logitech USB joystick game pad to send navigation commands to the CompactRIO system, which drives the youBot’s omnidirectional base. The teleoperator commands the youBot manipulator through a master haptic device, the SensAble Technologies PHANTOM Omni. This device sends three-DOF commands to the CompactRIO to control the location of the youBot’s end effector and provides three-DOF kinesthetic force feedback to the user.

A guidance haptic virtual fixture determines the force feedback presented to the user. This virtual fixture pushes and pulls the operator’s hand to help follow a predefined path; in this case, the path comprises a 90° circular arc with entry and exit locations (Figure 3). The robot precisely follows the user’s commands, so the user must provide an effective trajectory for the robot to successfully complete the valve turn. The guidance fixture encourages the user to maintain such a trajectory: once the user commands the robot into the virtual fixture, force feedback is applied whenever the commanded position begins to deviate from the desired path. This ensures the user-provided trajectory can complete the valve turn. The University of Washington and BluHaptics, Inc. developed this virtual fixture technology.
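
A minimal 2D sketch of such a guidance fixture, assuming a quarter-circle path and a simple proportional pull toward it (an illustration only, not the UW/BluHaptics implementation), could look like:

```python
import math

# Hedged sketch of a guidance virtual fixture: find the closest point on a
# 90-degree arc and apply a spring force pulling the operator's hand toward
# it. The arc parameters and gain are assumptions for illustration.

def arc_fixture_force(pos, center=(0.0, 0.0), radius=0.1, k=200.0):
    """2D guidance force toward a quarter-circle arc spanning 0..90 degrees."""
    dx, dy = pos[0] - center[0], pos[1] - center[1]
    angle = math.atan2(dy, dx)
    angle = min(max(angle, 0.0), math.pi / 2)   # clamp to the arc's span
    # Closest point on the arc to the current hand position
    target = (center[0] + radius * math.cos(angle),
              center[1] + radius * math.sin(angle))
    return (k * (target[0] - pos[0]), k * (target[1] - pos[1]))

# A hand position slightly outside the arc radius is pulled back inward.
fx, fy = arc_fixture_force((0.12, 0.0))
```

The force grows with the deviation from the path, so small drifts produce a gentle nudge while larger departures are resisted more firmly.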

NI Technologies

Prior to using the CompactRIO real-time controller, we attempted to control the youBot using the youBot ROS package installed on the youBot’s onboard computer. This, along with polling, compressing, and sending information from the RGB-D camera, proved slow and difficult to customize and debug. Moreover, we had to handle inverse kinematics, motor joint readings, and the visualization model of the youBot separately at a low level and tune them heuristically. Occasionally, the manipulator would jitter before eventually converging to the desired configuration, and diagnosing the problem was difficult without logging and graphical representation of the sensor/encoder values.

After experimenting with the NI youBot example project with support from NI engineers, we realized that using the CompactRIO controller would solve many of our problems. The interface to EtherCAT and youBot control and simulation VIs presented a flexible, real-time solution for controlling the robot and monitoring its status. Setting up the control hardware was seamless, and we achieved interfacing with the existing master console software with simple, custom UDP packages. Furthermore, the user interface improved with the graphical presentation of sensor readings and simulation of the robot configuration, which we implemented through network-published shared variables. These NI hardware and software solutions accelerated the design process by providing the stability and low-level control the team desired, while also affording the flexibility to expand and customize with other components at a system level.
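
The article does not specify the UDP packet layout. Assuming, purely for illustration, a datagram of three big-endian float64 end-effector coordinates and a hypothetical controller address, the exchange might look like:

```python
import socket
import struct

# Illustrative sketch of the "simple, custom UDP packages" mentioned above.
# The packet layout (three big-endian float64 coordinates), port, and
# address are assumptions, not the system's actual protocol.

CRIO_ADDR = ("192.168.1.10", 5005)   # hypothetical CompactRIO address

def send_end_effector_command(sock, x, y, z):
    """Pack a 3-DOF position command into a fixed-layout UDP datagram."""
    payload = struct.pack(">3d", x, y, z)   # big-endian, 24 bytes
    sock.sendto(payload, CRIO_ADDR)

def parse_command(payload):
    """Unpack the same fixed layout on the receiving side."""
    return struct.unpack(">3d", payload)

# Round-trip check of the packet layout (no network needed):
packet = struct.pack(">3d", 0.25, 0.0, 0.1)
decoded = parse_command(packet)
```

Fixed-layout binary datagrams like this are easy to produce and consume on both a desktop console and a real-time target, which is one reason simple UDP framing works well for interfacing heterogeneous systems.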

NI technology:

  • Provided reliable, real-time control of the youBot manipulator, which eliminated jitter
  • Offered a display with readily available real-time graphical representation of robot joint state, which is useful for debugging
  • Simulated robot configuration for visualization and monitoring purposes
  • Enabled seamless, automated start-up
  • Expedited the design and construction of a unified teleoperated robotic system

With NI technology, we developed the teleoperation platform with haptic feedback as an important part of the Smart Emergency Response System at the 2014 SmartAmerica Challenge, an endeavor to highlight and showcase cyber-physical systems. We presented this work in Washington, DC, to promote haptic virtual fixtures and teleoperated robots as a means to enhance the efficacy and safety of disaster response and save more human lives.

Author Information:
Kevin Huang
Howard Jay Chizeck
University of Washington Department of Electrical Engineering
