Designing a Robotic Device to Study Flying Insects Using LabVIEW and CompactRIO


"With the help of a CompactRIO controller and LabVIEW, we investigated how flying insects achieve such outstanding flight control."

- Chauncey Graetzel, Optotune AG

The Challenge:
Developing a flexible, high-bandwidth robotic device to measure and simulate flight patterns in winged insects.

The Solution:
Using NI LabVIEW software and NI CompactRIO hardware to build a fast, modular, and easy-to-use biorobotic platform involving a wide range of industrial protocols and real-time, closed-loop stimulus generation.

Author(s):
Chauncey Graetzel - Optotune AG
Vasco Medici - ETH Zürich
Nicola Rohrseitz - ViSSee Sagl
Daniel A. Schwyn - Imperial College London
Chris Rogers - Tufts University
Holger G. Krapp - Imperial College London
Bradley J. Nelson - ETH Zürich
Steven N. Fry - ETH Zürich

Flies’ ability to land precisely on the rim of a plate or to chase each other at high speed shows how much there is to learn from their maneuverability. Flies serve as model systems for neural information processing, aerodynamics, and genetics, and they employ rapid and precise biological sensors, controllers, and actuators. Such capabilities make them interesting but difficult to study: measurement and stimulation devices must offer high bandwidth, low lag, and flexible interfaces. At the same time, ease of use and modularity are critical for interdisciplinary and collaborative research.

Using a CompactRIO controller and LabVIEW graphical system design software, we investigated how flying insects achieve outstanding flight control. We used digital I/O modules to interface an LED-based visual stimulus arena at temporal and spatial resolutions that allow us to efficiently stimulate the fly’s visual system. Recording the insect’s response requires a fast and flexible acquisition system. LabVIEW provides the speed and modularity needed to record these signals and the ability to use them as real-time feedback for stimulus generation. As a result, we can study the fly as a living sensor embedded in a technical system.

We developed an experiment in which the behavior of a tethered fruit fly steers an e-puck robot, a miniature mobile robot designed for use in university-based research projects, through an obstacle-filled environment [2]. Feedback from cameras and proximity sensors on the robot determines the visual stimulus shown to the fly, and flight parameters such as wing beat frequency and amplitude control the robot’s movements (Figure 1). The transfer functions between fly and robot can be varied, enabling a range of experimental paradigms.
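The coupling can be thought of as two pluggable transfer functions, one from fly to robot and one from robot to fly. The minimal sketch below is written in Python purely for illustration (the actual system is implemented in LabVIEW across a PC, a real-time controller, and an FPGA); all signal names and gains are hypothetical.

    def fly_to_robot(wingbeat):
        # Map the fly's behavioral state to wheel speeds (illustrative gains only):
        # the left/right amplitude difference turns the robot, the mean amplitude
        # sets the forward speed.
        turn = 5.0 * (wingbeat["amp_left"] - wingbeat["amp_right"])
        forward = 2.0 * (wingbeat["amp_left"] + wingbeat["amp_right"]) / 2.0
        return forward - turn, forward + turn          # left wheel, right wheel

    def robot_to_fly(cameras, proximity):
        # Map robot sensor data to a stimulus descriptor (stand-in for a real frame).
        closest = proximity.index(max(proximity))      # sensor facing the obstacle
        return {"grating_offset": closest}

    # One iteration of the closed loop (all I/O omitted):
    wingbeat = {"freq": 200.0, "amp_left": 60.0, "amp_right": 55.0}
    left_speed, right_speed = fly_to_robot(wingbeat)
    frame = robot_to_fly(cameras=[[0] * 102] * 3,
                         proximity=[0, 3, 9, 1, 0, 0, 0, 0])

Swapping either function changes the experimental paradigm without touching the rest of the loop.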

High-Speed Cinema for Flies: Accelerating an LED Arena

The visual stimulus arena is built from green LED panels, each an 8 by 8 pixel matrix, that can be addressed via the I2C protocol using a custom-made controller [1]. In the original setup, all panels of the flight simulator are controlled over a single bus line. To achieve higher frame rates using several parallel bus lines, and to adjust the visual stimulus according to feedback from the fly, we replaced the original controller with an NI cRIO-9014 real-time controller and the integrated NI cRIO-9104 reconfigurable embedded chassis.
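As a rough illustration of why parallel bus lines help, the sketch below estimates the achievable frame rate as a function of the number of buses. The panel count follows from the 12 buses with five panels each described later; the bytes-per-panel figure and I2C clock rate are hypothetical placeholders, not measured values.

    def max_frame_rate(n_buses, n_panels=60, bytes_per_panel=65, i2c_clock_hz=400000):
        # Estimate the frame rate when the panels are split evenly across buses.
        panels_per_bus = n_panels / n_buses
        bits_per_panel = bytes_per_panel * 9           # 8 data bits plus ACK per byte
        seconds_per_frame = panels_per_bus * bits_per_panel / i2c_clock_hz
        return 1.0 / seconds_per_frame

    print(max_frame_rate(n_buses=1))    # single shared bus
    print(max_frame_rate(n_buses=12))   # 12 parallel buses: roughly 12x faster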

Fly Cyborg: From Fly to Robot

In the experimental setup (Figure 2), a fruit fly is tethered in the center of a circular array of LED panels. Although the insect cannot move in space, it can flap its wings and behave in much the same way it would during free flight. A digital wing beat analyzer acquires the current frequency, amplitude, mean position, and phase of the fly’s wing beats. This behavioral state vector is transmitted via User Datagram Protocol (UDP) packets to a host computer running LabVIEW, where we apply a user-defined transfer function to calculate new wheel speeds for the e-puck robot. These values are sent to the robot via Bluetooth.
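A host-side sketch of this data path is shown below, again in Python rather than LabVIEW. The UDP packet layout, port number, serial device, gains, and motor command syntax are all assumptions made for illustration and depend on the wing beat analyzer and the firmware running on the robot.

    import socket
    import struct

    import serial                                     # pyserial, standing in for the Bluetooth link

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", 5005))                      # hypothetical port
    bt = serial.Serial("/dev/rfcomm0", 115200, timeout=1)   # hypothetical device name

    while True:
        data, _ = sock.recvfrom(1024)
        # Assume four little-endian floats: frequency, amplitude, mean position, phase.
        freq, amp, mean_pos, phase = struct.unpack("<4f", data[:16])

        # User-defined transfer function (illustrative gains).
        turn = 4.0 * mean_pos
        forward = 0.5 * amp
        left, right = int(forward - turn), int(forward + turn)

        # e-puck-style ASCII motor command; the exact syntax depends on the robot firmware.
        bt.write("D,{0},{1}\r".format(left, right).encode())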

From Robot to Fly

While we use the insect’s behavior to steer the robot, feedback from the robotic device modifies the visual patterns shown to the insect. This feedback is provided by three linear cameras and eight proximity sensors mounted on top of the robot. The 102 pixels of each camera are sampled at 10 Hz, and the scalar outputs of the proximity sensors are read out at twice that rate. The host computer receives these signals via Bluetooth and applies a second user-defined transfer function to generate the next frame to be displayed on the LED arena.
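The sketch below illustrates what such a second transfer function could look like: a binary grating whose horizontal offset tracks the robot’s nearest obstacle. The frame geometry and the mapping itself are assumptions for illustration; the deployed version runs in LabVIEW on the host.

    def next_frame(cameras, proximity, phase, width=96, height=8):
        # The proximity sensor with the largest reading faces the closest obstacle;
        # shift the grating toward that direction (made-up gain of 2 pixels per sector).
        sector = proximity.index(max(proximity))              # 0..7 around the robot
        offset = (phase + 2 * sector) % width
        # Binary grating with a period of 16 pixels (blocks of 8 on, 8 off).
        return [[1 if ((x + offset) // 8) % 2 == 0 else 0 for x in range(width)]
                for _ in range(height)]

    frame = next_frame(cameras=[[0] * 102] * 3,               # three 102-pixel line cameras
                       proximity=[10, 80, 30, 5, 0, 0, 0, 0], # eight proximity readings
                       phase=0)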

Via Ethernet, the host application sends this new pattern to the real-time controller. There, it is divided into eight-by-eight pixel blocks, each corresponding to one LED panel, and translated into I2C commands. To achieve maximum throughput, these commands are transferred to the field-programmable gate array (FPGA) through a direct memory access (DMA) first-in, first-out (FIFO) queue. Interrupt vectors synchronize command generation on the real-time controller with low-level hardware communication on the FPGA, which implements the I2C protocol on 12 bus lines, each communicating with five panels. Thus, the environment viewed by the robot determines the visual stimulus for the fly, and the fly’s response alters the robot’s path.
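The block split and bus assignment can be sketched as follows. The per-panel byte layout is a placeholder, since the real encoding is defined by the panel firmware and implemented in the FPGA; only the division into 8-by-8 blocks and the 12-by-5 bus mapping are taken from the description above.

    def split_into_panels(frame, panel=8):
        # Yield one 8-by-8 block per LED panel, scanning the frame row by row.
        rows, cols = len(frame), len(frame[0])
        for r in range(0, rows, panel):
            for c in range(0, cols, panel):
                yield [line[c:c + panel] for line in frame[r:r + panel]]

    def pack_panel(block):
        # Pack an 8-by-8 binary block into 8 bytes, one byte per row (assumed format).
        return bytes(sum(bit << (7 - i) for i, bit in enumerate(row)) for row in block)

    # A 40-by-96 pixel test frame corresponds to 5 x 12 = 60 panels.
    frame = [[(x + y) % 2 for x in range(96)] for y in range(40)]

    # Distribute the panels round-robin over 12 bus lines, five panels per line
    # (the actual panel-to-bus mapping here is a hypothetical choice).
    buses = [[] for _ in range(12)]
    for idx, block in enumerate(split_into_panels(frame)):
        buses[idx % 12].append(pack_panel(block))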

Depending on the pattern’s depth and vertical symmetry, the frame rate of the visual stimulus ranges from 30 to 400 Hz. The cumulative latency in the control loop is below 50 ms and is caused mainly by the Bluetooth transfer of sensory information from the robot to the host machine.

Efficient Design: Flexible Interfaces and Modular Structure

Using LabVIEW and CompactRIO, we can interface a variety of research tools using a range of protocols. This flexibility and the many examples provided by NI and the LabVIEW user community make application design based on LabVIEW an efficient alternative to custom-made controllers in experimental biology.

We designed a user-friendly GUI that gives experimenters the control and information they need while abstracting away the complexity of the code hosted on multiple hardware platforms (Figure 3). This is a great benefit in an interdisciplinary environment in which biologists, mathematicians, physicists, and engineers collaborate closely. In addition, the modularity and portability of LabVIEW code allow it to be reused and shared among laboratories. For example, in a customized version of the solution, patterns can be pregenerated and saved on a USB pen drive, loaded into the RAM of the real-time controller, and streamed to the panels, which permits even higher refresh rates.
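As a sketch of the pregeneration idea, the snippet below renders a drifting grating offline and writes it to a flat binary file that could live on a USB pen drive and later be streamed from the controller’s RAM. The file layout and frame dimensions are assumptions for illustration.

    import struct

    WIDTH, HEIGHT, N_FRAMES = 96, 8, 400                # hypothetical pattern dimensions

    with open("pattern.bin", "wb") as f:
        f.write(struct.pack("<3I", WIDTH, HEIGHT, N_FRAMES))   # small header
        for t in range(N_FRAMES):
            # Drifting binary grating, stored as one byte per pixel for simplicity.
            f.write(bytes(1 if ((x + t) // 8) % 2 == 0 else 0
                          for _ in range(HEIGHT) for x in range(WIDTH)))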

A Hybrid, Adaptive Controller

Because of the high degree of plasticity in parts of the fly’s neural circuitry, the insect itself can be viewed as an adaptive controller. Using the new biorobotic platform, we evaluated the performance of this controller for a variety of exotic transfer functions that create conditions quite unlike the fly’s natural environment, such as gratings moving up or down depending on the position of the robot’s closest obstacle. Surprisingly, the most intuitive transfer functions do not necessarily lead to the best results.
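Written out as a sketch, one such coupling could look like the function below, which makes the grating drift up or down depending on where the robot’s closest obstacle lies. The sector convention and gain are arbitrary choices for illustration.

    def grating_velocity(proximity, gain=2):
        # Return a vertical shift (pixels per frame) for the displayed grating.
        sector = proximity.index(max(proximity))        # sensor facing the closest obstacle
        # Obstacle in the front half of the robot -> grating drifts upward,
        # obstacle in the rear half -> downward (arbitrary convention).
        return gain if sector in (0, 1, 6, 7) else -gain

    print(grating_velocity([5, 0, 0, 90, 0, 0, 0, 0]))  # obstacle behind the robot: -2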

LabVIEW and CompactRIO provided an ideal solution for building a control loop that incorporates a living insect and allows us to perform a variety of experiments. CompactRIO acquires and generates signals for a multitude of industry standards and extends custom-made research tools. In addition, we achieved major efficiency gains with the ability to distribute our application between a PC, the real-time controller, and the FPGA without having to learn several programming and design languages. The range of available add-on products and interfaces also offered great potential for future extensions and adaptations.

Acknowledgements 

We thank Vasco Medici, Nicola Rohrseitz, and Gilles Caprari for helping develop the robot controller. We also thank Jean-Christophe Zufferey and Dario Floreano for providing the e-puck robots, Jan Bartussek for helping run experiments, and Mathias Moser for helping with the flight arena.

References

[1] Reiser MB, Dickinson MH. A modular display system for insect behavioral neuroscience. J Neurosci Methods 2008;167:127–139.

[2] Graetzel CF, Medici V, Rohrseitz N, Nelson BJ, Fry SN. The Cyborg Fly: A biorobotic platform to investigate dynamic coupling effects between a fruit fly and a robot. IROS 2008 Sept;14-19.

Author Information:
Chauncey Graetzel
Optotune AG
chauncey.graetzel@optotune.com
