Development of a Fully Autonomous Humanoid Robot for Novel Locomotion Research and as the First U.S. Humanoid Entry to RoboCup
"With the LabVIEW Joystick VI it was very simple to implement this concept, which allows for easy debugging of gaits and motion without simultaneously debugging the robot’s behavior."
- Karl Muecke,
Virginia Tech
The Challenge: Developing a robotic system to act as a research platform for studies on novel locomotion and to serve as the first United States entry into RoboCup, a soccer competition for autonomous robots.
The Solution: Using NI LabVIEW to interface with third-party hardware to speed up development and testing for novel locomotion studies, as well as the LabVIEW Real-Time Module and NI vision software to create artificial intelligence that enables the robot to perform high-level functions, such as playing soccer.
Karl Muecke - Virginia Tech
The Dynamic Anthropomorphic Robot with Intelligence (DARwIn), a humanoid robot, is a hardware platform used for studying bipedal gaits at the Robotics and Mechanisms Lab (RoMeLa) at Virginia Tech. At RoMeLa, we test ideas and theories for locomotion research on hardware. We decided to use the RoboCup international soccer games as a competitive and realistic arena to test and demonstrate the robustness of DARwIn.
Current technologies for robot programming and control are usually written in C, have a steep learning curve, and are difficult to interface with changing hardware. At RoMeLa, we used NI technology to accelerate our novel robotic locomotion development and research. The result is a fully autonomous humanoid robot that plays soccer and serves as a research platform for studying novel locomotion. We used the LabVIEW graphical development platform not only to create expandable, adaptable software that eases and accelerates research, but also to act as the brain that enables the robot to perform higher-level tasks, such as playing competitive soccer.
LabVIEW as an Expandable Hardware Interface
Because there are many different types of robotic platforms at RoMeLa, we needed a system that we could easily configure for a variety of hardware setups. Most small-scale robot research uses personal digital assistants (PDAs) to autonomously control the robots. By using the LabVIEW Real-Time Module on a PC/104-Plus computer, we had virtually no overhead and an expandable computer architecture that, unlike a PDA, simultaneously accommodated a range of different sensors: IEEE 1394 cameras, RS485 communication, multiple wireless networks, and more. In C or C++, adding a new camera or an 802.11 port would have meant adapting drivers and writing new interface code by hand; in LabVIEW, we simply dropped in a VI that handled everything.
Currently, we use LabVIEW to control the robot’s motion over RS485 and to read joint positions back over the same serial network from the servo motors’ built-in potentiometers. While the robot walks, a rate gyro streams acceleration and orientation data to LabVIEW over an RS232 serial connection, and the program modifies the walking gait in real time to keep the robot balanced.
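The gyro-driven gait correction described above can be sketched in a few lines. This is an illustrative, hedged example, not RoMeLa's actual controller: the function names, the proportional structure, and the gain value are all assumptions, and the real system would read the gyro over RS232 and command the servos over RS485 rather than operate on plain numbers.

```python
# Illustrative sketch (not DARwIn's actual code): a proportional balance
# correction of the kind described above. Gyro pitch/roll readings nudge
# the commanded ankle joint angles on each control tick.

def balance_correction(gait_angles, gyro_pitch, gyro_roll, gain=0.5):
    """Return adjusted (ankle_pitch, ankle_roll) joint targets.

    gait_angles: (ankle_pitch, ankle_roll) from the nominal gait, in degrees.
    gyro_pitch, gyro_roll: body tilt measured by the rate gyro, in degrees.
    gain: hypothetical proportional gain; a real controller would be tuned
    on hardware and would likely add rate damping as well.
    """
    ankle_pitch, ankle_roll = gait_angles
    # Lean the ankles against the measured tilt to push the body upright.
    return (ankle_pitch - gain * gyro_pitch,
            ankle_roll - gain * gyro_roll)
```

For example, with a nominal ankle pitch of 10 degrees and a measured forward tilt of 4 degrees, the correction pulls the target back to 8 degrees.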
Initially, interfacing with the servo motors and rate gyro was all we needed for the robotic research platform. However, taking part in the July 2007 RoboCup competition required more sophisticated hardware and programming. In addition to the programs that enabled the robot to walk and balance itself, we added software for vision, behavior, and communication – the robot’s eyes and brain. Because the robot had to be completely autonomous and untethered during the competition, we used a Web host to deliver the robot’s stop/start signals. The LabVIEW Real-Time Module runs all of the software onboard the robot, which reduces overhead and frees up CPU time.
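The Web-hosted stop/start control described above amounts to a polling loop around the robot's behavior code. The sketch below is hypothetical: the real system polls a Web host from LabVIEW, so here a pluggable `fetch_signal` callable stands in for the HTTP request, which lets the control logic be shown without a network.

```python
# Hypothetical sketch of the Web-hosted start/stop control described in
# the text. A pluggable fetch function stands in for the HTTP poll.

def run_match_loop(fetch_signal, step, max_ticks=1000):
    """Poll fetch_signal() each tick; run step() only while it reads 'start'.

    fetch_signal: callable returning 'start' or 'stop' (in the real setup,
    an HTTP request against the referee's Web host).
    step: one iteration of the robot's behavior loop.
    Returns the number of behavior ticks executed before stopping.
    """
    ticks = 0
    for _ in range(max_ticks):
        if fetch_signal() == "stop":
            break                 # referee halted play; freeze the robot
        step()
        ticks += 1
    return ticks
```

Decoupling the signal source from the loop in this way also makes the stop/start logic testable off the robot, with a scripted sequence of signals in place of the Web host.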
Verifying Locomotion Research
When generating mathematical formulations for robot locomotion, it can be difficult to visualize the results. We used LabVIEW not only to easily deploy gaits generated in other computing software packages, such as Wolfram Mathematica or Microsoft Excel, but also to aid our locomotion research by creating a graphic visualization of the motion of the robot. With the LabVIEW 3D picture control, we simulated what the robot looks like when performing a generated gait. This saved development and research time because setting up and testing gaits on physical hardware can be laborious.
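The offline gait pipeline described above can be illustrated as sampling a closed-form joint trajectory into a table that a visualizer or joint controller can play back. The sinusoid, parameter names, and values below are assumptions for illustration, not DARwIn's actual gait formulation.

```python
# Illustrative sketch: sample one joint's trajectory from a closed-form
# gait expression (as one might generate in Mathematica or Excel) into a
# (time, angle) table for visualization or playback on hardware.
import math

def sample_gait(amplitude_deg, period_s, phase, dt=0.02, duration_s=1.0):
    """Return [(t, angle_deg), ...] for one joint of a cyclic gait.

    amplitude_deg, period_s, phase: hypothetical gait parameters.
    dt: sample interval, matching the joint controller's update rate.
    """
    n = int(round(duration_s / dt))
    samples = []
    for i in range(n):
        t = i * dt
        angle = amplitude_deg * math.sin(2 * math.pi * t / period_s + phase)
        samples.append((t, angle))
    return samples
```

Sampling at the controller's update rate means the same table drives both the 3D visualization and, once it looks right, the physical servos.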
When we were ready to test robot locomotion on the physical hardware, we could bypass the robot’s artificial intelligence and emulate it with a user-controlled joystick. The user acted as the robot’s eyes and brain, and the joystick served as an interface for sending motion commands such as walk, kick, and dive. With the LabVIEW Joystick VI, we implemented this concept and could debug gaits and motion without simultaneously debugging the robot’s behavior.
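The joystick bypass amounts to mapping controller input onto the same high-level motion commands the AI would issue. The sketch below is a hedged illustration of that idea; the button assignments, command names, and deadzone value are hypothetical, and the real implementation reads the controller through the LabVIEW Joystick VI.

```python
# Hypothetical sketch of the joystick bypass: buttons and stick
# deflection map to the same motion commands the AI layer would send.

BUTTON_COMMANDS = {
    0: "kick",        # button indices and command names are assumptions
    1: "dive_left",
    2: "dive_right",
    3: "get_up",
}

def joystick_to_command(buttons, x_axis, y_axis, deadzone=0.2):
    """Translate one joystick poll into a motion command string."""
    for index, command in BUTTON_COMMANDS.items():
        if buttons[index]:
            return command        # discrete motions take priority
    if abs(x_axis) > deadzone or abs(y_axis) > deadzone:
        return "walk"             # stick deflection means walk
    return "stand"
```

Because the gait layer only ever sees command strings, it cannot tell whether they came from the joystick or from the behavior code, which is what makes the gaits debuggable in isolation.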
Creating Artificial Intelligence
With no prior exposure to vision processing of any kind, one graduate student configured two IEEE 1394 cameras and, in only two hours, wrote a VI that identified the orange RoboCup soccer ball and located its position relative to the robot. Other universities tackling the same task have spent years with teams of students writing code; with our development efficiency, one student needed just one week to create the soccer-playing behavior control that qualified DARwIn as the first and only U.S. humanoid robot entry in RoboCup.
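The ball-finding step described above is, at its core, color thresholding followed by a centroid computation. The sketch below illustrates that idea in pure Python on a small RGB pixel grid; the actual system used NI vision software on IEEE 1394 camera frames, and the RGB thresholds here are hypothetical, not calibrated values.

```python
# Illustrative sketch of orange-ball detection: threshold pixels that
# look orange, then take the centroid of the matches as the ball's
# position in the image.

def find_orange_ball(image):
    """image: 2D list of (r, g, b) tuples, values 0-255.

    Return the (row, col) centroid of orange pixels, or None if no
    pixel matches. Threshold values are hypothetical.
    """
    def is_orange(r, g, b):
        return r > 180 and 60 < g < 160 and b < 90

    row_sum = col_sum = count = 0
    for y, row in enumerate(image):
        for x, (r, g, b) in enumerate(row):
            if is_orange(r, g, b):
                row_sum += y
                col_sum += x
                count += 1
    if count == 0:
        return None
    return (row_sum / count, col_sum / count)
```

In the full pipeline, the image-space centroid from both cameras would then be converted into a physical position relative to the robot for the behavior code to act on.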