Virginia Tech Uses Virtual Instrumentation to Develop Autonomous Vehicles to Compete in the DARPA Grand Challenge
"Participating in the DARPA Grand Challenge gave the VTGC team a greater understanding of autonomous navigation and a solid base for further research in unmanned ground vehicles."
- Brett Leedy,
Virginia Polytechnic Institute and State University
The Challenge: Creating a fully autonomous vehicle capable of navigating complex desert terrain at high speeds to compete for $2 million in the DARPA Grand Challenge.
The Solution: Using an advanced NI PXI sensor suite and National Instruments LabVIEW to perform GPS navigation, obstacle avoidance, and road following.
Brett Leedy - Virginia Polytechnic Institute and State University
The 2005 DARPA Grand Challenge was a 132-mile autonomous ground vehicle race through the Mojave Desert. The race, sponsored by the Defense Advanced Research Projects Agency (DARPA) of the U.S. Department of Defense, was initiated in 2004 to accelerate research and development in autonomous ground vehicles that can help save lives on the battlefield. At Virginia Tech, we produced two off-road autonomous vehicles to compete for the $2 million prize. Of 195 original teams, both Virginia Tech vehicles were selected for the national qualifying event and went on to the main Grand Challenge event on October 8, 2005. Although built on two similar base vehicles, each was designed to use a unique combination of sensors and intelligence to complete the course. Three National Instruments PXI controllers provided the intelligence for each vehicle, performing vision, path-planning, and motion control during the race. A team of undergraduate students wrote the code that powered each of these vehicles in National Instruments LabVIEW.
The Virginia Tech Grand Challenge (VTGC) base vehicles were Ingersoll-Rand Club Car XRT 1500 utility vehicles. This base platform may seem like an unlikely choice for a desert race due to its diminutive size, but the XRT 1500, with its exceptional agility and an extremely small turning radius of only 11.5 feet, has proven to be a tough, capable off-road vehicle. It also provides a top speed of 25 miles per hour and a minimum ground clearance of 6.5 inches under the rear differential. The stock vehicle weight is 1,250 pounds, with a 1,000-pound payload capacity.
To enable full computer control of each of the VTGC vehicles, the throttle, brake, and steering actuation systems were converted to drive-by-wire. Each operator control was replaced with an electric motor. In place of the steering column is a one-half horsepower electric gear motor, and a hydraulic pressure actuator replaced the brake master cylinder. Another electric motor controls the throttle. All of these actuators, along with their corresponding feedback, run through a National Instruments PXI-7344 four-axis motion control board. This PXI motion control system is just one of three computers that guide the vehicle.
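The essence of drive-by-wire actuation is a closed loop: command a position, read feedback, and drive the motor to close the error. The sketch below illustrates one cycle of a simple proportional position loop of the kind a motion control board executes in hardware; the function name, gain, and effort limits are illustrative assumptions, not the team's actual LabVIEW code.

```python
# Minimal sketch of a proportional steering-position loop, assuming a
# motor effort command in percent and angle feedback in degrees.
# Gain and limits are illustrative, not the VTGC vehicle's values.

def steering_command(target_angle_deg: float,
                     measured_angle_deg: float,
                     kp: float = 2.0,
                     max_effort: float = 100.0) -> float:
    """Return a motor effort (-100..100 %) that drives the steering
    wheel toward the commanded angle."""
    error = target_angle_deg - measured_angle_deg
    effort = kp * error
    # Saturate so we never command more than the motor can deliver.
    return max(-max_effort, min(max_effort, effort))
```

In practice a motion controller runs a full PID loop with velocity and acceleration limits at kilohertz rates; the proportional term shown here is the core of that loop.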
Both VTGC vehicles used stereo vision processing to detect roads and to navigate between the given waypoints on the DARPA Grand Challenge course. A dual-lens IEEE 1394 camera acquired images of the scene in front of the vehicle at any given time. Due to the complexity of road detection and the need for rugged, real-time, high-performance computing, the team used a National Instruments PXI-8187 2.5 GHz Pentium 4-M embedded controller to read these images. NI LabVIEW Vision Development Module image acquisition tools recorded salient road features and earmarked them for stereo processing. The stereo processing algorithm compared the side-by-side images captured by the dual-lens camera to generate three-dimensional models of road points relative to the vehicle – an operation akin to depth perception in humans. Once the road was recognized and located relative to the vehicle, the center points were passed on to the vehicle’s path-planning computer.
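The stereo comparison described above reduces, at each matched feature, to triangulation: the horizontal shift (disparity) between the left and right images determines depth. The following sketch shows that core relationship for a rectified stereo pair; the focal length and baseline values are placeholder assumptions, not the specifications of the team's IEEE 1394 camera.

```python
# Hedged sketch of stereo triangulation: depth Z = f * B / d for a
# rectified stereo pair, where d is the pixel disparity between the
# left and right views. Camera parameters here are illustrative.

def depth_from_disparity(x_left_px: float, x_right_px: float,
                         focal_px: float = 700.0,
                         baseline_m: float = 0.12) -> float:
    """Return the depth in meters of a feature matched in both images."""
    disparity = x_left_px - x_right_px
    if disparity <= 0:
        # Zero or negative disparity means the match is at infinity
        # or incorrect; no finite depth can be recovered.
        raise ValueError("point at infinity or bad stereo match")
    return focal_px * baseline_m / disparity
```

Nearby road points produce large disparities and small depths; distant points produce small disparities, which is why stereo depth estimates degrade with range.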
All navigation code on the VTGC vehicles was programmed in LabVIEW and compiled to a Windows executable file for competition. The DEZ algorithm blended the three main behaviors (obstacle avoidance, waypoint navigation, and road following) in a hierarchical decision structure in which one behavior took priority over the others. Each decision was based on the current vehicle sensor state. As the vehicle maneuvered the course, it always attempted to follow a given path – in the case of the DARPA Grand Challenge, a series of global waypoints set at the beginning of the competition. In the simplest scenario, where the vehicle did not “see” any roads or obstacles, the navigation software simply drove toward the next waypoint. If the vision system recognized a road that went in the same direction as the next waypoint, then the vehicle followed that road. If an obstacle was detected in the vehicle’s dynamic “avoidance zone,” then all other behaviors were ignored and the vehicle steered to clear the obstacle. This behavior pattern ensured that the vehicle would not collide with any detectable objects.
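The hierarchical arbitration just described can be sketched as a single decision per control cycle: obstacle avoidance preempts road following, which preempts waypoint seeking. The sensor-state dictionary, heading fields, and the 30-degree road-agreement threshold below are all illustrative assumptions, not the team's DEZ implementation.

```python
# Hedged sketch of prioritized behavior arbitration: the highest-priority
# behavior whose trigger condition is met wins each cycle. Field names
# and thresholds are illustrative assumptions.

def choose_heading(state: dict) -> tuple[str, float]:
    """Pick one behavior and a commanded heading from the sensor state."""
    # Priority 1: an obstacle in the avoidance zone overrides everything.
    if state.get("obstacle_in_avoidance_zone"):
        return ("avoid_obstacle", state["clear_heading_deg"])
    road = state.get("road_heading_deg")   # None if no road detected
    wp = state["waypoint_heading_deg"]
    # Priority 2: follow a detected road, but only if it roughly agrees
    # with the direction of the next waypoint.
    if road is not None and abs(road - wp) < 30.0:
        return ("follow_road", road)
    # Priority 3: default behavior, drive toward the next waypoint.
    return ("seek_waypoint", wp)
```

A strict priority chain like this is simple to verify, which matters when the failure mode of a wrong decision is a collision; the trade-off is that behaviors cannot blend smoothly at the boundaries between priorities.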
At the DARPA Grand Challenge qualifying and main events, the VTGC vehicles demonstrated the capability to navigate the course. However, the vehicles were unable to finish the 132-mile course at the main event – not because of any navigational shortcoming, but because both vehicles suffered mechanical failures of their internal combustion engines. Had the base platforms not failed, the VTGC team is confident that the sensors and navigation systems, supported primarily by National Instruments products, would have allowed both vehicles to finish the race in just under the 10-hour time limit. Participating in the DARPA Grand Challenge gave the VTGC team a greater understanding of autonomous navigation and a solid base for further research in unmanned ground vehicles.