Team Victor Tango’s Odin: Autonomous Driving Using NI LabVIEW in the DARPA Urban Challenge
Odin drives autonomously in the DARPA Urban Challenge under the control of software based on LabVIEW.
"The intuitive graphical interface of LabVIEW and ready-to-go drivers for sensors such as LIDAR made it easy for a team of undergraduate mechanical engineering students to quickly and efficiently create custom embedded software."
- Patrick Currier,
Virginia Polytechnic Institute and State University
The Challenge:
Developing an autonomous vehicle to complete the Defense Advanced Research Projects Agency (DARPA) Urban Challenge, an autonomous ground vehicle race through an urban environment.

The Solution:
Using the NI LabVIEW graphical programming environment and National Instruments hardware to enable rapid development, testing, and prototyping to successfully complete the challenge (placing third out of 89 competitors and winning a $500,000 USD prize).
Author(s):
Patrick Currier - Virginia Polytechnic Institute and State University
Jesse Hurdus - TORC Technologies, LLC, Virginia Polytechnic Institute and State University
Dr. Charles Reinholtz - Embry-Riddle Aeronautical University
Dr. Al Wicks - Virginia Polytechnic Institute and State University
The DARPA Urban Challenge required a ground vehicle to autonomously navigate through an urban environment. To complete the course, our fully autonomous vehicle had to traverse 60 miles in less than six hours, while navigating traffic through roads, intersections, and parking lots. At the start of the race, a mission file specified checkpoints on a road network map to be visited in a specific order.
To reach the checkpoints as fast as possible, the vehicle had to choose roads by considering speed limits, possible road blockages, and traffic conditions. While driving, the vehicle had to obey the rules of the road and properly interact with human-driven and autonomous traffic. Rules also required the vehicle to stay in its lane and react safely to other vehicles by matching speeds or passing. Additionally, it needed to obey right-of-way rules at intersections and drive safely and defensively, avoiding both static and dynamic obstacles at speeds of up to 30 mph.
Our team, Team Victor Tango, had only 12 months to develop a vehicle to meet this unprecedented challenge. We divided the problem into four major parts: base platform, perception, planning, and communications.
Each part took advantage of the capabilities of National Instruments hardware and software. NI hardware was instrumental in interfacing with the vehicle’s existing systems and providing interfaces for a human operator. We used the LabVIEW graphical programming environment to develop software, including the communications architecture, the sensor processing and object recognition algorithms, the laser range finder and vision-based road detection, the higher-level driving behaviors, and the low-level vehicle interface.
Odin is a 2005 Ford Escape Hybrid modified for autonomous operation. An NI CompactRIO system interfaces with the Escape’s systems to enable drive-by-wire control of the throttle, steering, shifting, and braking. Our team used LabVIEW and the LabVIEW Control Design and Simulation Module to develop path curvature and speed control systems that we deployed to CompactRIO using the LabVIEW Real-Time and LabVIEW FPGA modules, creating a stand-alone vehicle platform. We used the LabVIEW Touch Panel Module to create a user interface for the NI TPC-2006 touch panel computer, which we installed in the dashboard.
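To illustrate the kind of closed-loop speed control deployed to the CompactRIO, here is a minimal PI controller sketch in Python. The gains, time step, and actuator range are invented for illustration; Odin's actual controller was built graphically in LabVIEW with the Control Design and Simulation Module.

```python
# Hypothetical sketch of a drive-by-wire speed controller; gains and
# timing are illustrative assumptions, not Odin's actual values.

class SpeedController:
    """Simple PI controller mapping a speed error to a throttle/brake command."""

    def __init__(self, kp=0.5, ki=0.1, dt=0.02):
        self.kp, self.ki, self.dt = kp, ki, dt
        self.integral = 0.0

    def update(self, target_speed, measured_speed):
        error = target_speed - measured_speed
        self.integral += error * self.dt
        command = self.kp * error + self.ki * self.integral
        # Positive commands drive the throttle, negative commands the brake;
        # clamp to a normalized actuator range.
        return max(-1.0, min(1.0, command))

controller = SpeedController()
cmd = controller.update(target_speed=13.4, measured_speed=10.0)  # speeds in m/s
```

In practice such a loop runs at a fixed rate on the real-time target, with the FPGA handling the low-level actuator signals.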
To fulfill the behavioral requirements of the Urban Challenge, Odin needed to be able to localize its position, detect the surrounding road coverage and legal travel lanes, perceive all obstacles in its path, and appropriately classify obstacles as vehicles. A number of sensors enabled Odin to meet these requirements, including three IBEO four-plane laser range finders (LRFs) at bumper level, four SICK LRFs and two computer vision cameras on the roof rack, and a high-accuracy Novatel GPS/IMU system.
For each perception requirement, we used multiple sensors to achieve maximum fidelity and reliability. To keep sensor fusion flexible, the planning software never consumes raw sensor data; instead, it operates on a set of sensor-independent perception messages generated by task-specific components. The localization component contains a LabVIEW Kalman filter that tracks vehicle position and orientation. The road detection component uses the NI Vision Development Module to combine camera and LRF data to determine a road coverage map and the position of each lane in nearby segments. The object classification component uses LabVIEW to process IBEO data to detect obstacles and classify them as either static or dynamic; the dynamic obstacle predictor then predicts the paths and actions of other vehicles.
The planning software on Odin uses a hybrid deliberative-reactive model dividing upper-level decisions and lower-level reactions into separate components. These components run concurrently at independent rates, making it possible for the vehicle to react to emergency situations without needing to re-plan an entire route. Splitting the decision making into separate components enables each system to be tested independently and fosters parallel development, which was necessary given the short timeline of the Urban Challenge.
The route planner component uses an A* search algorithm to determine which road segments the vehicle should use to reach all checkpoints. The driving behaviors component uses a behavior-based LabVIEW state machine architecture responsible for obeying the rules of the road and guiding the vehicle along the planned route. The motion-planning component performs an iterative trajectory search to avoid obstacles and guide the vehicle along the desired route. The system then passes motion profiles to the vehicle interface to be translated into actuator control signals.
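A minimal A* search over a toy road graph shows the kind of computation the route planner performs. The graph, travel-time costs, and heuristic values below are invented; Odin's planner searched the actual DARPA road network definition.

```python
# Minimal A* over a toy road graph; the network and costs are made up.

import heapq

def a_star(graph, heuristic, start, goal):
    """graph: {node: [(neighbor, travel_time), ...]}; returns a node path."""
    frontier = [(heuristic[start], 0.0, start, [start])]
    visited = set()
    while frontier:
        _, cost, node, path = heapq.heappop(frontier)
        if node == goal:
            return path
        if node in visited:
            continue
        visited.add(node)
        for neighbor, step in graph.get(node, []):
            if neighbor not in visited:
                g = cost + step
                heapq.heappush(
                    frontier, (g + heuristic[neighbor], g, neighbor, path + [neighbor])
                )
    return None

# Toy road network: checkpoint A to D, with a slow direct road (cost 10)
# and a faster three-segment detour (total cost 6).
roads = {"A": [("B", 2.0), ("D", 10.0)], "B": [("C", 2.0)], "C": [("D", 2.0)]}
h = {"A": 5.0, "B": 3.0, "C": 1.0, "D": 0.0}
print(a_star(roads, h, "A", "D"))  # -> ['A', 'B', 'C', 'D']
```

The real planner's edge costs would reflect speed limits and known blockages, so the cheapest path is the fastest legal route rather than the shortest one.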
We developed our entire communications framework using LabVIEW. We implemented the SAE AS-4 Joint Architecture for Unmanned Systems (JAUS) protocol, enabling automated, dynamic configuration and enhancing the future reusability and commercialization potential of Urban Challenge software. Our team also implemented each software module as a JAUS component with all interactions between modules occurring through this LabVIEW framework. Each software module operates as a stand-alone component that can run asynchronously on either the Windows OS or the Linux® OS. With this communications backbone, interfacing or reusing software modules written in LabVIEW with software modules written in other languages is trivial.
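The component-addressed messaging idea can be sketched as small typed datagrams over UDP: each module has an ID and exchanges self-describing messages, so the sender never cares what language the receiver is written in. The header layout and IDs below are simplified stand-ins, not the real SAE AS-4 JAUS wire format.

```python
# Rough sketch of JAUS-style component messaging over UDP loopback.
# The 5-byte header (source id, destination id, message type) is an
# invented simplification of the actual JAUS message format.

import socket
import struct

HEADER = struct.Struct("!HHB")  # source id, destination id, message type

def send_message(sock, addr, src, dst, msg_type, payload):
    sock.sendto(HEADER.pack(src, dst, msg_type) + payload, addr)

def recv_message(sock):
    data, _ = sock.recvfrom(4096)
    src, dst, msg_type = HEADER.unpack_from(data)
    return src, dst, msg_type, data[HEADER.size:]

# Loopback demo: "perception" (id 10) reports an obstacle to "planning" (id 20).
rx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
rx.bind(("127.0.0.1", 0))
tx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
send_message(tx, rx.getsockname(), src=10, dst=20, msg_type=1, payload=b"obstacle")
src, dst, mtype, body = recv_message(rx)
tx.close(); rx.close()
```

Because the wire format, not a language binding, defines the interface, a LabVIEW component and a C++ component can exchange these messages interchangeably, which is the reuse property the JAUS backbone provided.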
Benefits of LabVIEW
LabVIEW provided a successful programming environment for our team for several reasons. With a team composed mostly of mechanical engineers, LabVIEW enabled the development of advanced, high-level perception and planning algorithms by programmers without computer science backgrounds. Furthermore, easy interaction between LabVIEW and hardware enhanced the ability to implement the time-critical processing crucial for sensor processing and vehicle control.
LabVIEW also provided an intuitive and easy-to-use debugging environment so we could run and monitor source code in real time for easy hardware-in-the-loop debugging. The LabVIEW environment enabled the team to maximize testing time and promoted rapid prototyping and a greater number of design cycles. Given the very short timeline for the Urban Challenge and the unique nature of the problem, these abilities played a critical role in the team’s overall success.
We successfully used LabVIEW and NI hardware to develop an autonomous vehicle capable of completing the Urban Challenge, a never-before-attempted problem in robotics. Odin placed third overall, just minutes behind the leaders.
Linux® is the registered trademark of Linus Torvalds in the U.S. and other countries.