Building a Semiautonomous Vehicle Driven by the Visually Impaired with LabVIEW and CompactRIO
Blind 15-year-old Ishaan Rostogi (pictured in the driver's seat) drives the world's first blind driver vehicle, developed by Virginia Tech using NI technologies.
"With limited funding and development time, NI products played a vital role in the project's success by providing an easy-to-use and cost-effective prototyping platform."
- Dr. Dennis Hong,
Virginia Polytechnic Institute and State University
The Challenge: Developing a semiautonomous vehicle that allows a blind driver to navigate, control speed, and avoid collisions on a closed driving course.
The Solution: Using NI CompactRIO hardware and LabVIEW software to build and program the world's first functional prototype of a blind driver vehicle.
Dr. Dennis Hong - Virginia Polytechnic Institute and State University
Greg Jannaman - Virginia Polytechnic Institute and State University
Kimberly Wenger - Virginia Polytechnic Institute and State University
In an effort to promote the often underestimated capabilities of the blind and to inspire innovation in the development of blind access technologies, the National Federation of the Blind proposed a challenge to design a system capable of providing the blind with an experience never thought to be possible: the ability to drive. The Robotics and Mechanisms Laboratory (RoMeLa) at Virginia Tech has been the only organization to accept the challenge. Established in 2008 as a senior design team and undergraduate research project within the Department of Mechanical Engineering, the Virginia Tech Blind Driver Challenge (BDC) defined the initial goals for the world’s first working prototype of a blind driver vehicle.
With only two semesters, nine undergraduate students, and $3,000 USD in seed funding, the team aimed to enable a blind driver to safely perform three fundamental driving tasks: navigate a curved course defined by a single lane of traffic cones, regulate speed within a predefined limit, and stop quickly enough to avoid colliding with an obstacle.
Our Prototyping Platform
NI products have served as the sole hardware and software platform for the blind driver system since the project’s inception. We chose NI products because we needed a cost-effective prototyping platform; short data acquisition and processing times to minimize lag in the time-critical driving environment; compatibility with numerous sensors and devices; power and reliability in demanding testing conditions; an intuitive programming interface; and the modularity, small size, low weight, and capacity for hardware expansion needed for future development. The decision also drew on RoMeLa’s long history of success using NI products in a variety of applications, from humanoid soccer-playing robots to fully autonomous vehicles. These applications, along with the blind driver system, testify to the versatility of NI hardware and software as a prototyping platform for robotics.
The current blind driver system consists of various sensors and novel nonvisual driver interfaces attached as a modular system to a modified dune buggy. For environmental perception, a Hokuyo UTM-30LX single-plane laser rangefinder (LRF) scans the driving environment for cones and other obstacles and feeds that information to an onboard CompactRIO controller and its real-time and field-programmable gate array (FPGA) processing targets. Conveniently, existing NI device drivers support Hokuyo LRF products because NI engineers provided a custom driver before the UTM-30LX was released to the general public.
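The LRF reports each scan as an array of ranges taken at fixed angular increments. The first processing step on any such scan is converting those polar returns into Cartesian points for obstacle detection. A minimal Python sketch (the vehicle itself ran LabVIEW; the angle defaults only approximate the UTM-30LX specification):

```python
import math

def scan_to_points(ranges, angle_min=-2.094, angle_increment=0.00436,
                   max_range=30.0):
    """Convert one single-plane LRF scan (ranges in meters) to Cartesian
    (x, y) points in the sensor frame, discarding invalid returns.
    Defaults roughly follow the UTM-30LX (270-degree field of view,
    0.25-degree resolution, 30 m range) but are illustrative."""
    points = []
    for i, r in enumerate(ranges):
        if 0.1 < r < max_range:  # drop dropouts and saturated returns
            theta = angle_min + i * angle_increment
            points.append((r * math.cos(theta), r * math.sin(theta)))
    return points
```

Obstacle candidates such as traffic cones can then be found by clustering nearby points in this Cartesian set.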
A laptop running LabVIEW software provides temporary USB hosting for the CompactRIO controller because the UTM-30LX has a USB-only interface, unlike most previous models, which offered RS232. We investigated giving the real-time controller its own USB hosting capability with a third-party conversion chip to bypass the laptop; however, Ethernet communication between the CompactRIO controller and the laptop proved sufficient for the current system. The laptop also lets a sighted passenger passively monitor the operation of all hardware and software and easily modify any heuristic-based programming for quick calibration during field testing.
Additional sensors gather important information regarding the state of the vehicle, such as speed from a Hall effect sensor and steering angle from a string potentiometer. We acquire data from these sensors and process it directly using the high-speed FPGA on the CompactRIO real-time controller.
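As an illustration of this kind of vehicle-state processing, speed can be derived from the interval between consecutive Hall-effect pulses. A hedged Python sketch, where the pulse count per revolution and wheel diameter are invented example values (the real system runs such logic on the FPGA in LabVIEW):

```python
import math

def wheel_speed_mps(pulse_interval_s, pulses_per_rev=8, wheel_diameter_m=0.55):
    """Estimate vehicle speed from the time between consecutive
    Hall-effect pulses. pulses_per_rev and wheel_diameter_m are
    illustrative values, not the buggy's actual specifications."""
    if pulse_interval_s <= 0:
        return 0.0  # no pulses yet, or invalid timestamp
    revs_per_s = 1.0 / (pulse_interval_s * pulses_per_rev)
    return revs_per_s * math.pi * wheel_diameter_m  # circumference per rev
```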
Nonvisual Driver Interfaces
After collecting an image of the driving environment using the various sensors, we process the information and transmit it to the driver through nonvisual cues. The ultimate goal when developing a nonvisual driver interface (NVDI) is to effectively and efficiently provide information to a driver to maximize situational awareness and allow the driver to make quick and precise driving decisions. The array of NVDIs on the first iteration of the vehicle is a combination of informational and instructional cues for safety and redundancy.
For speed regulation, the driver can operate at a comfortable speed until reaching a maximum speed limit, at which point a vibrotactile vest on the seat belt informs the driver what degree of braking is necessary to return to a safe operating speed. If the vehicle detects an unavoidable collision with an obstacle, the vest cues the driver to stop the vehicle immediately.
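The article does not specify how overspeed maps to vibration intensity; one plausible scheme quantizes the fraction over the limit into a few discrete cue levels. A Python sketch under that assumption:

```python
def brake_cue_level(speed_mps, limit_mps, n_levels=4):
    """Quantize how far the vehicle is over the speed limit into a
    discrete vibration level: 0 = no cue, n_levels = brake hard.
    The threshold scheme is an assumption, not the team's design."""
    if speed_mps <= limit_mps:
        return 0  # comfortable speed, no vibrotactile cue
    overspeed_frac = (speed_mps - limit_mps) / limit_mps
    return min(n_levels, 1 + int(overspeed_frac * n_levels))
```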
During initial vest testing, we used a custom circuit board to control the motors: LabVIEW software on a PC sent RS232 signals to a PIC microcontroller, which drove a large bank of transistors and relays to actuate the vest motors at various intensities. After acquiring the CompactRIO controller, the NI 9485 8-channel relay module made the circuit board unnecessary. Bypassing the circuit board reduced the bulk and potential complications of additional hardware, greatly simplified the underlying software, and significantly reduced the time between obstacle detection and full motor vibration, which is critical for drivers in an emergency situation.
For steering guidance, a potential field algorithm handles path generation. After calculating a path, the system instructs the driver where to steer to stay in the lane and avoid obstacles. A pair of headphones and LabVIEW text-to-speech software tell the driver how many “clicks” to turn the steering wheel; a mechanism attached to the steering column clicks every five degrees to provide precise audible feedback.
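The article does not give the team's gains or tuning, but a textbook potential-field steering step looks like the following Python sketch: an attractive force toward a goal point, repulsive forces from detected cones, and conversion of the resulting correction into the spoken click count (all constants are illustrative):

```python
import math

def potential_field_heading(obstacles, goal, k_att=1.0, k_rep=0.5,
                            influence_m=3.0):
    """One potential-field step: the goal attracts, and each obstacle
    within influence_m meters repels with strength falling off with
    distance. Returns the desired heading in degrees (0 = straight
    ahead, positive = left). Gains are illustrative, not tuned values."""
    fx, fy = k_att * goal[0], k_att * goal[1]
    for ox, oy in obstacles:
        d = math.hypot(ox, oy)
        if 0.0 < d < influence_m:
            fx -= k_rep * ox / d**3  # push away from the obstacle
            fy -= k_rep * oy / d**3
    return math.degrees(math.atan2(fy, fx))

def steering_clicks(desired_deg, current_deg, click_deg=5.0):
    """Convert a steering correction into the click count spoken to the
    driver; the steering-column mechanism clicks every 5 degrees."""
    clicks = round((desired_deg - current_deg) / click_deg)
    return clicks, ("right" if clicks > 0 else
                    "left" if clicks < 0 else "hold")
```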
Additionally, we developed a prototype for a tactile map, conceptually similar to a high-resolution grid of refreshable braille, that places an image of the surrounding environment literally in the hands of the driver. Similar to the tiny holes on an air hockey table, the physical map is generated by passing compressed air through small pixels to depict the surrounding obstacles detected by the laser rangefinder. This device, appropriately named AirPix, allows the driver to “see” the surroundings and navigate safely through them. The audio and vibrotactile NVDIs remain necessary for redundancy, but tapping the driver’s high-bandwidth sense of touch through this tactile map frees data pathways for other driving uses, such as listening to and interacting with a GPS through voice-recognition software for higher-level path planning.
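Driving a display like AirPix amounts to rasterizing the LRF's obstacle points onto a coarse grid, one boolean per air-jet pixel. The real grid geometry is not specified in the article; a Python sketch with invented dimensions:

```python
def airpix_grid(points, rows=8, cols=8, cell_m=0.5):
    """Rasterize Cartesian obstacle points (x forward, y left, meters)
    onto a coarse grid of tactile pixels, one boolean per air jet.
    Row 0 is the band nearest the vehicle; column cols // 2 is straight
    ahead. Grid size and resolution are invented for illustration."""
    grid = [[False] * cols for _ in range(rows)]
    for x, y in points:
        r, c = int(x / cell_m), cols // 2 + int(y / cell_m)
        if 0 <= r < rows and 0 <= c < cols:  # ignore points off the map
            grid[r][c] = True
    return grid
```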
Benefits of NI Hardware and Software
Using NI hardware and software, we created the world’s first working prototype of a blind driver vehicle. With limited funding and development time, NI products played a vital role in the success of the project by providing an easy-to-use and cost-effective prototyping platform. The intuitive graphical programming interface of LabVIEW made it easy for a team of undergraduate mechanical engineering students to quickly and efficiently create custom embedded software without expertise in a specific text-based programming language.
The modular design and large capacity of CompactRIO for additional I/O modules combined with the extensive compatibility of LabVIEW with external devices ensured that future expansion and improvements to the system would be possible with minimal effort or cost. The real-time and FPGA processing targets provided the high-speed data acquisition and processing power necessary to gather essential data from the time-critical driving environment. In addition to the internal capabilities, the convenient size and low weight of CompactRIO were ideal for the limited space and payload capacity on the current blind driver buggy.
Throughout the highly iterative prototyping process, NI modular products made it easy to adapt to unique and demanding testing environments, changing vehicle platforms, and shifting project objectives. With versatile NI hardware and software, the Virginia Tech Blind Driver Challenge continues to “invent the future” in blind access technology.
Spin-Off Technologies and Future Plans
In the months following the 2008 to 2009 academic year, the Virginia Tech Blind Driver Challenge provided more than 30 blind and visually impaired people of all ages from around the country with the opportunity to drive a vehicle. Whether it was their first time behind the wheel or a long-awaited reunion with an automobile, their reactions were overwhelmingly positive and filled with hope. The resulting national and international media coverage is raising tremendous awareness of the capabilities of the blind and generating interest in collaboration on the research and development of novel blind access technologies in various applications.
The numerous potential spin-off technologies were a major emphasis throughout the design process. If these devices prove sufficient to allow a blind person to drive a vehicle, we can imagine the benefits for drivers who have poor vision, are talking or texting on a cell phone, are drowsy, or are otherwise distracted on the road. We could create early warning devices and collision mitigation systems for all driving environments, especially in bad weather or low-visibility conditions.
Beyond automotive applications, there is also potential for advances in haptic human interface devices, especially for blind pedestrians. The nonvisual interfaces could also be deployed in aircraft cockpits, where current technology relies heavily on the pilot's visual capabilities. Distributing high-bandwidth information from the saturated visual channel across the other senses could greatly increase a pilot's situational awareness, a critical aspect of operating any vehicle.
Even though we may not see blind drivers on the road for many years, the potential spin-off technologies are suitable for immediate use in countless applications.