Using LabVIEW to Prototype and Validate LED Glasses That Assist the Visually Impaired


"I have been an avid LabVIEW developer for more than 10 years, and have found that no other ADE offers such fast and flexible software development and debugging. Furthermore, the prewritten vision functions were easy to use and programmatically efficient, which was imperative for our requirements."

- Stephen Hicks, Oxford University

The Challenge:
Improving the quality of life and promoting the independence of people with serious visual impairments.

The Solution:
Using NI LabVIEW software, the NI Vision Development Module, and the NI USB-8451 interface to prototype and validate innovative, technology-based techniques that support the sight of visually impaired people.

Author(s):
Stephen Hicks - Oxford University
Luis Moreno - Oxford University

What Does the Clinical Term “Blindness” Really Mean?

It is a common misconception that blindness refers to the complete inability to see. The World Health Organization (WHO) defines blindness as severe sight loss that prevents someone from discerning how many fingers are held up at a distance of 3 m, even with the use of glasses or lenses. Therefore, someone who is registered as blind may still have some degree of residual vision, and most can detect changes in contrast.

Promoting Sight for the Visually Impaired

Our team of scientists at the University of Oxford's Department of Clinical Neurosciences is developing innovative visual prosthetics: electronic aids that support sight for visually impaired people.

We are currently performing a trial of novel techniques that use the individual's ability to sense changes in contrast. We acquire video feeds from head-mounted cameras and process the image data to detect nearby objects of interest, such as people, signposts, or obstacles to navigate. The detected objects are simplified and displayed back to the user via banks of LEDs attached to a head-mounted display. Using a small number of LEDs, we can indicate the position and class of objects in the immediate vicinity of the person wearing the device.
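The core of this display step is a mapping from a detected object's position in the camera frame to a cell in a much coarser LED grid. A minimal sketch of that mapping, written in Python purely for illustration (the actual system is implemented in LabVIEW, and the 16 x 8 grid geometry here is an assumption, not the real hardware layout):

```python
def object_to_led(bbox, frame_w, frame_h, grid_w=16, grid_h=8):
    """Map a detected object's bounding box (x, y, w, h) in camera
    coordinates to a (row, col) cell in a low-resolution LED grid.
    The 16 x 8 grid is illustrative, not the device's actual layout."""
    x, y, w, h = bbox
    cx, cy = x + w / 2, y + h / 2          # centre of the object
    col = min(int(cx / frame_w * grid_w), grid_w - 1)
    row = min(int(cy / frame_h * grid_h), grid_h - 1)
    return row, col

# A person detected near the upper left of a 640x480 frame lights an
# LED towards the top-left corner of the grid.
print(object_to_led((40, 30, 120, 240), 640, 480))
```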

Ultimately, we hope to build this technology into a pair of electronic glasses, which we affectionately named Smart Specs. These glasses will give visually impaired individuals more independence by helping them identify nearby objects and navigate their surroundings. When put into series production, Smart Specs should cost about the same as a modern smartphone, a far less expensive option than fully training a guide dog.

Creating Prosthetic Simulations to Trial our Techniques

We began by simulating the experience of a retinal prosthetic to explore ways to improve the degree of useful information in the low-resolution implanted displays. We developed the simulation software using LabVIEW and the NI Vision Development Module. The module gave us ready-to-run vision analysis functions and drivers for acquiring, displaying, and logging images from a multitude of camera types, so we could quickly acquire raw image data with little development effort. We published the method and results (van Rheede, Kennard, and Hicks, Journal of Vision, 2010).

This first study suggested ways to use computer vision to simplify the important elements of a video stream and produce a bright, low-resolution display that may be useful to people with only the smallest amount of light perception. We then began the present study based entirely in LabVIEW, NI-IMAQ, and Vision.

We used the following stages of development to create our system:

  1. Simulate registered blindness.
  2. Develop real-time image enhancement such as edge detection and contrast enhancement.
  3. Develop real-time object detection and explore ways to present a simplified and bright video output suitable for a person with a serious visual impairment.
  4. Develop a fast face-detection algorithm to connect to the simplified video output.
  5. Develop real-time, orientation-independent text reading algorithms.
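To give a flavour of stage 2, a linear contrast stretch is one of the simplest image enhancement operations we rely on. The sketch below is in Python for illustration only (our system uses the LabVIEW Vision Development Module's lookup-table and enhancement functions, not this code):

```python
def stretch_contrast(image):
    """Linear contrast stretch: rescale pixel intensities so the
    darkest pixel maps to 0 and the brightest to 255. A minimal
    stand-in for the lookup-table enhancement tools we actually use."""
    flat = [p for row in image for p in row]
    lo, hi = min(flat), max(flat)
    if hi == lo:                       # flat image: nothing to stretch
        return [[0 for _ in row] for row in image]
    scale = 255 / (hi - lo)
    return [[round((p - lo) * scale) for p in row] for row in image]

# A low-contrast 2x2 grey patch is stretched to span the full range,
# making its internal structure far easier to perceive.
print(stretch_contrast([[100, 110], [120, 130]]))
```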

We ran a proof-of-principle study using the above techniques with healthy controls (under simulated blindness conditions) and with a registered blind man. Both could readily detect and identify previously unseen objects in the environment with our system.

Using functions provided by the NI Vision Development Module, we carried out a variety of inline processing algorithms on the acquired images, such as subsampling and detail reduction via Gaussian blurring. We used several prewritten analysis functions to detect objects of interest using pattern matching and optical character recognition. However, we were by no means limited to the functionality provided by the module; for example, we used functions in the colour comparison palette to create face detection algorithms.
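The detail-reduction step can be approximated by averaging blocks of pixels. The Python sketch below is a simplified stand-in for the module's Gaussian blur followed by subsampling, shown only to convey the idea:

```python
def subsample(image, factor):
    """Reduce resolution by averaging each factor x factor block of
    pixels -- a crude stand-in for Gaussian blurring followed by
    subsampling, which we actually perform with prewritten
    Vision Development Module functions."""
    h, w = len(image), len(image[0])
    out = []
    for by in range(0, h, factor):
        row = []
        for bx in range(0, w, factor):
            block = [image[y][x]
                     for y in range(by, min(by + factor, h))
                     for x in range(bx, min(bx + factor, w))]
            row.append(round(sum(block) / len(block)))
        out.append(row)
    return out

# A 4x4 image collapses to 2x2, one value per quadrant.
img = [[0, 0, 255, 255],
       [0, 0, 255, 255],
       [10, 10, 200, 200],
       [10, 10, 200, 200]]
print(subsample(img, 2))
```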

The detected objects were initially presented back to the test subject via a commercial head-mounted display (HMD), but we quickly realised we could use an improved custom-made, low-resolution display that incorporates banks of serial-interface LEDs. To integrate our custom HMD into the simulation system, we chose the NI USB-8451 I2C/SPI interface. With this device we rapidly produced a bright visual display from our object recognition software. We can address all 128 LEDs in the array much faster than the human eye can perceive.
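Driving the array amounts to serialising 128 per-LED brightness values into a byte frame and clocking it out over SPI. The sketch below shows only the packing step, in Python for illustration; the one-byte-per-LED framing is a hypothetical protocol, and the real LED drivers (and the USB-8451 API we call from LabVIEW) expect their own formats:

```python
def pack_led_frame(brightness):
    """Pack 128 per-LED brightness values (0-255) into a byte frame
    ready for an SPI transfer. The framing here -- one byte per LED,
    no header -- is a hypothetical protocol for illustration; real
    serial-interface LED drivers define their own frame formats."""
    if len(brightness) != 128:
        raise ValueError("expected 128 LED values")
    return bytes(max(0, min(255, b)) for b in brightness)

# Light only the last LED in the array at full brightness.
frame = pack_led_frame([0] * 127 + [255])
print(len(frame), frame[-1])
```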

To further augment our object detection algorithms, we use a 3D gyroscopic sensor to stabilise the acquired images. The integration of the gyroscope was again handled by the USB-8451, which we used to pull data from the sensor at high speed over I2C.
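The essence of gyro-based stabilisation is converting the angular rate measured during one frame interval into a compensating pixel shift. A minimal sketch in Python, assuming a pinhole camera model; the 500 px focal length and frame timing below are illustrative numbers, not our calibration:

```python
import math

def gyro_pixel_shift(rate_dps, dt, focal_px):
    """Convert a gyroscope's angular rate (degrees/second), integrated
    over one frame interval dt (seconds), into a horizontal pixel
    shift that counteracts head rotation. focal_px is the camera
    focal length in pixels; all values are illustrative."""
    angle = math.radians(rate_dps * dt)    # rotation during the frame
    return round(math.tan(angle) * focal_px)

# A 30 deg/s head turn over a 33 ms frame with a 500 px focal length
# calls for a shift of a few pixels to keep the scene steady.
print(gyro_pixel_shift(30.0, 0.033, 500))
```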

Benefits of the National Instruments Solution

By using the USB-8451 interface to simultaneously acquire data from the gyroscope (I2C) and control the LEDs (SPI), we minimised our hardware requirements. This both simplified system development and saved us money. We considered alternative serial interface devices from other vendors, but the easy integration of the USB-8451 interface with our software steered us toward NI. Also, as is typical of NI hardware, the USB-8451 driver installation included useful example code that accelerated our software development.

As for the application development environment (ADE), we did not consider using anything but LabVIEW for creating our simulation system software. I have been an avid LabVIEW developer for more than 10 years, and have found that no other ADE offers such fast and flexible software development and debugging. Furthermore, the prewritten vision functions were easy to use and programmatically efficient, which was imperative for our requirements.

A Vision for the Future

There are endless possibilities for future iterations of this technology. We could use coloured LEDs to feed different information to the wearer so they can differentiate between important objects, such as people and road signs. We could also establish the proximity of detected objects by controlling the brightness of the LED array.

We believe that we could further improve our optical character recognition routines, enabling the technology to distinguish newspaper headlines from a video image before reading them back to the wearer through integrated earphones. Similarly, we could implement barcode identification algorithms, which already exist as part of the NI Vision Development Module, to identify products and download prices that could be read back to the wearer.

Conclusion

We are now starting our first full clinical trial of this new method. Although still in the early stages of development, our innovative techniques stand to revolutionise the way we support sight in the visually impaired.

As previously mentioned, we have grand plans for future iterations of the technology. By placing LabVIEW at the heart of our simulation system and utilising maintainable software architectures, the process of scaling our system to integrate future innovations will be kept simple and cost-effective.

Links

http://www.neuroscience.ox.ac.uk/directory/stephen-l-hicks

http://www.ox.ac.uk/media/science_blog/110705.html

Author Information:
Stephen Hicks
Oxford University
Department of Clinical Neuroscience, Level 6 West Wing, John Radcliffe Hospital
Oxford
United Kingdom
stephen.hicks@clneuro.ox.ac.uk


