Scoring Object Recognition in Rats with LabVIEW and NI Vision to Eliminate Errors and Subjectivity


"Programming in the highly flexible LabVIEW environment gave VI Technologies the ability to enhance the application with specific features and improvements to aid in automatic scoring."

- Jeffrey Habets, VI Technologies

The Challenge:
Scoring object recognition in rats more objectively and with fewer errors than traditional methods.

The Solution:
Developing an application with NI LabVIEW and the NI Vision Development Module to perform automated scoring that offers accurate, reliable results.

Jeffrey Habets - VI Technologies
Huub Hamers - Maastricht University

NI Silver Alliance Partner VI Technologies in Weert, Netherlands, provides technical automation services and specializes in the object-oriented design and implementation of automated test and measurement systems with LabVIEW and NI TestStand. VI Technologies implemented the following solution, which had been prototyped by Maastricht University and the University of Sydney, as a user-friendly LabVIEW application.

Object Recognition Task Background

The object recognition task (ORT) is a popular behavioral test used in neuroscience research to examine animal memory performance. The ORT was first described by Ennaceur et al. (1988) and is a preferred test method among researchers studying animal behavior and the effects of different chemicals and medications on memory. However, a major drawback of the ORT is that scoring is mostly done by hand, which introduces a degree of human error and subjectivity.

Researchers at Maastricht University and the University of Sydney developed an automatic scoring algorithm that reliably tracks the nose of rats for an experiment in observing memory recall. The research team designed a closed-object arena and placed a number of different objects in it.

Next, they placed a rat in the object arena for three minutes. As the rat sniffs the objects, lab personnel measure the amount of time the rat’s nose is less than 2 cm from each object. After the rat is removed from the arena, lab personnel replace the objects with different items. An hour later, the rat is placed back in the arena to repeat the exercise. If the rat remembers specific features of the objects, it will spend more time sniffing the new objects. If the repeated test is done 24 hours after the original test, the rat will have no recollection of the location or features of the objects.

Measuring the time that the rat’s nose is within a certain distance of the objects provides a comparative measure of memory recall. In this way, researchers can compare the test scores of rats given medication to those of rats that did not receive it, which helps in judging the effectiveness of drugs developed to reduce the memory loss associated with diseases such as Alzheimer’s.
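Such comparisons are commonly summarized as a discrimination index: the share of extra exploration time spent on the novel object. The sketch below is illustrative only; the source does not specify which metric the researchers used, and the function name and signature are assumptions.

```python
def discrimination_index(t_novel: float, t_familiar: float) -> float:
    """Fraction of extra exploration time spent on the novel object.

    Both arguments are exploration times in seconds. The result ranges
    from -1 (only the familiar object explored) to +1 (only the novel
    object explored); 0 means no preference, i.e. no recall.
    """
    total = t_novel + t_familiar
    if total == 0:
        raise ValueError("no exploration recorded in this trial")
    return (t_novel - t_familiar) / total
```

A rat that spends 30 s on the novel object and 10 s on the familiar one scores (30 − 10) / 40 = 0.5, indicating clear recall of the familiar object.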

Tracking the Rat’s Nose

To track the rat’s nose, the algorithm first had to compute the rat’s center of gravity (COG). Before this could be done, the team had to segment the tail from the body by marking the base of the white rat’s tail with a black marker. This allowed them to use a simple particle analysis to isolate the body from the tail before computing the position of the rat’s COG and nose. Then, by iteratively computing the line of maximum intercept crossing the center of mass, they could determine the exact position of the rat’s nose.
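The actual application implements this in LabVIEW with the Vision Development Module, but the geometric idea can be sketched in Python with NumPy. Note the simplification: with the tail masked out, the body pixel farthest from the COG approximates the snout, standing in for the iterative maximum-intercept search described above. Function names and the input format are assumptions.

```python
import numpy as np

def centre_of_gravity(body: np.ndarray) -> tuple[float, float]:
    """Mean pixel position of a 2-D boolean body mask (tail removed)."""
    ys, xs = np.nonzero(body)
    return float(ys.mean()), float(xs.mean())

def nose_position(body: np.ndarray) -> tuple[int, int]:
    """Approximate the nose as the body pixel farthest from the COG.

    With the tail segmented away, the most distant extremity of the
    remaining blob is the snout.
    """
    ys, xs = np.nonzero(body)
    cy, cx = ys.mean(), xs.mean()
    dist_sq = (ys - cy) ** 2 + (xs - cx) ** 2
    i = int(np.argmax(dist_sq))
    return int(ys[i]), int(xs[i])
```

For an elongated blob, `nose_position` returns one of its two extremities; in the real system the tail-end candidate is already excluded by the particle analysis step.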

Automating the ORT

Based on the algorithm developed by the universities, VI Technologies and researchers at Maastricht University developed an imaging system that automates the task of judging memory recall in rats. First, the team placed a monochrome analog camera approximately 250 cm from the arena so that it covered the full field of view. The team then digitized images from the camera into a PC using an NI PCI-1411 analog frame grabber, and acquired and processed them with NI LabVIEW and the NI Vision Development Module.

Prior to starting a trial, the researcher can adjust brightness, contrast, and thresholding levels, and define ROIs for the arena and the objects in it. As images are captured, LabVIEW runs a series of algorithms to compute the position of the rat’s nose and the duration of its proximity to each object. The arena mask is first applied to localize the region of interest. A reference image of the arena, taken before the rat is introduced, is then subtracted to eliminate the background and objects, so that the rest of the algorithm operates only on the rat’s image. When brown rats are used instead of white ones, the solution offers an optional contrast inversion algorithm that allows a brown rat to be tracked. Finally, the image is thresholded to simplify further processing.
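The preprocessing chain described above (arena mask, reference subtraction, optional inversion, thresholding) can be sketched with NumPy as follows. This is a minimal stand-in for the NI Vision pipeline: the function name, the fixed default threshold, and the 8-bit grayscale input format are assumptions, and in the real application these parameters are set interactively in the GUI.

```python
import numpy as np

def segment_rat(frame: np.ndarray, reference: np.ndarray,
                arena_mask: np.ndarray, threshold: int = 40,
                invert: bool = False) -> np.ndarray:
    """Return a boolean mask of the rat.

    frame, reference -- 8-bit grayscale images of identical shape;
                        reference is the empty arena.
    arena_mask       -- boolean ROI mask of the arena.
    invert           -- True for brown (dark) rats on a light floor.
    """
    img = 255 - frame if invert else frame
    ref = 255 - reference if invert else reference
    # Subtract the empty-arena reference to remove background and objects.
    diff = img.astype(np.int16) - ref.astype(np.int16)
    diff = np.clip(diff, 0, 255).astype(np.uint8)
    # Keep only pixels inside the arena ROI, then threshold.
    diff[~arena_mask] = 0
    return diff > threshold
```

Subtracting in a signed type before clipping avoids the unsigned wrap-around that would otherwise turn slightly darker pixels into bright noise.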

After the animal is introduced into the arena, the NI hardware and software combination automatically tracks the position of its nose and displays scores for nose proximity and the number of times a specific object was visited through the graphical user interface.
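Given the per-frame nose positions, the proximity time and visit counts follow from simple bookkeeping, sketched below. The 25 fps frame period is an assumption (the source does not state the camera’s frame rate), as are the function name and data layout; the 2 cm radius comes from the protocol described earlier.

```python
import math

def score_trial(nose_track, objects, frame_period=0.04, radius_cm=2.0):
    """Accumulate exploration time and visit counts per object.

    nose_track   -- list of (x, y) nose positions in cm, one per frame.
    objects      -- dict mapping object name to its (x, y) centre in cm.
    frame_period -- seconds per frame (0.04 s assumes 25 fps video).
    A visit is counted each time the nose enters an object's 2 cm zone.
    """
    times = {name: 0.0 for name in objects}
    visits = {name: 0 for name in objects}
    inside = {name: False for name in objects}
    for x, y in nose_track:
        for name, (ox, oy) in objects.items():
            near = math.hypot(x - ox, y - oy) < radius_cm
            if near:
                times[name] += frame_period
                if not inside[name]:
                    visits[name] += 1  # nose just entered the zone
            inside[name] = near
    return times, visits
```

Tracking the previous in-zone state per object is what distinguishes a visit count from raw frame counting: re-entering the zone increments the count, lingering in it does not.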

By making good use of the capabilities of modern multicore systems as well as LabVIEW and the Vision Development Module, the team achieved the desired specifications for acquisition rates and ran a truly automated scoring system free of the errors common in manual scoring.

In addition, programming in the highly flexible LabVIEW environment gave VI Technologies the ability to enhance the application with specific features and improvements to aid automatic scoring. These advanced features include session/trial management, saving and loading of different arena configurations and object ROIs, and recording and playback of test footage.

Author Information:
Jeffrey Habets
VI Technologies

