Monitoring Activity in the Brain’s Visual Cortex in Virtual Reality Environments

"The biggest advantage of this system compared to other approaches lies in its flexibility; equipment and drivers from various manufacturers can be easily integrated to add functionality and optimize performance. "

- Jasper Poort, Department of Neuroscience, Physiology & Pharmacology, University College London

The Challenge:

Developing a system that can record the activity of large populations of neurons in the visual cortex of a mouse while the animal actively interacts with its environment, in order to understand how the brain selectively processes behaviourally relevant visual objects.

The Solution:

Using LabVIEW and NI DAQ hardware to create an integrated system in which one component controlled and acquired data from a high-speed two-photon microscope and another controlled a virtual reality environment and acquired behavioural data, enabling us to study how activity in the visual cortex changes as animals learn visually guided tasks.

Author(s):

Jasper Poort - Department of Neuroscience, Physiology & Pharmacology, University College London
Adil Khan - Biozentrum, University of Basel
Julija Krupic - University College London
Marius Bauza - University College London
Thomas Mrsic-Flogel - Biozentrum, University of Basel
Sonja Hofer - Biozentrum, University of Basel

 

Vision is the dominant sense in humans. Around 25 percent of the human brain is dedicated to processing visual information. Neurons in these areas need to selectively process what is most relevant, because our brain is not capable of processing all available visual stimuli in the environment. Our research focuses on gaining a better understanding of the basic mechanisms that underlie this selective processing and how it guides our behaviour. This could increase our understanding of neurological and psychiatric disorders, such as visual neglect after stroke, ADHD, and schizophrenia, which involve deficiencies in filtering sensory inputs.

 

We use the visual cortex of mice as a model system for three reasons. First, sophisticated genetic research tools are uniquely available in mice. For example, we can label neurons with genetically encoded ‘calcium indicators’ to measure the activity levels of individual neurons. Second, in many respects, the organization and function of the mouse visual cortex are similar to those of our own visual cortex. Third, mice can engage in complex visual behaviour that mirrors many features of human behaviour.

 

We carried out the work described here at University College London and at the Biozentrum in Basel, in the lab of Tom Mrsic-Flogel, in close collaboration with the research groups of Georg Keller at the Friedrich Miescher Institute in Basel, and John O’Keefe at UCL. All studies that involved mice were carried out in accordance with institutional animal welfare guidelines and licensed by the UK Home Office.

 

 

Introducing Our Experimental Setup

We wanted to study how visual responses of groups of neurons change when animals learn the behavioural importance of visual objects. We therefore used a high-speed two-photon microscope to measure neural activity in the visual cortex of mice.

 

Recent studies have demonstrated that responses in the visual cortex during active behaviour differ radically from those during passive viewing, which makes it essential to record activity while animals are engaged in interactive behavioural tasks. We therefore used a virtual reality (VR) environment, which permits two-photon microscopy and precise control of the visual stimulus while mirroring real-world situations in which animals actively interact with their surroundings. This approach increases the complexity of the experiments because we must monitor multiple behavioural variables and use them to update the visual stimulation. It was thus vital to build a system with fully integrated data acquisition and VR (Figure 1). LabVIEW inherently integrates hardware, data acquisition, and control in a single software environment, which made it a practical choice.

 

 

High-Speed Two-Photon Calcium Imaging Microscope

In two-photon microscopy, a powerful laser excites fluorescent molecules in neurons that are labelled with calcium indicators. When these molecules are excited, they emit light that sensitive detectors (photomultiplier tubes) can record. The amount of emitted light reflects how strongly a neuron is activated.
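
To make the link between emitted light and activity concrete: calcium-imaging signals are conventionally expressed as the relative fluorescence change, ΔF/F, around a baseline. The Python sketch below illustrates only that standard convention; it is not our acquisition or analysis code, and the percentile-based baseline estimate is just one common choice.

```python
import numpy as np

def delta_f_over_f(trace, baseline_percentile=10):
    """Express a raw fluorescence trace as dF/F.

    The baseline F0 is estimated as a low percentile of the trace,
    a common choice when a neuron is silent most of the time.
    """
    f0 = np.percentile(trace, baseline_percentile)
    return (trace - f0) / f0

# Synthetic example: a transient riding on a baseline of 100 a.u.
trace = np.full(200, 100.0)
trace[80:100] += 50.0                 # simulated calcium transient
print(delta_f_over_f(trace).max())    # 0.5, i.e. 50% above baseline
```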

 

With the help of Georg Keller, we built a custom microscope (Figure 2). This meant we could create custom software for the setup and integrate the virtual reality environment. We also had the flexibility to change components to improve the performance of the microscope.

 

Figure 3 illustrates the setup. A key characteristic is that we used a resonant scanner to increase the speed at which the laser beam moves across the brain. The beam is steered by two mirrors: a galvanometric scanner that moves stepwise and a resonant scanner that oscillates at a high frequency. Increasing the scanning speed both improves the temporal resolution of recorded single-neuron responses and increases the number of cells we can record from, so we can better study coordinated cell activity. Another advantage of fast acquisition rates (32 Hz in our setup) is improved correction of brain motion, which is especially important when imaging behaving animals.


Emitted light, reflecting the mouse’s neural activity, travels back through the microscope objective before being directed through red and green filters to the photomultiplier tubes, which detect the amount of light. The detected light signals are then amplified and fed into the NI PXIe-7965R FlexRIO FPGA module through the NI 5761 front end adapter module. In parallel, an optical sensor constantly measures the speed at which the mouse runs on the circular treadmill; this speed signal controls the animal’s position in the VR environment.

 

The microscope included a number of instruments that the user needed to control. Some, such as the beam splitter, were controlled over a USB connection through the NI-VISA API; others, such as the scanners, were controlled with digital and analogue signals from the NI DAQ cards through the NI-DAQmx API.
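
Our control code is written in LabVIEW, but the same division of labour can be sketched in text form using NI’s Python APIs (pyvisa for VISA, nidaqmx for DAQmx). The resource address and channel names below are hypothetical placeholders, not our actual configuration.

```python
import pyvisa    # Python binding for the VISA instrument-control standard
import nidaqmx   # NI's Python API for NI-DAQmx

# USB instrument control via VISA (the resource address is a placeholder).
rm = pyvisa.ResourceManager()
beam_splitter = rm.open_resource("USB0::0x1234::0x5678::INSTR")
print(beam_splitter.query("*IDN?"))   # ask the instrument to identify itself

# Analogue control signal via DAQmx, e.g. a scanner command voltage
# (device and channel names are placeholders).
with nidaqmx.Task() as ao_task:
    ao_task.ao_channels.add_ao_voltage_chan("Dev1/ao0")
    ao_task.write(1.5)                # drive the analogue output to 1.5 V
```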

 

The FlexRIO module receives a trigger input from the 12 kHz resonant scanner during each scan period; this trigger steps the position of the galvanometric scanner mirror and allows fast preprocessing of the incoming signals. Data is then down-sampled, written to disk, and displayed to the user. With this setup, we typically measure activity from around 75 neurons at a frame rate of 32 Hz. Because there is a trade-off between the frame rate and the area that can be imaged, we can also image at lower frame rates to increase the imaged area of the brain and the number of neurons recorded.
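
This trade-off follows directly from the scan-timing arithmetic: the frame rate is the scanner’s line rate divided by the number of lines per frame. A back-of-the-envelope sketch (assuming, for simplicity, one recorded line per resonant-scanner period; bidirectional scanning would double the effective line rate):

```python
# Scan-timing arithmetic: frame rate = line rate / lines per frame.
LINE_RATE_HZ = 12_000          # 12 kHz resonant scanner

def frame_rate(lines_per_frame, line_rate_hz=LINE_RATE_HZ):
    return line_rate_hz / lines_per_frame

print(frame_rate(375))   # 32.0 Hz, the rate quoted above
print(frame_rate(750))   # 16.0 Hz: double the lines (area), half the rate
```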

 

We developed the graphical user interface of the microscope with LabVIEW so users can control the data acquisition and visualize incoming data (Figure 4). One advantage of using LabVIEW is that we can easily expand the user interface to add functionality. For example, we added the ability to follow the activity of the same population of neurons across many days by overlaying previously recorded data from the same cortical site.
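
The overlay itself is implemented in LabVIEW, but the idea behind the Figure 4 pop-up can be sketched in a few lines of Python/NumPy: place the current image in the red channel and the previously recorded image in the green channel, so structure present in both sessions appears yellow.

```python
import numpy as np

def overlay(current, previous):
    """Compose two grayscale images into an RGB comparison image.

    Current activity fills the red channel and the previously recorded
    image fills the green channel, so structure present in both sessions
    shows up yellow (red + green), as in the Figure 4 pop-up.
    """
    rgb = np.zeros(current.shape + (3,))
    rgb[..., 0] = current / current.max()     # red   = current session
    rgb[..., 1] = previous / previous.max()   # green = previous session
    return rgb

# Synthetic usage: two noisy 64x64 "frames" from different days.
rgb = overlay(np.random.rand(64, 64), np.random.rand(64, 64))
```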

 

LabVIEW’s inherent multithreading made it straightforward to integrate third-party drivers for the different instruments and run them in parallel.

 

Mouse Behaviour in Virtual Reality

We used the Unity game engine to create the VR environments and LabVIEW to control movement through them. To read the speed of the animal running on the treadmill, we repurposed the optical sensor from a gaming mouse and monitored it in LabVIEW using NI-VISA. We used this signal to update the animal’s position in the VR by sending the data from LabVIEW to Unity through a UDP port.
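
In our setup this loop runs in LabVIEW; reduced to a minimal Python sketch, the pattern looks like the following. The port number, loop rate, and single-float message format are assumptions for illustration; only the LabVIEW-to-Unity UDP link itself is part of our design.

```python
import socket
import struct
import time

def read_treadmill_speed():
    """Stub for the optical-sensor readout (in the real system, a
    repurposed gaming-mouse sensor read through NI-VISA)."""
    return 0.1   # placeholder speed

# Unity listens for position updates on a UDP port.
UNITY_ADDR = ("127.0.0.1", 5005)
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

position = 0.0
for _ in range(1000):                 # acquisition loop, ~100 Hz
    speed = read_treadmill_speed()
    position += speed * 0.01          # integrate speed over the 10 ms period
    sock.sendto(struct.pack("<f", position), UNITY_ADDR)
    time.sleep(0.01)
```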

 

As the animal navigated through the environment, we trained it to discriminate between different visual stimuli. The mouse indicated its choice by licking a spout, which we detected with a piezo disc sensor. If the mouse made a correct choice, a milk reward was triggered and delivered through the same spout. We managed lick detection, reward triggering, and milk valve control with LabVIEW and NI DAQ hardware.
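
In LabVIEW this logic runs alongside the VR control; a minimal Python sketch of the same idea, using the nidaqmx API, is shown below. The channel names, voltage threshold, and task-logic stub are hypothetical illustrations, not our actual configuration.

```python
import nidaqmx

def choice_is_correct():
    """Stub for the task logic that scores the stimulus/choice pairing."""
    return True

LICK_THRESHOLD_V = 0.5   # hypothetical piezo threshold

with nidaqmx.Task() as lick_task, nidaqmx.Task() as valve_task:
    lick_task.ai_channels.add_ai_voltage_chan("Dev1/ai0")    # piezo disc sensor
    valve_task.do_channels.add_do_chan("Dev1/port0/line0")   # milk valve

    voltage = lick_task.read()         # one sample from the piezo
    if voltage > LICK_THRESHOLD_V and choice_is_correct():
        valve_task.write(True)         # open the valve: deliver milk
```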

 

Figure 5 shows the user interface for controlling the VR. We used this interface to teach mice behavioural tasks: the user could adjust task parameters online, such as the length of the virtual corridors and the locations at which mice received rewards, depending on the animal’s performance. We could also easily extend the VR interface with different controls for different versions of the behavioural task. Using another interface (Figure 6), the user could select which analogue signals and USB cameras to record from and could monitor the incoming behavioural data.

 

 

System Benefits

We used LabVIEW and NI DAQ hardware to control and acquire data from a high-speed two-photon microscope combined with a VR environment. The biggest advantage of this system compared to other approaches lies in its flexibility; equipment and drivers from various manufacturers can be easily integrated to add functionality and optimize performance. Similarly, we can easily modify the software to accommodate our experimental procedures. With this setup, we can now study, with high temporal resolution, the activity of large populations of neurons in animals engaged in a variety of behaviours (Figure 7). We hope that this approach not only increases our understanding of how the brain selectively processes objects important for behaviour, but also aids future work on brain disorders that involve deficiencies in the filtering of sensory inputs.

 

 

Author Information:

Jasper Poort
Department of Neuroscience, Physiology & Pharmacology, University College London
j.poort@ucl.ac.uk

Figure 1. Two-photon imaging microscope combined with virtual reality environment.
Figure 2. Custom-built two-photon imaging microscope.
Figure 3. Overview of experimental setup combining two-photon imaging with virtual reality monitoring. We used three computers for experimental control and monitoring (coloured arrows indicate the flow of data and commands to/from each computer).
Figure 4. The two-photon microscope user interface displays the current image of neural activity (left) and a previously recorded image from the same cortical location (right). The program also allows the user to compare images in a pop-up window, by overlaying the two (red shows current activity, green shows previous activity, yellow indicates overlap).
Figure 5. Virtual reality user interface.
Figure 6. Behaviour data acquisition interface. Displays selected incoming data (mouse licking, output to reward valve, eye camera, digital inputs).
Figure 7. Example recordings in primary visual cortex. Left, recordings from same cortical site on different days. Right, activity traces of four example cells that were imaged while the mouse was running through the virtual reality environment.