Creating a Multitouch Interface Library for LabVIEW


"This thousand-fold increase in imaging speed enabled us to interact with the instrument and our samples in real time, rather than the time lapse format of normal atomic force microscopy (AFM) imaging."

- David Carberry, University of Bristol

The Challenge:
Implementing multitouch control within any NI LabVIEW VI.

The Solution:
Using NI LabVIEW software to create a multitouch user interface with more intuitive and responsive methods for laboratory equipment control.

Author(s):
Loren Picco - University of Bristol
David Carberry - University of Bristol

Multitouch is an input method using two or more fingers on a touchscreen at the same time. The most popular example of multitouch is the iPhone, where, for instance, the operator can use two fingers to manipulate digital photos with stretching, squeezing, and rotation gestures.

We believe that the versatility of this new interface and the opportunities it presents, such as gesture recognition and multiuser participation, make it attractive for a wide range of applications. Recently, companies such as Dell, Asus, HP, Microsoft, Sony, Google, Apple, LG, and Palm have produced commercial products featuring multitouch interfaces. As a result, the list of commercially available multitouch displays is growing rapidly, and we believe any of them could be used with a LabVIEW VI running a multitouch library.

Implementing in LabVIEW

Interacting with and controlling the front panel of a VI through multiple simultaneous interactions is a natural progression that could significantly improve the “user friendliness” of LabVIEW code. It could also serve as an intermediate step when prototyping devices that will ultimately require an equivalent real-world interface, or even replace many hardware interfaces with software equivalents.

As part of our instrument development process, we developed a LabVIEW library of XControls that allows any multitouch interface, either commercial or homemade, to map its detected interactions to the front panel controls of a LabVIEW VI. The code works across multiple versions of both LabVIEW and Windows because it is not reliant on the Windows 7 multitouch API. We can use the library to interact with the Windows environment by mapping gestures to various functions. We can also control VI front panel objects via multitouch XControl versions of all common components such as tabs, images, sliders, dials, and graphs. In addition to obvious interactions such as pushing buttons and moving sliders, the user can define unique gestures that the library recognises and interprets to perform more complex user-specific scripts. This places an unprecedented level of control in the hands of the user and makes the interaction feel far more direct and natural.
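The library itself is built from LabVIEW XControls, which are graphical rather than textual, so the sketch below is only a rough Python illustration of the two basic operations described above: hit-testing each reported touch point against the bounding boxes of front panel controls, and recognising a simple two-finger pinch from the change in finger separation. All of the names (Control, TouchPoint, hit_test, detect_pinch) are hypothetical and are not part of the library.

# Rough illustration only; these classes and functions are hypothetical and are
# not part of the authors' LabVIEW XControl library.
from dataclasses import dataclass
from math import hypot

@dataclass
class Control:
    name: str
    x: float       # top-left corner of the control's bounding box on the front panel
    y: float
    width: float
    height: float

    def contains(self, px: float, py: float) -> bool:
        """True if the touch point (px, py) lies inside this control."""
        return (self.x <= px <= self.x + self.width
                and self.y <= py <= self.y + self.height)

@dataclass
class TouchPoint:
    touch_id: int  # stable identifier assigned to each finger by the touch hardware
    x: float
    y: float

def hit_test(controls: list[Control], touches: list[TouchPoint]) -> dict[int, Control]:
    """Map each active touch ID to the first control it currently lies over, if any."""
    mapping: dict[int, Control] = {}
    for t in touches:
        for c in controls:
            if c.contains(t.x, t.y):
                mapping[t.touch_id] = c
                break
    return mapping

def detect_pinch(prev: list[TouchPoint], curr: list[TouchPoint],
                 threshold: float = 5.0) -> str | None:
    """Classify a two-finger gesture as 'zoom_in' or 'zoom_out' from the change in
    finger separation between two frames (assumes the same two fingers in both)."""
    if len(prev) != 2 or len(curr) != 2:
        return None
    d_prev = hypot(prev[0].x - prev[1].x, prev[0].y - prev[1].y)
    d_curr = hypot(curr[0].x - curr[1].x, curr[0].y - curr[1].y)
    if d_curr - d_prev > threshold:
        return "zoom_in"
    if d_prev - d_curr > threshold:
        return "zoom_out"
    return None

In the same spirit, a user-defined gesture is simply a recognisable pattern of touch trajectories mapped to a script, with the hit-testing and classification run once per display update.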

Examples and Applications

The nanophysics research group in the H. H. Wills Physics Laboratory at the University of Bristol undertakes world-leading research into holographic optical tweezing and video-rate scanning probe microscopy, two disparate techniques relying on very different physical principles. The instruments used in this research, the holographic optical tweezers (HOT) and the high-speed atomic force microscope (HS AFM), interface to multitouch displays using LabVIEW. In both cases, our multitouch interface technology and software had a significant and positive impact on each instrument’s ease of use and versatility. We believe that the combination of LabVIEW and a multitouch interface unlocks the full potential of these instruments by offering users more intuitive and responsive control methods. The libraries we developed could enable any other system controlled by LabVIEW to be made multitouch compatible with minimal coding changes.

Integration with Holographic Optical Tweezers

Optical tweezing uses a highly focused laser beam to provide an attractive or repulsive force to physically hold or move microscopic objects. This is a common technique in biological research where cells and bacteria can be held in these optical traps and studied in detail. Our HOT system can create dozens of independent optical traps simultaneously. Controlling the locations of these traps has often been a rate-limiting obstacle. We used LabVIEW and a custom-made 100 x 80 cm multitouch display to overcome this limitation. Several users can now operate the HOT system simultaneously to create, destroy, and translate dozens of optical traps in parallel, an impossible task for a conventional mouse-driven interface. This has greatly simplified the optical trapping of more complex objects and allows multiple users to interact with the sample simultaneously.
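Conceptually, the interface keeps one trap per active finger, using the touch ID reported by the display to create, translate, and remove traps as fingers arrive, move, and lift. The Python sketch below illustrates that bookkeeping only; it reuses the hypothetical TouchPoint structure from the earlier sketch, and the assumption that lifting a finger destroys its trap is ours for illustration, not a description of the Bristol HOT software.

# Illustrative sketch only: one optical trap per active finger, keyed by the
# touch ID reported by the multitouch display.
class TrapManager:
    """Create, move, and destroy traps in step with the active touch points."""

    def __init__(self):
        # touch_id -> (x, y) trap position in screen coordinates
        self.traps = {}

    def update(self, touches):
        """touches: the TouchPoint list for the current frame (objects with
        touch_id, x, and y attributes)."""
        active_ids = set()
        for t in touches:
            active_ids.add(t.touch_id)
            # A new finger creates a trap; a finger already being tracked translates its trap.
            self.traps[t.touch_id] = (t.x, t.y)
        # Assumption for this sketch only: lifting a finger destroys its trap.
        for touch_id in list(self.traps):
            if touch_id not in active_ids:
                del self.traps[touch_id]
        return self.traps

Because the bookkeeping is keyed purely on touch IDs, it makes no difference whether the simultaneous fingers belong to one operator or several, which is what allows multiple users to work on the same sample at once.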

Integration with the HS AFM

AFM is a technique for studying materials on the atomic scale. It functions by raster scanning a sharp probe across the surface of a material, taking measurements of the sample topography and, potentially, material properties such as the localised electrostatic, magnetic, and chemical bonding forces. The serial nature of this data collection means that acquiring a single image typically takes minutes.

The HS AFM developed by our research group collects images at video rate and beyond. This thousand-fold increase in imaging speed enabled us to interact with the instrument and our samples in real time, rather than the time-lapse format of normal AFM imaging. The functionality of the HS AFM was initially limited because controls were updated one scan parameter at a time, in the same manner as a standard AFM. To take advantage of the capabilities and responsiveness of the HS AFM, we needed a user interface that allowed the operator to adjust multiple controls simultaneously. The operator’s touch movements are communicated to the HS AFM via NI data acquisition cards. The result is much closer to the way a conventional optical microscope, or even a scanning electron microscope, is operated, enabling even non-specialist operators to control the instrument effectively.
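The essential difference from a mouse-driven interface is that every control being touched can be serviced within the same update cycle. The sketch below illustrates this idea in Python only; the control names, channel labels, value ranges, and the write_analog_output placeholder are invented for illustration and do not describe the actual HS AFM control code, which is implemented in LabVIEW with NI data acquisition hardware.

# Hypothetical mapping from on-screen controls to output channels and physical ranges.
CONTROL_MAP = {
    "scan_size": {"channel": "ao0", "min": 0.0, "max": 10.0},
    "setpoint":  {"channel": "ao1", "min": 0.0, "max": 5.0},
    "scan_rate": {"channel": "ao2", "min": 0.0, "max": 10.0},
}

def write_analog_output(channel: str, value: float) -> None:
    """Placeholder for the hardware write; prints instead of driving a DAQ card."""
    print(f"{channel} <- {value:.3f}")

def apply_updates(control_values: dict[str, float]) -> None:
    """control_values maps each control currently being touched to a normalised
    value in [0, 1]. Every entry is written in the same cycle, so the operator
    can adjust several scan parameters simultaneously."""
    for name, fraction in control_values.items():
        cfg = CONTROL_MAP[name]
        value = cfg["min"] + fraction * (cfg["max"] - cfg["min"])
        write_analog_output(cfg["channel"], value)

# Example: two fingers on two different sliders in a single update cycle.
apply_updates({"scan_size": 0.25, "setpoint": 0.8})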

It is now extremely simple to create multitouch instruments with LabVIEW. By replacing each of the standard controls with multitouch XControls and adding subVIs to obtain multitouch coordinates from any multitouch-capable interface, you can make any front panel multitouch operable. The XControls also retain their keyboard and mouse functionality, so users can compare multitouch technology against traditional interfaces.

Author Information:
Loren Picco
University of Bristol
United Kingdom
loren.picco@bristol.ac.uk
