Using LabVIEW to Develop a 3D Display System


"We developed a gesture recognition system using a USB camera, PXI hardware, LabVIEW, and the NI Vision Assistant."

- Yang Hao, Tsinghua University

The Challenge:
Developing an interactive 360° stereoscopic 3D display system that shows both virtual models and real objects, does not require glasses, and is affordable.

The Solution:
Developing a 3D display system that consists of three parts: acquisition, processing, and 3D display.

Author(s):
Yang Hao - Tsinghua University
Xu Mohan - Tsinghua University
Gao Yongfeng - Tsinghua University
Zhang Qian - Tsinghua University

Figure 1. 3D Display System

A 3D film is a motion picture that enhances the illusion of depth perception, but in the cinema we must watch it through glasses and from a limited range of viewing angles. Most stereoscopic display systems and 3D TVs are expensive and offer no interactive functions. Building an interactive 360° stereoscopic display system that requires no glasses, is affordable, and can display 3D images of both virtual models and real objects is therefore a significant challenge.

Our system offers three acquisition capabilities: building virtual models, non-real-time acquisition of real objects with an NI 1764 Smart Camera and a turntable, and real-time image acquisition of real objects with four USB cameras.

Virtual Model 

To display a virtual model, we use NI LabVIEW software to read the 3D model file and set its parameters, then combine four images of the model taken from different directions into one image, as shown in Figure 2.2. The combined image is projected onto an inverted pyramid optical structure, as shown in Figure 2.1 and in the schematic in Figure 2.3.
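The following Python/OpenCV sketch illustrates the idea behind this composition step; it is not the authors' LabVIEW code, and the file names, frame size, and view rotations are assumptions. Four renders of the model, taken 90° apart, are tiled around the centre of a single black frame so that each one reflects off one face of the inverted pyramid.

import cv2
import numpy as np

def compose_pyramid_frame(front, right, back, left, size=1080):
    """Arrange four square views into one frame for the inverted pyramid."""
    view = size // 3                    # each view occupies one third of the frame
    views = [cv2.resize(img, (view, view)) for img in (front, right, back, left)]
    frame = np.zeros((size, size, 3), dtype=np.uint8)

    # Top edge: front view, flipped so its reflection on the facing facet is upright.
    frame[0:view, view:2*view] = cv2.rotate(views[0], cv2.ROTATE_180)
    # Right edge: right view, rotated toward the right facet.
    frame[view:2*view, 2*view:3*view] = cv2.rotate(views[1], cv2.ROTATE_90_COUNTERCLOCKWISE)
    # Bottom edge: back view, left as-is.
    frame[2*view:3*view, view:2*view] = views[2]
    # Left edge: left view, rotated toward the left facet.
    frame[view:2*view, 0:view] = cv2.rotate(views[3], cv2.ROTATE_90_CLOCKWISE)
    return frame

# Placeholder file names for the four renders of the virtual model.
renders = [cv2.imread("model_%s.png" % d) for d in ("front", "right", "back", "left")]
cv2.imwrite("pyramid_frame.png", compose_pyramid_frame(*renders))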

Figure 2. Imaging Principle

Figure 3. Display of Virtual Model

Figure 4. Program for the 3D Display of the Virtual Model

Gesture Recognition

We developed a gesture recognition system using a USB camera, PXI hardware, LabVIEW, and the NI Vision Assistant. We capture image sequences of a hand and determine its movement, then use this information to control the 3D display. For example, sliding one hand rotates the displayed icon, and moving two hands apart or together zooms it, as shown in Figure 6.
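As a rough illustration of this logic (in Python/OpenCV rather than LabVIEW and the Vision Assistant), the sketch below segments the hand or hands in each frame, tracks them between frames, and maps one-hand motion to rotation and the change in two-hand separation to zoom. The skin-colour range, area threshold, and gains are illustrative assumptions.

import cv2
import numpy as np

SKIN_LOW, SKIN_HIGH = (0, 48, 80), (20, 255, 255)   # rough HSV skin range (assumption)

def find_hands(frame, min_area=3000):
    """Return the centroids of skin-coloured blobs large enough to be hands."""
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, SKIN_LOW, SKIN_HIGH)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    centroids = []
    for c in contours:
        if cv2.contourArea(c) >= min_area:
            m = cv2.moments(c)
            centroids.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))
    return centroids

cap = cv2.VideoCapture(0)                  # the USB camera watching the hands
prev, angle, zoom = None, 0.0, 1.0
while True:
    ok, frame = cap.read()
    if not ok:
        break
    hands = find_hands(frame)
    if prev is not None and len(hands) == len(prev):
        if len(hands) == 1:                # one hand sliding -> rotate the display
            angle += (hands[0][0] - prev[0][0]) * 0.2
        elif len(hands) == 2:              # two hands apart/together -> zoom the display
            dist = np.hypot(hands[0][0] - hands[1][0], hands[0][1] - hands[1][1])
            prev_dist = np.hypot(prev[0][0] - prev[1][0], prev[0][1] - prev[1][1])
            zoom *= dist / max(prev_dist, 1e-6)
    prev = hands
    # angle and zoom would be passed to the 3D display loop here.
    cv2.imshow("camera", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()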

Figure 5. Gesture Recognition 

Figure 6. Finding the Object in the Image Using the Vision Assistant

Figure 7. Program for Gesture Recognition 

Non-Real-Time Acquisition

The non-real-time acquisition system uses an NI 1764 Smart Camera and a turntable controlled by PXI. By placing an object on the turntable and capturing images while it rotates, we obtain views of the object from all directions and can then choose the four images we want to use to show the 3D image.
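A minimal sketch of this acquisition loop, with a generic camera grab and a hypothetical rotate_turntable() standing in for the NI 1764 Smart Camera and the PXI-controlled turntable, might look like this:

import cv2

STEP_DEG = 5                                   # turntable step size (assumption)

def rotate_turntable(angle_deg):
    """Placeholder for the PXI motion command that turns the table to angle_deg."""
    pass

def acquire_all_views(camera_index=0):
    """Step the turntable through a full revolution, grabbing one image per step."""
    cap = cv2.VideoCapture(camera_index)
    views = {}
    for angle in range(0, 360, STEP_DEG):
        rotate_turntable(angle)                # move the object to the next angle
        ok, frame = cap.read()                 # capture one image at this angle
        if ok:
            views[angle] = frame
    cap.release()
    return views

views = acquire_all_views()
# Choose four views 90 degrees apart for the pyramid display.
selected = [views[a] for a in (0, 90, 180, 270) if a in views]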

Figure 8. Hardware for the Non-Real-Time Acquisition System

Figure 9. Image Acquisition

Figure 10. 3D Display of Real Object

Figure 11. Program for Non-Real-Time Acquisition System

 

Real-Time Acquisition

We place four USB cameras around the object to acquire real-time images from four different directions using PXI hardware and NI image acquisition software. We then process these images and display the 3D image through the optical structure.
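A minimal sketch of this real-time loop, assuming the four cameras enumerate as indices 0 through 3, grabs one frame from each camera per cycle and tiles them for preview; in the actual system the four views would be laid out for the pyramid as in the virtual-model sketch above.

import cv2
import numpy as np

cams = [cv2.VideoCapture(i) for i in range(4)]     # cameras placed at 90-degree intervals
try:
    while True:
        frames = []
        for cam in cams:
            ok, frame = cam.read()
            if not ok:
                raise RuntimeError("a camera failed to deliver a frame")
            frames.append(cv2.resize(frame, (320, 240)))
        top = np.hstack(frames[:2])                # front | right
        bottom = np.hstack(frames[2:])             # back  | left
        cv2.imshow("four views", np.vstack([top, bottom]))
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
finally:
    for cam in cams:
        cam.release()
    cv2.destroyAllWindows()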

Figure 12. Interface for Real-Time Acquisition System

Figure 13. Hardware for Real-Time Acquisition System

Figure 14. Program for Real-Time Acquisition System

Author Information:
Yang Hao
Tsinghua University
Beijing
China
xu-mh09@mails.tsinghua.edu.cn
