LabVIEW and NI Hardware Automate Gas Leak Testing in the CERN Supercollider’s ATLAS Experiment


"LabVIEW helped us observe the behavior of the PID controller and the time evolution of the parameters."


The Challenge:
Performing automatic gas leak tests at the atmospheric pressure range on 128 gas supply lines and 1,200 resistive plate chamber (RPC) detectors installed in the CERN Large Hadron Collider (LHC) ATLAS experiment.

The Solution:
Implementing a PID controller, which is based on LabVIEW and NI data acquisition hardware, to stabilize gas flow and testing pressure at low levels, making it possible to automatically, directly, and precisely measure gas loss.


About the ATLAS RPC

The ATLAS (A Toroidal LHC Apparatus) experiment is one of the four major collaborative projects at the CERN LHC. It is a very complex detection system dedicated to making new discoveries in proton-proton collision physics. The RPC is part of the muon trigger detectors for the barrel region of the muon spectrometer in the ATLAS experiment. The RPC is a unique ionization detector for particle physics because it features high detection efficiency (up to 98 percent), very high time resolution (1.5-2 ns), good spatial resolution (~1 cm), and proven reliability. These characteristics, combined with the RPC’s industrially supported production and relatively low cost, make it an ideal instrument for a fast-response application like the muon trigger in large detectors and wide cosmic-ray detector arrays.

The ATLAS detector consists of the following four major sub-detectors: an inner detector, which incorporates semiconductor pixel and strip detectors; a set of straw tubes, which serves as the sensitive portion of a transition radiation detector that tracks charged particles; a set of calorimeters that measure the total energy of hadrons and electromagnetic radiation; and an external precision muon spectrometer, which measures muon momentum by observing track deviation in a toroidal magnetic field.

Importance of Accurate Gas Detection in ATLAS

Since the RPC is a gaseous detector, the quality of the gas mixture and the precise configuration of the gas supply system are essential for correct functionality, and a gas leak announced by the detectors demands immediate attention. A gas leak is always accompanied by an equivalent amount of air that enters the detector. This air pollutes the gas mixture (tetrafluoroethane, isobutane, and sulfur hexafluoride) and causes unwanted dispersion of gas in the experimental area. Even though the mixture is neither toxic nor combustible, gas dispersion causes higher operating costs due to the need to produce and distribute more mixture. However, even after all actions have been taken to limit gas leaks, small leaks are unavoidable. Because of this, the only practical preventive action we can take is to directly and precisely measure the gas leak level with a reference gas (in our case, nitrogen) to make sure the leak does not rise above acceptable limits.

Implementing the Testing System

The testing system consists of a PID control process in which the process variable is the chamber pressure read from the incoming gas line and the output variable is the nitrogen flow set point. See Figure 2 for a complete scheme of the leak test measurement system. Figure 3 illustrates the main panel of the implemented VI. The human interface consists of a section (on the left) dedicated to run control: specifications for the sector under test and the duration of the run can be set here. The right side of the instrument consists of a two-page tab control containing PID and field-and-integration controls and indicators, both shown in Figure 3.
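The control loop described above can be sketched in a few lines of Python. This is a minimal, hypothetical illustration of a textbook PID step with the chamber pressure as the process variable and the nitrogen flow as the output; the gains, units, and clamping are assumptions, not the actual VI logic.

```python
# Hypothetical sketch of the leak-test control loop: the process variable is
# the chamber pressure (hPa), the output is the nitrogen flow set point sent
# to the mass flow controller (MFC). All values here are illustrative.

def pid_step(setpoint_hPa, pressure_hPa, state, kp, ki, kd, dt):
    """One PID iteration; returns nitrogen flow in ml/min and updated state."""
    error = setpoint_hPa - pressure_hPa
    integral = state["integral"] + error * dt
    derivative = (error - state["prev_error"]) / dt
    flow_ml_min = kp * error + ki * integral + kd * derivative
    flow_ml_min = max(0.0, flow_ml_min)  # an MFC cannot deliver negative flow
    return flow_ml_min, {"integral": integral, "prev_error": error}
```

Calling `pid_step` once per acquisition cycle with the latest pressure reading mimics the periodic behavior of the LabVIEW loop.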

The PID tab contains all information concerning the control process itself: the user can manage the set point pressure of the process (here, 5 hPa) and the three PID gains. Indicators of the output variable values are also present here: the main value produced by the PID control is in ml/min of nitrogen flow, which is then converted into the physical value to be sent to the DAC channel (a voltage in the 0 to 5 V range) and into the equivalent percent of full scale, useful for quickly understanding the working point of the MFC valve.
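The conversion from the PID output to the DAC voltage and the percent-of-full-scale indicator might look like the following sketch. The 100 ml/min full-scale flow of the MFC is an assumed figure used only to make the example concrete.

```python
# Hypothetical conversion of the PID output (ml/min of nitrogen) into the
# 0-5 V DAC command and the percent-of-full-scale readout.
MFC_FULL_SCALE_ML_MIN = 100.0  # assumed full-scale flow of the MFC
DAC_FULL_SCALE_V = 5.0

def flow_to_dac(flow_ml_min):
    """Return (DAC voltage, percent of full scale) for a requested flow."""
    fraction = min(max(flow_ml_min / MFC_FULL_SCALE_ML_MIN, 0.0), 1.0)
    return fraction * DAC_FULL_SCALE_V, fraction * 100.0
```

Clamping the fraction to [0, 1] keeps the command inside the valid DAC range even if the PID momentarily overshoots.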

The “Field & Integration” tab contains real-time information on the data acquired from the field: the testing pressure (i.e., the process variable) and the actual flow measured by the MFC, the latter also reported in liters per hour, a preferred unit for very low flows. A waveform chart reports the time evolution of these two parameters, and a third area shows the “stability flag,” which indicates when the PID is stable enough for the integration process to proceed. The tab also contains the flow subject to integration, which is basically a subset of the actual MFC flow array, extracted when the “stability flag” is TRUE. Note that the timescale reported on the integration chart is the effective relative time accumulated during the integration: its maximum value is taken as the total integration time and is used for comparison to stop the entire VI if required by the user (see the RUN control, “Run End” switch setting).
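The gating of flow samples by the stability flag can be sketched as follows. The stability criterion shown here (pressure within a tolerance band around the set point) is an assumption for illustration; the article does not specify the exact test used in the VI.

```python
# Sketch of the "stability flag" gating: flow samples are kept for
# integration only while the loop is judged stable. The tolerance-band
# criterion below is an assumed stand-in for the VI's actual test.

def stable(pressure_hPa, setpoint_hPa, tol_hPa=0.1):
    """Assumed stability criterion: pressure close enough to the set point."""
    return abs(pressure_hPa - setpoint_hPa) <= tol_hPa

def gate_samples(pressures, flows, setpoint_hPa, tol_hPa=0.1):
    """Return the subset of MFC flow samples acquired while stable."""
    return [f for p, f in zip(pressures, flows)
            if stable(p, setpoint_hPa, tol_hPa)]
```

The returned subset corresponds to the "flow subject to the integration" array shown on the tab.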

The results of the integration are indicated in the upper-right cluster, in which the total flow (in ml) is calculated together with the integration time and their ratio (giving the average gas loss of the whole testing volume in ml/s). Finally, division by the testing volume (in this case, 110 liters) is performed to obtain the specific gas loss (the mean exchange between gas mixture and air that every liter of gas contained in the detector undergoes). Typical total integration times are half an hour to one hour (in the figure, only a few seconds are reported for simplicity), giving a total testing time of 35-40 to 70-80 minutes for each gas line.
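The integration arithmetic above can be written out explicitly. This sketch assumes gated flow samples in ml/min taken at the 0.5 s acquisition step mentioned later in the article; the function names are illustrative.

```python
# Worked sketch of the integration results: gated flow samples (ml/min) at a
# fixed 0.5 s time step are summed into a total leaked volume, then divided
# by the accumulated time and by the 110 l testing volume.
TIME_STEP_S = 0.5
TESTING_VOLUME_L = 110.0

def integrate_leak(stable_flows_ml_min):
    """Return (total_ml, integration_time_s, avg_loss_ml_s, specific_loss)."""
    total_ml = sum(f / 60.0 * TIME_STEP_S for f in stable_flows_ml_min)
    t_s = len(stable_flows_ml_min) * TIME_STEP_S
    avg_ml_s = total_ml / t_s if t_s else 0.0
    return total_ml, t_s, avg_ml_s, avg_ml_s / TESTING_VOLUME_L
```

For example, ten stable samples at a constant 60 ml/min accumulate 5 ml over 5 s, i.e., an average loss of 1 ml/s.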

Saving Data

Data are filed in two different ways. A new binary stream file is generated for each line under test (one of the 128 gas channels), containing the “field” results (the actual pressure and flow values measured at the MFC) and the “stability flag” as a binary number (0-1), recorded at a time step of 0.5 seconds. These stream files are useful for future reference, to look at the evolution of the test and to compare the behaviors of the various lines.
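A single record of such a stream file could look like the following sketch. The binary layout (two little-endian doubles plus one byte) is an assumption for illustration, not the actual LabVIEW stream-file format.

```python
# Illustrative record layout for one 0.5 s sample: pressure, MFC flow, and
# the 0/1 stability flag. The "<ddB" packing is an assumed layout, not the
# real LabVIEW binary stream format.
import struct

RECORD = struct.Struct("<ddB")  # pressure (hPa), flow (ml/min), flag (0/1)

def pack_record(pressure_hPa, flow_ml_min, stable_flag):
    return RECORD.pack(pressure_hPa, flow_ml_min, 1 if stable_flag else 0)

def unpack_record(buf):
    """Return (pressure_hPa, flow_ml_min, flag) from one packed record."""
    return RECORD.unpack(buf)
```

Fixed-size records make it easy to seek to any time step when replaying a test offline.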

In addition, a single LabVIEW LOG file is updated at the end of each test, containing only the results of the integration, the test date/time, and the name of the sector under test (supplied by the user in the run control part of the VI).
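Appending one result line per test might be sketched as below. The tab-separated layout and field order are assumptions; the real file is a LabVIEW LOG file, not plain text.

```python
# Minimal sketch of appending one test summary to a shared log file:
# date/time, sector name, and the integration results. The tab-separated
# text layout is an assumption, not the LabVIEW LOG format.
from datetime import datetime

def append_log(path, sector, total_ml, t_s, avg_ml_s, specific):
    line = "\t".join([
        datetime.now().isoformat(timespec="seconds"),
        sector,
        f"{total_ml:.3f}",   # total leaked volume (ml)
        f"{t_s:.1f}",        # integration time (s)
        f"{avg_ml_s:.6f}",   # average gas loss (ml/s)
        f"{specific:.8f}",   # specific gas loss (per liter of volume)
    ])
    with open(path, "a", encoding="utf-8") as f:
        f.write(line + "\n")
```

Appending (rather than rewriting) keeps one growing history across all 128 tested lines.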

Analyzing Data

We implemented rapid-viewing VIs to quickly examine the data: a reader for the LOG file to retrieve the final results obtained online by the test; a 3D viewer showing the evolution of the testing pressure and the gas flow; and a converter to the LabVIEW measurement file (binary format) for compatibility with DIAdem software. The first two VIs are shown in Figure 4.

We intend to further develop these VIs to automate the analysis process more deeply: once the stream files are converted into .lvm format, scripts and reports must be prepared in DIAdem to organize and present the data. We also plan to develop a VI that converts the LabVIEW LOG file to a Microsoft Access database using the NI LabVIEW Database Connectivity Toolkit.

Author Information:
R. De Asmundis

