Pressure Measurement and Metrology
Introduction
Pressure measurement is widespread in industrial processes and spans several sectors, such as the steel industry, metallurgy, automotive, meteorology, and aviation, among others.
The reliability of these measurements affects trade, quality, health, and safety. An erroneous measurement can cause anything from quality problems in the industrial process to fatal accidents. Companies therefore look for robust and reliable ways to monitor their processes. For this monitoring to be effective, it is not enough to install devices and pressure gauges throughout the plant; it is also necessary to make sure that the values indicated by the instruments are correct. That is where metrology comes in.
According to the International Vocabulary of Fundamental and General Terms in Metrology (VIM), metrology is “the science of measurement”. It covers all aspects that influence a measurement process. Metrology has gained importance with the globalization of production, since measurements are present in every decision-making process across the industrial, commercial, health, and environmental sectors.
Calibrating a pressure gauge is a complex activity: several requirements imposed by the relevant standards must be met. Calibration must be performed by laboratories accredited by Inmetro or by laboratories whose measurements are traceable to national standards.
Definitions
Pressure: Pressure is always measured against a reference value. Depending on the reference used, the pressure measurement modes are: absolute pressure, gauge pressure, vacuum, and differential pressure.
Absolute Pressure: The absolute pressure (pabs) is the pressure measured above “absolute zero” pressure, i.e., a perfect vacuum.
Gauge Pressure (Relative or Positive): Gauge pressure is a special case of differential pressure measurement in which the measured absolute pressure is higher than the local atmospheric pressure. It expresses how far the measured pressure lies above the local atmospheric pressure.
pe = pabs − patm
Vacuum (Negative Pressure): Vacuum is a special case of differential pressure measurement in which the measured absolute pressure is lower than the local atmospheric pressure. It expresses how far the measured pressure lies below the local atmospheric pressure.
pe = pabs − patm
Differential Pressure: The difference between two pressures p1 and p2 is called “differential pressure”. In this mode, the reference pressure, p1 or p2, is not the local atmospheric pressure.
p = p1 − p2
Figure 1 – Basic pressure concepts
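As a numerical illustration of the relations above, the sketch below converts a hypothetical reading between the different pressure modes; the values are assumptions for the example, not data from the calibration described later.

```python
# Illustrative values (assumptions for this example, in bar)
p_abs = 1.80   # measured absolute pressure
p_atm = 1.01   # local atmospheric pressure

# Gauge pressure: how far the absolute pressure sits above atmospheric
p_gauge = p_abs - p_atm        # 0.79 bar (positive -> gauge pressure)

# Vacuum: same expression, negative result when the absolute pressure is below atmospheric
p_vacuum = 0.70 - p_atm        # -0.31 bar (negative -> vacuum)

# Differential pressure between two arbitrary pressures p1 and p2
p1, p2 = 5.30, 3.10
p_diff = p1 - p2               # 2.20 bar
```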
Calibration: Calibration is the set of operations that establishes, under specified conditions, the relation between the values indicated by the instrument being calibrated and the corresponding values of the quantity realized by measurement standards.
Maximum permissible error: Extreme value of the measurement error, related to a known reference value, allowed by specifications or regulations for a measurement, measuring instrument or measuring system.
Hysteresis error: Maximum difference between increasing and decreasing indications (at any point on the scale) in a calibration.
Repeatability error: Maximum difference between a consecutive number of indications for the same pressure, under equal operating conditions, in the same direction of pressure application.
Linearity error: Maximum difference between the curve obtained from the average of increasing and decreasing indications in a calibration and the theoretical straight line of the instrument.
Fiducial error: The fiducial error of a pressure gauge is the ratio between the instrument's largest measurement error and its measurement range, expressed as a percentage.
Note: The fiducial error determines the accuracy class of the instrument under calibration.
Accuracy class: Class of measurement instruments or measurement systems that meet established metrological requirements to keep measurement errors or instrumental measurement uncertainties within specified limits under specified operating conditions.
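The error definitions above reduce to simple calculations over the readings collected during a calibration. The sketch below is a minimal illustration; the function names and the assumed data layout (one list of indications per direction and per cycle) are choices made for this example, not prescriptions of NBR 14105-2.

```python
def hysteresis_error(increasing, decreasing):
    """Largest difference between increasing and decreasing indications at the same point."""
    return max(abs(up - down) for up, down in zip(increasing, decreasing))

def repeatability_error(runs):
    """Largest spread between indications of the same pressure in the same direction.
    `runs` is a list of reading lists, one per cycle (e.g. all increasing runs)."""
    return max(max(point) - min(point) for point in zip(*runs))

def fiducial_error_percent(max_error, measurement_range):
    """Largest indication error expressed as a percentage of the measurement range."""
    return 100.0 * max_error / measurement_range
```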
Calibration – example for digital pressure gauge
Calibration consists of comparing the values indicated by the pressure gauge being calibrated (LI) with the values indicated by the standard pressure gauge (LPT) when both are subjected to the same pressure levels (calibration points). In this calibration, the indirect comparison method was used: the pressure was generated by a hydraulic pressure pump, and the values indicated by the pressure gauge under calibration were compared with the values indicated by the standard pressure gauge.
Calibrated digital pressure gauge data:
Scale: 0 to 10 bar
Resolution: 0.01 bar
Accuracy class: A
Maximum permissible error: ± 1.0 % of full-scale value
Standard digital pressure gauge data:
Scale: 0 to 35 bar
Resolution: 0.001 bar
Accuracy class: 5A
Note: The relation between the accuracy of the standard and that of the instrument to be calibrated must be analyzed. According to NBR 14105-2, to calibrate digital pressure gauges the standard must have a resolution at least four times better than that of the pressure gauge being calibrated.
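As a quick check against the data above, sketched below in Python: the standard resolves 0.001 bar and the gauge under calibration 0.01 bar, a 10:1 ratio, which satisfies the 4:1 minimum; the maximum permissible error of ± 1.0 % of full scale corresponds to ± 0.10 bar on the 0 to 10 bar range.

```python
standard_resolution = 0.001   # bar, standard digital pressure gauge
gauge_resolution = 0.01       # bar, gauge being calibrated
full_scale = 10.0             # bar, range of the gauge being calibrated
mpe_percent = 1.0             # maximum permissible error, % of full-scale value

ratio = gauge_resolution / standard_resolution   # 10.0 -> meets the 4:1 minimum
mpe_bar = mpe_percent / 100.0 * full_scale       # 0.10 bar

print(f"resolution ratio {ratio:.0f}:1, maximum permissible error +/- {mpe_bar:.2f} bar")
```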
Calibration points:
According to the NBR 14105-2 standard, a pressure gauge with accuracy class “A” must be calibrated at a minimum of 5 points. When a zero point exists, it must also be checked. The points used in this example are listed in the table below, followed by a sketch of how such points can be laid out.
Point | Value (bar) |
Initial Point | 0.00 |
1st Point | 2.00 |
2nd Point | 4.00 |
3rd Point | 6.00 |
4th Point | 8.00 |
5th Point | 10.00 |
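A minimal sketch of that layout: evenly spaced values over the gauge range, plus the initial (zero) point. The function name is an assumption for this example.

```python
def calibration_points(lower, upper, n_points):
    """Evenly spaced calibration points over the range, including the initial (zero) point."""
    step = (upper - lower) / n_points
    return [round(lower + i * step, 2) for i in range(n_points + 1)]

print(calibration_points(0.0, 10.0, 5))   # [0.0, 2.0, 4.0, 6.0, 8.0, 10.0]
```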
Values found during calibration:
Results obtained in the calibration (instrument indication in the first two columns; working standard indication, in bar, in the remaining columns):

Pressure SI (MPa) | Pressure Instrument (bar) | 1st Cycle Increasing | 1st Cycle Decreasing | 2nd Cycle Increasing | 2nd Cycle Decreasing | Mean
0.000 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00
0.200 | 2.00 | 2.00 | 2.00 | 2.00 | 2.01 | 2.00
0.400 | 4.00 | 4.02 | 4.01 | 4.01 | 4.00 | 4.01
0.600 | 6.00 | 6.03 | 6.03 | 6.04 | 6.03 | 6.03
0.800 | 8.00 | 8.07 | 8.05 | 8.07 | 8.08 | 8.07
1.000 | 10.00 | 10.12 | 10.12 | 10.13 | 10.13 | 10.13
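A minimal sketch of how the figures below can be reproduced from the table above. The data layout and variable names are assumptions for the example; apart from rounding, the indication error at 10 bar and the maximum hysteresis and repeatability match the reported values.

```python
# Working standard readings (bar) per calibration point: [1st inc, 1st dec, 2nd inc, 2nd dec]
readings = {
    0.00:  [0.00, 0.00, 0.00, 0.00],
    2.00:  [2.00, 2.00, 2.00, 2.01],
    4.00:  [4.02, 4.01, 4.01, 4.00],
    6.00:  [6.03, 6.03, 6.04, 6.03],
    8.00:  [8.07, 8.05, 8.07, 8.08],
    10.00: [10.12, 10.12, 10.13, 10.13],
}

for nominal, (inc1, dec1, inc2, dec2) in readings.items():
    mean = (inc1 + dec1 + inc2 + dec2) / 4
    error = nominal - mean                                    # indication error at this point, bar
    hysteresis = max(abs(inc1 - dec1), abs(inc2 - dec2))      # same cycle, opposite directions
    repeatability = max(abs(inc1 - inc2), abs(dec1 - dec2))   # same direction, consecutive cycles
    print(f"{nominal:5.2f} bar  mean={mean:6.3f}  error={error:+.3f}  "
          f"hysteresis={hysteresis:.2f}  repeatability={repeatability:.2f}")
```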
Metrological characteristics obtained, in bar:
Measurement uncertainty: 0.02
Maximum error (fiducial error): -0.13
Repeatability: 0.03
Hysteresis: 0.02
Metrological characteristics relative to the instrument range, in %:
Measurement uncertainty: 0.20
Maximum error (fiducial error): -1.30
Repeatability: 0.30
Hysteresis: 0.20
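The conversion from the maximum error in bar to the fiducial error in percent of range, and the comparison against the maximum permissible error of the accuracy class, can be sketched as follows (variable names are illustrative):

```python
max_error_bar = -0.13    # largest indication error found during the calibration, bar
span = 10.0              # measurement range of the gauge, bar (0 to 10 bar)
mpe_percent = 1.0        # maximum permissible error for the gauge's class: +/- 1.0 % of full scale

fiducial_percent = 100.0 * max_error_bar / span    # -1.3 %
conforms = abs(fiducial_percent) <= mpe_percent    # False: the gauge exceeds its class limit
print(f"fiducial error: {fiducial_percent:.1f} %, conforms: {conforms}")
```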
Analyzing the calibration results, the pressure gauge presents a maximum indication error (fiducial error) above the limit established for its accuracy class. It should either be replaced or sent to technical service for adjustment.
Conclusion
In the example above, the measurement deviation of the pressure gauge was about 30 % above the limit allowed for its accuracy class (a fiducial error of 1.3 % against the permissible 1.0 % of full scale). The impact of such a deviation can be extremely harmful to the quality of the process in which the pressure gauge is used. Furthermore, depending on the application, the safety and integrity of the equipment and of the people involved in the process can be compromised.
This demonstrates the importance of metrology in industrial processes and reinforces that it is not enough to have a monitoring system installed: the results presented by that system must also be reliable.
Article written by Fabiano Schneider Boff, NOVUS Metrology Analyst.
References
NBR 14105-2: Pressure gauges – Part 2: Digital pressure gauges – Requirements for manufacturing, classification, testing, and use
International Vocabulary of Fundamental and General Terms in Metrology (VIM)
DIMEC/GC-09: Transducer/Pressure transmitter calibration
DOQ-CGCRE-14: Guidelines for performing calibration of digital pressure gauges
DOQ-CGCRE-47: Guidelines for submitting a calibration certificate for pressure gauges