An overview of force and pressure sensing
26 August 2021

The methods used to measure force in a range of applications and how these can be related to pressure sensing. By Jody Muelaner

Measurements of pressure are required in many applications including pneumatic and hydraulic systems, and environmental monitoring. Since pressure is defined as force per unit area, the measurement of pressure is closely related to the measurement of force. Some instruments which measure pressure have a calibrated area over which the pressure acts, and a calibrated force measurement device.

Since accurately measuring the dimensions of an area is relatively easy, the uncertainty of a pressure measurement is typically dominated by the uncertainty of the force measurement. Pressure sensors use the same force measurement methods as scales, from small laboratory balances through to commercial vehicle weighbridges and industrial load cells.
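
As a quick illustration, the sketch below (with hypothetical values, not figures from the article) calculates pressure as force divided by area and combines the two relative uncertainties, showing how the force term dominates.

```python
# Minimal sketch (hypothetical values): pressure inferred from a calibrated
# force sensor acting over a calibrated area, P = F / A, with a simple
# relative-uncertainty combination showing how the force term dominates.
import math

force_N = 452.0            # measured force (example value)
u_force_rel = 0.001        # 0.1 % relative uncertainty in the force measurement
area_m2 = 5.0e-4           # calibrated piston/diaphragm area
u_area_rel = 0.0001        # 0.01 % relative uncertainty in the area

pressure_Pa = force_N / area_m2

# For a product or quotient, relative uncertainties add in quadrature.
u_pressure_rel = math.sqrt(u_force_rel**2 + u_area_rel**2)

print(f"Pressure: {pressure_Pa:.1f} Pa "
      f"± {u_pressure_rel * 100:.3f} % (dominated by the force term)")
```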

Absolute pressure is the actual force per unit area, measured relative to a perfect vacuum, while gauge pressure is the pressure relative to the ambient pressure. Most instruments measure gauge pressure. A fluid cannot have a negative absolute pressure, although its gauge pressure may be negative.
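
A minimal sketch of the distinction, using an assumed ambient pressure of 101,325 Pa:

```python
# Minimal sketch (assumed ambient value): converting between gauge and absolute
# pressure. Gauge pressure is measured relative to ambient, so it can be
# negative, but absolute pressure cannot fall below zero.
ATMOSPHERIC_PA = 101_325.0   # example ambient pressure (assumption)

def absolute_from_gauge(gauge_pa: float, ambient_pa: float = ATMOSPHERIC_PA) -> float:
    absolute = gauge_pa + ambient_pa
    if absolute < 0:
        raise ValueError("A fluid cannot have a negative absolute pressure")
    return absolute

print(absolute_from_gauge(50_000.0))    # positive gauge reading
print(absolute_from_gauge(-20_000.0))   # partial vacuum: negative gauge, positive absolute
```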

MEASURING FORCE

The most common method used to measure force is to elastically deform a calibrated part and then measure the deformation. The deformation may be measured directly, using a dimensional measurement device to determine the displacement. Alternatively, the change in electrical resistance that results from the deformation may be measured, as in a strain gauge load cell.
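
A minimal sketch of the displacement approach, assuming a linear (Hooke's law) element with a hypothetical calibrated stiffness:

```python
# Minimal sketch of the deflection method: a calibrated elastic element obeys
# Hooke's law, F = k * x, so a displacement measurement gives the force.
# The stiffness value below is hypothetical, not from the article.
SPRING_STIFFNESS_N_PER_M = 2.5e4   # k, found by calibration against known forces

def force_from_deflection(deflection_m: float) -> float:
    """Return the applied force implied by a measured elastic deflection."""
    return SPRING_STIFFNESS_N_PER_M * deflection_m

print(force_from_deflection(0.0012))   # 1.2 mm deflection -> 30 N
```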

Analogue Scales: Traditional analogue scales use springs which deform when a load is applied, with the deformation causing an indicator to move along a scale. While a balance must be located on a horizontal surface to provide an accurate measurement of weight, scales which use springs can measure force in any direction.

Strain Gauge Load Cells: Many load cells use strain gauges to measure very small elastic deformations and therefore force. The strain gauges can be configured to measure deformation of the material in compression, tension, bending or torsion. Bending is better suited to small forces, while tension and compression are better suited to large forces. Most strain gauges work by passing a current through a very thin conductor and measuring the change in electrical resistance that occurs when the material is stretched, its cross-sectional area reducing as it extends. This resistance change is normally measured using a circuit called a Wheatstone bridge.
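
The sketch below illustrates the arithmetic for a quarter-bridge configuration; the gauge factor, excitation voltage, modulus and cross-section are all assumed example values, not figures from the article.

```python
# Minimal sketch (assumed quarter-bridge configuration and hypothetical
# geometry): converting a Wheatstone bridge output voltage into strain via the
# gauge factor, then into force via the elastic modulus and cross-section.
GAUGE_FACTOR = 2.0          # typical for metal foil gauges
EXCITATION_V = 5.0          # bridge excitation voltage
YOUNGS_MODULUS_PA = 200e9   # steel load-cell element (assumption)
CROSS_SECTION_M2 = 1.0e-4   # loaded cross-sectional area (assumption)

def force_from_bridge_output(v_out: float) -> float:
    """Quarter-bridge approximation: V_out / V_ex = GF * strain / 4."""
    strain = 4.0 * v_out / (EXCITATION_V * GAUGE_FACTOR)
    stress_pa = YOUNGS_MODULUS_PA * strain        # Hooke's law in the material
    return stress_pa * CROSS_SECTION_M2           # force = stress * area

print(f"{force_from_bridge_output(2.5e-3):.0f} N")  # 2.5 mV bridge output
```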

Piezoelectric Load Cells: These measure the dynamic effect of a change in force, but amplifiers can be used to effectively measure static forces as well. They are useful where cyclic loading may affect strain gauges, or where a very wide range of different forces must be measured with a consistent signal-to-noise ratio.
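
A minimal sketch of the signal chain, with hypothetical charge sensitivity and amplifier gain:

```python
# Minimal sketch (hypothetical values): a piezoelectric load cell produces
# charge proportional to force, and an amplifier converts it to a voltage,
# so force is recovered by dividing out the two sensitivities.
CHARGE_SENSITIVITY_PC_PER_N = 4.0    # pC/N (assumed example value)
AMPLIFIER_GAIN_MV_PER_PC = 10.0      # amplifier setting (assumed example value)

def force_from_amplifier_voltage(v_out_mv: float) -> float:
    charge_pc = v_out_mv / AMPLIFIER_GAIN_MV_PER_PC
    return charge_pc / CHARGE_SENSITIVITY_PC_PER_N

print(f"{force_from_amplifier_voltage(200.0):.1f} N")   # 200 mV -> 5.0 N
```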

Optical Strain Gauges: A section of an optical fibre can have a pattern of gratings etched into it. This creates a sensor, since the spacing between the gratings changes as the fibre is stretched, shifting the wavelength of the light that is reflected back along the optical fibre. These sensors are immune to electromagnetic interference and are also much less affected by temperature than electrical strain gauges. They don't require electricity, which can be important in explosive environments. Optical strain gauges also allow hundreds of sensors along a single fibre, which is very useful for structural monitoring.
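
A minimal sketch of the standard Bragg-grating strain relation, using a typical photo-elastic coefficient and an assumed nominal wavelength:

```python
# Minimal sketch (assumed constants): the Bragg wavelength shifts in proportion
# to strain, delta_lambda / lambda_B = (1 - p_e) * strain, where p_e is the
# photo-elastic coefficient of the fibre. Temperature effects are ignored here.
BRAGG_WAVELENGTH_NM = 1550.0   # nominal reflected wavelength (assumption)
PHOTOELASTIC_COEFF = 0.22      # typical value for silica fibre

def strain_from_wavelength_shift(delta_lambda_nm: float) -> float:
    """Return axial strain implied by a measured Bragg-wavelength shift."""
    return delta_lambda_nm / (BRAGG_WAVELENGTH_NM * (1.0 - PHOTOELASTIC_COEFF))

# A shift of about 1.2 pm corresponds to roughly 1 microstrain at 1550 nm.
print(f"{strain_from_wavelength_shift(0.0012) * 1e6:.2f} microstrain")
```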

Hydraulic Force Sensing: For very large forces, hydraulic cylinders or diaphragms are often used to receive the force. The force can then be transmitted and scaled down through a hydraulic system before being applied to a load cell. Alternatively, a Bourdon gauge may be used: a coiled metal tube which uncoils as the pressure of the fluid inside it increases.
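
A minimal sketch of the area-ratio scaling, with hypothetical piston areas:

```python
# Minimal sketch (hypothetical geometry): a hydraulic system scales a large
# force down in the ratio of the piston areas before it reaches a small load
# cell, since the fluid pressure is the same at both pistons.
LARGE_PISTON_AREA_M2 = 0.05     # receives the measured force (assumption)
SMALL_PISTON_AREA_M2 = 0.0005   # acts on the load cell (assumption)

def load_cell_force(applied_force_n: float) -> float:
    pressure_pa = applied_force_n / LARGE_PISTON_AREA_M2
    return pressure_pa * SMALL_PISTON_AREA_M2   # scaled down by the area ratio

print(f"{load_cell_force(500_000.0):.0f} N")   # 500 kN scaled to 5 kN
```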

Capacitive Load Cells: These work by pushing the plates of a capacitor together, changing the capacitance. They can measure forces with smaller displacements, making them more robust than strain gauge load cells, and they produce signals with lower noise.
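
A minimal sketch using the parallel-plate approximation, with hypothetical plate area, gap and stiffness:

```python
# Minimal sketch (parallel-plate approximation, hypothetical dimensions): a
# force compresses the gap between capacitor plates, and the resulting change
# in capacitance, C = epsilon_0 * A / d, indicates the force.
EPSILON_0 = 8.854e-12       # permittivity of free space, F/m
PLATE_AREA_M2 = 4.0e-4      # plate area (assumption)
NOMINAL_GAP_M = 50e-6       # unloaded plate separation (assumption)
STIFFNESS_N_PER_M = 2.0e6   # elastic stiffness resisting plate motion (assumption)

def capacitance_under_load(force_n: float) -> float:
    gap = NOMINAL_GAP_M - force_n / STIFFNESS_N_PER_M   # plates pushed together
    return EPSILON_0 * PLATE_AREA_M2 / gap

c0 = capacitance_under_load(0.0)
c_loaded = capacitance_under_load(20.0)
print(f"Capacitance change: {(c_loaded - c0) * 1e12:.2f} pF for a 20 N load")
```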

FORCE CALIBRATION

Calibration is a process of comparing measurements with a reference of known accuracy. The uncertainty of the calibration includes both the uncertainty of the reference and uncertainties in the comparison process. Traceable measurement instruments have an unbroken chain of calibrations relating them to primary standards. In the UK, standards at the highest level are maintained by the National Physical Laboratory (NPL).

Traceable measurement ensures that test results and specifications obtained in different places and at different times can be compared with each other. It is only through a system of traceability that we can have a global supply chain that supplies interchangeable components. Another example of the importance of traceable measurements is in tracking climate change. Measurements show that both CO2 levels and temperatures have been rising steadily since the beginning of the industrial revolution. Without rigorous systems of traceability it would be impossible to know whether the temperature has actually been increasing or whether our temperature measurements have simply been drifting over this time.

Force is not one of the base units in the SI system. The kilogram, second and metre are the base units which enable force to be defined according to Newton’s laws of motion. A force measurement device can be calibrated using the fact that force is equal to mass times acceleration. At a given location, the acceleration due to gravity can be measured by simply dropping an object and accurately measuring its acceleration using a laser interferometer and atomic clock. These measurements of time and distance have direct traceability to physical constants – essentially tracing the measurements back to unchanging laws of physics. Once the acceleration due to gravity is known, a calibrated mass can be used to calibrate the force measurement device.
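
A minimal sketch of this calibration arithmetic, with hypothetical values for the local gravity and the reading of the sensor under test:

```python
# Minimal sketch (assumed values): once the local acceleration due to gravity
# is known, a calibrated mass generates a known force, F = m * g, which can be
# used as the reference when calibrating a force measurement device.
LOCAL_G_M_PER_S2 = 9.81163    # measured locally with a gravimeter (hypothetical)
CALIBRATED_MASS_KG = 10.0     # traceably calibrated mass

reference_force_n = CALIBRATED_MASS_KG * LOCAL_G_M_PER_S2
indicated_force_n = 98.02     # reading of the sensor under test (hypothetical)

error_n = indicated_force_n - reference_force_n
print(f"Reference force: {reference_force_n:.3f} N, sensor error: {error_n:+.3f} N")
```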

“A gravimeter can be used to measure acceleration due to gravity. This works by dropping an object and measuring its acceleration using an interferometer. There are gravity stations with calibrated values for g which can then be used for other calibrations,” says Ben Hughes, principal research scientist, National Physical Laboratory.

Since 2019, the kilogram has been defined in terms of physical constants and realised using a Kibble balance. This means that mass can now be calibrated without a direct chain of calibrations linking it back to any one physical artefact. The Kibble balance works by determining force from electromagnetic parameters. It can therefore be used to give traceable force measurements without using a calibrated mass. Currently, Kibble balances are large and expensive instruments, constructed in national measurement institutes. However, compact standardised devices are now being created for calibration labs. There is also work being done to create solid state devices, similar to accelerometers, which could put the direct traceability of a Kibble balance on a chip. These devices are likely to be the future of accurate force and pressure measurement.
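
A minimal sketch of the underlying relation, with illustrative numbers only: in weighing mode the gravitational force is balanced electromagnetically, m*g = (BL)*I; in velocity mode the same coil gives U = (BL)*v; dividing the two eliminates the coil's geometric factor BL.

```python
# Minimal sketch of the Kibble-balance principle (illustrative numbers only):
# weighing mode gives m * g = (B*L) * I, velocity mode gives U = (B*L) * v,
# so the force is determined purely from electrical and kinematic measurements
# without reference to a physical mass artefact.
current_a = 0.01           # current needed to balance the load (assumption)
induced_voltage_v = 0.5    # voltage from the moving coil in velocity mode (assumption)
coil_velocity_m_s = 0.002  # measured coil velocity (assumption)

# Force = U * I / v, since the B*L factor cancels between the two modes.
force_n = induced_voltage_v * current_a / coil_velocity_m_s
print(f"Electromagnetically determined force: {force_n:.2f} N")
```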

Hughes adds: “In the past we’ve used a similar approach to the Kibble balance to measure micronewtons of thrust produced by ion thrusters. The device we created placed the thruster on a pendulum. When a thrust caused the pendulum to move, a feedback loop restored it to its starting position using the force produced by an electromagnetic coil. We were then able to determine the force by weighing the magnet in the field generated by the coil. This is difficult to do at such low force levels. If we were doing this today, we would use the principles of the Kibble balance to self-calibrate the coil parameters.”
