Introduction to Universal Testing Machines
A universal testing machine (UTM) is also commonly known as:
- Universal tester
- Materials testing machine
- Materials test frame
- Force testing machine
- Tensile tester
- Compression tester
A UTM is a versatile, multi-purpose instrument with a wide range of applications in QC departments and R&D labs. It is primarily used to test the tensile and compressive strength of materials. A tensile testing machine was previously referred to as a tensometer.
The term “universal” in the nomenclature denotes the ability of the equipment to perform various operations such as standard tensile and compression tests on materials, components, and structures.
There are several types of universal testing machines; some have been tailored and marketed to specific sectors. Such specialization has produced names such as “texture analyser” for food, “top-load compression tester” for packaging and pipes, and “peel tester” for adhesives, tapes, and labels.
In the event of an incorrect measurement, there is a risk that a flawed decision will be taken. The consequences could include massive recalls, product failures, substantial financial loss and even loss of life.
Pros & Cons of Accurate Force Testing
Accurate measurement of force using load cells / force transducers has become progressively more important in designing safer buildings, evaluating the strength of materials, controlling production processes, measuring the thrust of jet engines, rockets, aircraft and gas turbines, weighing aircraft on weighbridges, controlling pressure in rolling mills, and engineering safety products.
Highly precise force transducers are used as references in the calibration of universal testing machines (UTMs), force transducers, load cells, force-proving instruments, etc.
Force transducers are also used in material testing machines, spring test stands, crimp-force test stands, push-pull gauges and other custom-built test rigs.
Force measurement is an essential facet of many industries, but the precision required in testing, often expressed in parts per million (ppm), varies with the application.
Force is a vector quantity, having both magnitude and direction, unlike length and mass, which are scalar quantities. Adapters are therefore important during calibration and testing to reduce interference from parasitic forces.
Force Calibration Requirement
As per Standards ISO 376 and ASTM E74-18
Uniform, valid, precise and internationally compatible measurements and calibrations help eliminate technical barriers and enhance efficiency, quality and safety. It is therefore advisable to adopt national/international calibration standards.
As per ASTM E74-18, Standard Practices for Calibration and Verification for Force-Measuring Instruments: force-measuring instruments used for the verification of force indication systems may be calibrated either by primary or secondary force standards (5.1 of ASTM E74-18).
Primary force standards, used for calibration of Class AA and Class A force-measuring devices, are defined as follows: a deadweight force applied directly, without intervening mechanisms such as levers, hydraulic multipliers, or the like, whose mass has been determined by comparison with reference standards traceable to the International System of Units (SI) of mass (3.1.2 of ASTM E74-18).
Secondary force standards (with a Class AA reference force-measuring device) used for calibration of Class A force-measuring devices: an instrument or mechanism, the calibration of which has been established by comparison with primary force standards (3.1.3 of ASTM E74-18).
Secondary force standards may be either force measuring instruments used in conjunction with a machine or mechanism for applying force, or some form of mechanical or hydraulic mechanism to multiply a relatively small deadweight force.
Examples of the latter form include single- and multiple-lever systems, and systems in which a force acting on a small piston transmits hydraulic pressure to a larger piston. (6.2 of ASTM E74-18)
Force measuring instruments used as secondary force standards shall be calibrated by primary force standards and used only over the class AA verified range of forces. (6.2.1 of ASTM E74-18)
The masses of the weights shall be determined within 0.005% of their values by comparison with reference standards traceable to the International System of Units (SI) for mass. The local value of the acceleration due to gravity, calculated within 0.0001 m/s² (10 milligals), may be obtained from the National Geodetic Information Centre, National Oceanic and Atmospheric Administration. (6.1.2 of ASTM E74-18)
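For illustration, here is a minimal sketch (plain Python, with hypothetical mass and local-gravity values) of how those two tolerances translate into force. It is a simplification, not a full ASTM E74-18 uncertainty budget (air buoyancy, for example, is ignored).

```python
# Minimal deadweight-force sketch with hypothetical values; ignores air buoyancy.
mass_kg = 1000.0        # nominal mass of the deadweight stack (assumed)
g_local = 9.78649       # local acceleration due to gravity, m/s^2 (assumed)

force_n = mass_kg * g_local   # F = m * g

mass_tolerance = 0.005 / 100  # mass known within 0.005 % of its value (6.1.2)
gravity_tolerance = 0.0001    # g known within 0.0001 m/s^2, i.e. 10 milligals (6.1.2)

dF_mass = force_n * mass_tolerance        # worst-case force error from the mass tolerance
dF_gravity = mass_kg * gravity_tolerance  # worst-case force error from the gravity tolerance

print(f"Nominal force:        {force_n:.2f} N")
print(f"Mass contribution:    +/- {dF_mass:.3f} N")
print(f"Gravity contribution: +/- {dF_gravity:.3f} N")
```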
For force-measuring instruments used as secondary force standards, the LLF (lower limit of force) of the instrument shall not exceed 0.05% of the force. The lower force limit of the force-measuring instrument over the Class AA verified range of forces, as expressed by Equation (7), is therefore 2000 times the LLF, in force units, obtained from the calibration data. (8.6.3.1 of ASTM E74-18)
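A minimal sketch of that 8.6.3.1 check follows, with hypothetical figures. Interpreting “0.05% of the force” as 0.05% of the instrument’s maximum calibrated force is our assumption; the helper values are not from the standard.

```python
# Hypothetical figures for illustration only.
llf_n = 2.0             # LLF obtained from the calibration data, in newtons (assumed)
capacity_n = 100_000.0  # maximum calibrated force of the instrument, in newtons (assumed)

# Secondary-standard requirement: LLF shall not exceed 0.05 % of the force
# (taken here as 0.05 % of the maximum calibrated force).
meets_requirement = llf_n <= 0.0005 * capacity_n

# Lower end of the Class AA verified range of forces: 2000 x LLF.
lower_force_limit_n = 2000 * llf_n

print(f"LLF within 0.05 % of the maximum force: {meets_requirement}")
print(f"Class AA verified range: {lower_force_limit_n:.0f} N to {capacity_n:.0f} N")
```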
Calibration & Verification of Static UTM
Therefore, the applied uncertainty for a primary force standard (a deadweight force calibration machine) is 0.005% of the mass value, with the local value of gravity known to within 0.0001 m/s².
For a secondary standard (a secondary force calibration system), the LLF of the force-measuring instrument shall not exceed 0.05% of the force.
As per the standard ISO 376, Calibration of force-proving instruments used for the verification of uniaxial testing machines:
The applied force uncertainty required for calibration of the different classes of force-proving instruments is as follows:
| Class of accuracy | Uncertainty of applied calibration force required (k = 2) |
| --- | --- |
| 00 | ± 0.01% |
| 0.5 | ± 0.02% |
| 1 | ± 0.05% |
| 2 | ± 0.1% |
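As a rough illustration of how this table is applied, the sketch below (plain Python; the required-uncertainty values are taken from the table above, while the helper function and the example machine figure are ours) checks which force-proving-instrument classes a calibration machine with a given applied-force uncertainty could serve.

```python
# Applied-force uncertainty (k = 2) required by ISO 376 per accuracy class, in percent.
ISO376_REQUIRED_UNCERTAINTY = {"00": 0.01, "0.5": 0.02, "1": 0.05, "2": 0.1}

def classes_servable(machine_uncertainty_pct: float) -> list[str]:
    """Return the ISO 376 classes whose required applied-force uncertainty
    is no tighter than what the calibration machine can provide."""
    return [cls for cls, required in ISO376_REQUIRED_UNCERTAINTY.items()
            if machine_uncertainty_pct <= required]

# Example: a machine with 0.03 % applied-force uncertainty (hypothetical figure).
print(classes_servable(0.03))   # -> ['1', '2']
```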
Specification of a uniaxial testing machine (UTM) if used as a force calibration machine for the calibration of force-proving instruments:
As per the standard ISO 7500-1:2018, “Calibration and verification of static uniaxial testing machines”, the maximum permissible values (%) are:

| Class of the machine | Relative error of accuracy, q | Relative error of repeatability, b | Relative error of reversibility*, v | Relative error of zero, f0 | Relative resolution, a | Class of force-proving instrument to be used for calibration as per ISO 376 |
| --- | --- | --- | --- | --- | --- | --- |
| 0.5 | ± 0.5 | 0.5 | ± 0.75 | ± 0.05 | 0.25 | 0.5 |
| 1 | ± 1.0 | 1.0 | ± 1.5 | ± 0.10 | 0.5 | 1 |
| 2 | ± 2.0 | 2.0 | ± 2.0 | ± 0.2 | 1.0 | 2 |
| 3 | ± 3.0 | 3.0 | ± 3.0 | ± 0.3 | 1.5 | 2 |
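To make the “relative error of accuracy” column concrete, here is a minimal sketch using the usual definition of q as the percentage deviation of the machine’s indicated force from the true (force-proving instrument) value. The helper names, readings and the single-criterion classification are ours; a real verification would also check repeatability, reversibility, zero and resolution.

```python
# Maximum permissible relative error of accuracy q (%), per machine class (table above).
Q_LIMITS = {"0.5": 0.5, "1": 1.0, "2": 2.0, "3": 3.0}

def relative_accuracy_error(indicated_n: float, true_n: float) -> float:
    """q = (F_indicated - F_true) / F_true * 100, in percent."""
    return (indicated_n - true_n) / true_n * 100.0

def best_class(readings: list[tuple[float, float]]) -> str | None:
    """Tightest class whose q limit covers every (indicated, true) pair,
    judging on the accuracy criterion alone."""
    worst_q = max(abs(relative_accuracy_error(i, t)) for i, t in readings)
    for cls, limit in Q_LIMITS.items():
        if worst_q <= limit:
            return cls
    return None

# Hypothetical readings: (machine indication, force-proving-instrument value) in newtons.
print(best_class([(10_020.0, 10_000.0), (50_300.0, 50_000.0)]))  # worst q = 0.6 % -> "1"
```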
UTM Testing Procedure
A universal testing machine (UTM) is used to test the mechanical properties (tension, compression, etc.) of a given test specimen by exerting tensile, compressive or transverse stresses. The machine is so named because of the wide range of tests it can perform on various kinds of materials.
These machines operate by hydraulic transmission of load from the test specimen to a separately housed force indicator, where the pressure developed by the hydraulic system is converted into force values. The class and capacity of force transducer used for calibration depend on the range and accuracy of the machine.
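A toy illustration of that pressure-to-force conversion follows; the piston diameter and oil pressure are hypothetical, and a real indicator is calibrated rather than computed this naively.

```python
import math

# Hypothetical hydraulic ram: the indicator converts oil pressure into a force reading.
piston_diameter_m = 0.20                      # assumed ram diameter
piston_area_m2 = math.pi * (piston_diameter_m / 2) ** 2
pressure_pa = 15e6                            # assumed oil pressure, 15 MPa

force_n = pressure_pa * piston_area_m2        # F = p * A
print(f"Indicated force: {force_n / 1000:.1f} kN")
```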
Some universal testing machines use a built-in force transducer with a specified range and accuracy depending on its capacity. If the UTM has a built-in force transducer of fixed capacity, calibration of different force transducers may not be feasible, because its measurement-uncertainty range is limited and cannot cover the minimum and maximum range of the unit under calibration.
Alternatively, an additional reference force transducer of suitable range (minimum and maximum capacity) can be used, with the UTM serving only as a force-generating system and proper mounting adapters placed between the reference and test transducers to avoid parasitic forces during the application of force. In that arrangement it may be feasible to calibrate the force transducer, although only approximately.
As per the standard ISO 376, calibrated force-proving instruments are used for the verification of uniaxial testing machines, which means they are more accurate than the UTMs themselves.
Hence, it may not be feasible to use UTMs for the calibration of force-measuring instruments, as the measurement uncertainty of UTMs is inherently larger. It would be like calibrating a master pressure gauge with an ordinary pressure gauge. This is one of the disadvantages of a universal testing machine.
There are several accuracy classes of universal testing machines (UTMs), as shown in the ISO 7500-1 table above. Even using a class 0.5 UTM, the best measurement uncertainty that can be achieved is not less than 0.25% over the entire range of calibration.
The applied force uncertainty using a class 0.5 force transducer is itself not less than 0.15%. Even if a class 00 force transducer with an applied uncertainty of 0.08% is used for calibration, it is practically impossible to achieve a measurement uncertainty of less than 0.1% to 0.12%.
Even to calibrate a force-proving instrument of the lowest class, class 2, the applied uncertainty required per ISO 376 is 0.1%. Therefore, even with the most accurate class 0.5 UTM, one cannot achieve an applied uncertainty of 0.1% or better for the calibration of a force transducer, as the rough combination below illustrates.
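The sketch below combines the figures quoted above by a simple root-sum-of-squares; this is a simplification of a full GUM-style uncertainty budget, intended only to show that the result stays far above the 0.1% required for ISO 376 class 2.

```python
# Rough root-sum-of-squares combination of the figures quoted above
# (a simplification of a full GUM-style uncertainty budget).
utm_contribution_pct = 0.25          # best case for a class 0.5 UTM (table above)

for transducer_pct in (0.15, 0.08):  # class 0.5 and class 00 reference transducers
    combined = (utm_contribution_pct**2 + transducer_pct**2) ** 0.5
    print(f"reference {transducer_pct:.2f} % -> combined ~ {combined:.2f} %  "
          f"(ISO 376 class 2 needs 0.1 %)")
```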
ASTM E74-18 specifies that, for secondary standards, a Class AA force transducer (0.05%) be used as the reference to calibrate a Class A force transducer.
Hence, using even a class 0.5 UTM for the calibration of a Class 2 force transducer is ruled out under both ISO 376 and ASTM E74-18, let alone for better classes of force transducers.
It may become possible to calibrate force transducers only if a highly accurate, custom-built UTM with a built-in reference force transducer of the best accuracy is developed; even then, a measurement uncertainty better than 0.05% (sufficient only for Class A) or 0.1% (sufficient only for Class 2) would be achievable, not the levels required for the higher classes.