Changsha Panran Technology Co., Ltd.
Linearity of the gauge
Author: SW | Published: 2024-11-27

Linearity is given a narrow interpretation in this Handbook to indicate that gauge response increases in equal increments to equal increments of stimulus, or, if the gauge is biased, that the bias remains constant throughout the course of the measurement process.

A determination of linearity requires Q (Q > 4) reference standards that cover the range of interest in fairly equal increments and J (J > 1) measurements on each reference standard. One measurement is made on each of the reference standards, and the process is repeated J times.
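As a rough sketch of this design, the following simulates J runs over Q reference standards (the values Q = 6, J = 3, the measurement range, and the gauge's true intercept, slope, and noise level are all assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

Q, J = 6, 3                      # Q reference standards, J repeated runs (assumed)
X = np.linspace(1.0, 10.0, Q)    # reference values covering the range in equal increments

# Simulate J runs: one reading per standard per run, from a hypothetical
# gauge with true intercept 0.05, true slope 0.98, and noise sd 0.02.
Y = 0.05 + 0.98 * X + rng.normal(0.0, 0.02, size=(J, Q))

print(Y.shape)   # J rows of Q measurements
```

Each row of Y is one complete pass over the Q standards, matching the "one measurement on each standard, repeated J times" protocol.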

A test of linearity starts with a plot of the measured values versus the corresponding values of the reference standards, to see whether the points fall on a straight line with slope equal to 1 -- indicating linearity.

A least-squares fit of the data to the model

Y = a + bX + measurement error

where Y is the measurement result and X is the value of the reference standard, produces an estimate of the intercept, a, and the slope, b.

The intercept and slope are estimated using a statistical software package that should provide the following information:


  • Estimates of the intercept and slope

  • Standard deviations of the intercept and slope

  • Residual standard deviation of the fit

  • F-test for goodness of fit
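The first three of these quantities can be computed directly from the standard least-squares formulas. Below is a minimal sketch using hypothetical readings from five reference standards (the data values are invented for illustration, not from the text):

```python
import numpy as np

# Hypothetical readings from five reference standards (one run):
X = np.array([2.0, 4.0, 6.0, 8.0, 10.0])      # reference values
Y = np.array([2.05, 3.96, 5.93, 7.85, 9.82])  # gauge readings

n = X.size
b, a = np.polyfit(X, Y, 1)                 # slope b and intercept a of Y = a + bX
resid = Y - (a + b * X)
s = np.sqrt(np.sum(resid**2) / (n - 2))    # residual standard deviation of the fit

Sxx = np.sum((X - X.mean())**2)
se_b = s / np.sqrt(Sxx)                          # standard deviation of the slope
se_a = s * np.sqrt(1.0 / n + X.mean()**2 / Sxx)  # standard deviation of the intercept

print(f"intercept a = {a:.4f} (sd {se_a:.4f})")
print(f"slope     b = {b:.4f} (sd {se_b:.4f})")
print(f"residual sd = {s:.4f}")
```

For these data the fit gives a slope slightly below 1 and a small positive intercept, which is the pattern the tests in the next paragraph are designed to flag.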

Tests for the slope and bias are described in the section on instrument calibration. If the slope is different from one, the gauge is non-linear and requires calibration or repair. If the intercept is different from zero, the gauge has a bias.
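These two tests can be sketched as two-sided t tests on the fitted coefficients: compare |b − 1| / se(b) and |a| / se(a) against a t critical value with n − 2 degrees of freedom. The data below are the same hypothetical readings used above for illustration, and the 5 % significance level is an assumed choice:

```python
import numpy as np
from scipy import stats

# Hypothetical readings from five reference standards:
X = np.array([2.0, 4.0, 6.0, 8.0, 10.0])
Y = np.array([2.05, 3.96, 5.93, 7.85, 9.82])

n = X.size
b, a = np.polyfit(X, Y, 1)
resid = Y - (a + b * X)
s = np.sqrt(np.sum(resid**2) / (n - 2))
Sxx = np.sum((X - X.mean())**2)
se_b = s / np.sqrt(Sxx)
se_a = s * np.sqrt(1.0 / n + X.mean()**2 / Sxx)

# Two-sided t tests at the (assumed) 5 % level, n - 2 degrees of freedom:
t_crit = stats.t.ppf(0.975, df=n - 2)
nonlinear = abs(b - 1.0) / se_b > t_crit  # slope significantly different from 1?
biased = abs(a) / se_a > t_crit           # intercept significantly different from 0?

print("non-linear:", bool(nonlinear), " biased:", bool(biased))
```

For this invented data set both tests reject: the gauge would be declared non-linear (needing calibration or repair) and biased.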

The reference manual on Measurement Systems Analysis (MSA) lists possible causes of gauge non-linearity that should be investigated if the gauge shows symptoms of non-linearity.

  1. Gauge not properly calibrated at the lower and upper ends of the operating range

  2. Error in the value of X at the maximum or minimum range

  3. Worn gauge

  4. Internal design problems (electronics)

The requirement of linearity for artifact calibration is not so stringent. Where the gauge is used as a comparator for measuring small differences among test items and reference standards of the same nominal size, as with calibration designs, the only requirement is that the gauge be linear over the small on-scale range needed to measure both the reference standard and the test item.

Sometimes it is not economically feasible to correct for the calibration of the gauge (Turgel and Vecchia). In this case, the bias that is incurred by neglecting the calibration is estimated as a component of uncertainty.
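One common convention for folding an uncorrected bias into the uncertainty budget (an assumption here, not a method prescribed by this text) treats the bias as a rectangular component with standard uncertainty bias/√3 and combines it in quadrature with the repeatability term. The numbers below are hypothetical:

```python
import math

# Hypothetical values, in the same units:
s_repeat = 0.004   # repeatability standard deviation of the comparator
bias = 0.010       # estimated but uncorrected gauge bias

# Assumed convention: rectangular component for the uncorrected bias,
# combined in quadrature with the repeatability component.
u_bias = bias / math.sqrt(3)
u_combined = math.sqrt(s_repeat**2 + u_bias**2)

print(f"combined standard uncertainty = {u_combined:.4f}")
```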