
Understanding the basics of uncertainty in measurement and calibration

Mike Edwards   



Estimating measurement uncertainty is one of the most challenging tasks that scientists and calibration technicians have to deal with. When an instrument is calibrated, the measurements should be traceable back to a common standard.

Calibration laboratories around the world use the method of the ISO Guide to the Expression of Uncertainty in Measurement (GUM) to estimate measurement uncertainty.

Let’s look into the basics of uncertainty in measurement and calibration.

What is the Uncertainty of Measurement?

Uncertainty is the range of possible values within which the true value of the measurement lies. It is the “doubt” of the measurement, and it tells us how good the measurement is. Every measurement has some doubt, and we need to know how large that doubt is to decide whether the measurement is good enough for its intended use. For example, a result reported as 100.0 °C ± 0.2 °C says that the true value is believed to lie between 99.8 °C and 100.2 °C.

Many things affect the result of a measurement: the tools used, the method or process followed, and the way the person performed the work.

Error is not the same as uncertainty. In calibration, when we compare the device being calibrated against the reference standard, the error is the difference between the two readings.

It is critical to be able to distinguish between uncertainty and error.
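
To make the distinction concrete, here is a minimal Python sketch, with made-up readings, of how the error at a single calibration point is computed; the uncertainty is a separate ± range of doubt attached to that error, estimated later via the GUM method.

# Minimal sketch: the calibration error at one test point is simply the
# difference between the device-under-test reading and the reference reading.
# All values below are made-up illustration data.

reference_reading = 100.00   # reading of the reference standard (e.g. degC)
device_reading = 100.12      # reading of the device under calibration

error = device_reading - reference_reading
print(f"Calibration error: {error:+.2f}")   # +0.12

# The error is a single known number; the uncertainty is the +/- range of
# doubt attached to it, estimated separately (e.g. with the GUM method).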

Standard Deviation of the Measurement

The standard deviation is a statistic that measures the dispersion of a dataset relative to its mean and is calculated as the square root of the variance. The goal is to determine the typical deviation of the whole measurement process and to use that knowledge as an uncertainty component related to the measurement.

The standard deviation of your calibration process is, in fact, one important component of the total uncertainty, so you should be aware of it.
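
As an illustration, here is a minimal Python sketch, using made-up repeated readings, of how the standard deviation of a measurement process can be turned into a so-called Type A standard uncertainty of the mean.

# Minimal sketch: estimating the standard deviation of a measurement process
# from repeated readings, and using it as a Type A uncertainty component.
# The readings below are made-up illustration data.
import statistics

readings = [10.02, 10.05, 9.98, 10.01, 10.03, 9.99, 10.04, 10.00]

mean = statistics.mean(readings)
s = statistics.stdev(readings)            # sample standard deviation (n - 1)
u_type_a = s / len(readings) ** 0.5       # standard uncertainty of the mean

print(f"mean = {mean:.3f}, s = {s:.3f}, u(mean) = {u_type_a:.3f}")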

The Reference Standard (Calibrator) and its Traceability

One of the biggest sources of uncertainty is the reference standard, or calibrator, that you use in your measurements and calibrations. To start with, you should select a suitable reference standard for each measurement. It is also essential to note that it is not enough to take the manufacturer’s accuracy specification for the reference standard and keep using that as its uncertainty over the long term. Finally, you must have your reference standards calibrated regularly, in a calibration laboratory that has sufficient capability to calibrate the standard and make it traceable.

It is important to take into account the total uncertainty of the calibration that the laboratory documents for your reference standard. Also, follow the stability of your reference standard between its regular calibrations; after some time, you will learn the true uncertainty of your reference standard and can use that information in your calibrations.
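
As one possible way to follow that stability, the sketch below, with made-up certificate values and dates, estimates the drift of a reference standard from its calibration history.

# Minimal sketch: following the stability of a reference standard between
# calibrations by comparing the values reported on successive certificates.
# Dates and values are made-up illustration data.
from datetime import date

# (calibration date, reported value of the standard)
history = [
    (date(2021, 6, 1), 9.9998),
    (date(2022, 6, 1), 10.0001),
    (date(2023, 6, 1), 10.0005),
]

# Drift between the first and last calibration, expressed per year.
span_years = (history[-1][0] - history[0][0]).days / 365.25
drift_per_year = (history[-1][1] - history[0][1]) / span_years
print(f"observed drift: {drift_per_year:+.5f} per year")

# A drift contribution like this can be treated as one more uncertainty
# component alongside the laboratory's reported calibration uncertainty.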

Compliance Statement: Pass or Fail

When an instrument is calibrated, it has a pre-defined tolerance limit that it has to meet. Tolerance limits are the maximum amounts by which a result may differ from the true value. If the errors found in the calibration are all within the tolerance limits, the calibration passes; if any error falls outside the tolerance limits, the calibration fails.
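
A minimal sketch of such a compliance check, with made-up errors and a made-up tolerance limit:

# Minimal sketch: a simple pass/fail compliance check. Each test point passes
# if the absolute error is within the tolerance limit; the calibration as a
# whole passes only if every point does. Values are made-up illustration data.

tolerance = 0.10                      # allowed deviation, same unit as errors
errors = [0.02, -0.05, 0.08, -0.12]   # error found at each calibration point

results = ["Pass" if abs(e) <= tolerance else "Fail" for e in errors]
overall = "Pass" if all(r == "Pass" for r in results) else "Fail"

for e, r in zip(errors, results):
    print(f"error {e:+.2f}: {r}")
print(f"overall: {overall}")          # Fail, because -0.12 exceeds 0.10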

The GUM 8-step Process for Calculating Uncertainty:

  • Describe the measured value in terms of your measurement process.
  • List the input quantities.
  • Determine the uncertainty for each input quantity.
  • Evaluate any covariances/correlations in input quantities.
  • Calculate the measured value to report.
  • Correctly combine the uncertainty components (see the sketch after this list).
  • Multiply the combined uncertainty by a coverage factor.
  • Report the result in the proper format.
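
To illustrate steps 6–8, here is a minimal Python sketch, with made-up component values, that combines uncorrelated standard uncertainties by root-sum-of-squares and expands the result with a coverage factor of k = 2.

# Minimal sketch of GUM steps 6-8: combine independent standard uncertainty
# components by root-sum-of-squares, then multiply by a coverage factor.
# Component values are made-up illustration data.
import math

# standard uncertainties of the input quantities (assumed uncorrelated)
components = {
    "reference standard": 0.010,
    "repeatability (Type A)": 0.006,
    "resolution": 0.003,
}

u_combined = math.sqrt(sum(u ** 2 for u in components.values()))

k = 2                      # coverage factor, ~95 % for a normal distribution
U_expanded = k * u_combined

print(f"combined standard uncertainty u = {u_combined:.4f}")
print(f"expanded uncertainty U = {U_expanded:.4f} (k = {k})")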

This article was originally published on the SRP Control blog.

 

