The Importance of Calibration

Why Do We Need Calibration?

Measurements have an impact on nearly every aspect of modern life, from the humble paper clip to the complicated systems required for space flight and everything in between.

Now, some people may say, “My instrument is electronic, so why does it need calibrating? There are no mechanical parts to wear out.” It is true that there may be no mechanical parts to wear out, but unfortunately the values of electronic components also change (wear) with time.

So let’s imagine you’re making measurements on a precision or expensive process with an instrument that is giving incorrect readings. This could be, and probably will be, costly to remedy, and it may even make the end product or process unsafe.

Or suppose two instruments are being used to log and trend data for maintenance purposes, but they give different results. Which one is correct? Without a current, valid calibration certificate with adequate measurement uncertainties, we can never be sure.

What is Calibration?

Put in simple terms, calibration is the verification of a measuring instrument against a “Standard” of known value and higher accuracy.

This comparison validates the specification of the “Unit Under Test” (UUT), as the “Standard” will be at least 5 times (ideally 10 times) more accurate than the UUT.
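
To illustrate this ratio rule, here is a minimal Python sketch; the function name and example values are illustrative assumptions, not part of any calibration standard:

```python
# A minimal sketch of the accuracy-ratio rule described above.
# The 5:1 / 10:1 figures come from the text; the function name and
# the example tolerances are illustrative assumptions only.

def accuracy_ratio(uut_tolerance: float, standard_tolerance: float) -> float:
    """How many times more accurate the Standard is than the UUT."""
    return uut_tolerance / standard_tolerance

# Example: a UUT specified to +/-1.0 V checked against a Standard
# accurate to +/-0.1 V gives a 10:1 ratio.
ratio = accuracy_ratio(uut_tolerance=1.0, standard_tolerance=0.1)
print(f"Accuracy ratio: {ratio:.0f}:1")       # 10:1
print("Meets the 5:1 minimum?", ratio >= 5)   # True
```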

Everyone does a simple form of calibration at home, at least twice a year when the clocks change.

You hear the time tones on the radio or on the television and adjust your watch accordingly. You then use your watch to correct the time on all the clocks in your home. Your watch has now become a transfer standard, as you pass on the “corrected” time to all other timepieces.

Periodically you check your watch against the radio or television and adjust it if required.

In doing this you are calibrating your watch against a higher accuracy standard.

If you never check your watch against these higher-accuracy standards, how sure are you that your watch is correct? The same applies to every measuring device: it needs to be checked against a known standard of higher accuracy to ensure it is correct and performing within its specification limits.

Another simple example:

Two Digital Multimeters are required to measure 100 Volts within 1 %.

Meter A indicates 99.1 Volts, and Meter B indicates 100.9 Volts.

A 1 % tolerance on 100 Volts means both readings must fall between 99.0 and 101.0 Volts, so both meters are within tolerance, but which one is right?

If Meter B is used as the “Standard”, then Meter A will appear to be out of tolerance: the two readings differ by 1.8 Volts, more than the 1.0 Volt the tolerance allows.

Meter B is sent for calibration and is adjusted to indicate 100.0 Volts. This meter now has an accuracy of 0.1 %, so the most it can deviate from the true value is ±0.1 Volts, i.e. between 99.9 and 100.1 Volts.

Now, if Meter A is compared with Meter B, it is within tolerance (although right at the lower end of the tolerance range).

If Meter A is now adjusted to indicate the same as Meter B, this will hopefully keep Meter A from giving a false reading as it experiences normal drift between calibrations.
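
To make the arithmetic in this example explicit, here is a minimal Python sketch; all values come from the example above, and the variable and function names are illustrative only:

```python
# A minimal sketch of the two-meter example. All values are taken
# from the text; names are illustrative, not from any real tool.

def limits(nominal: float, tol_pct: float) -> tuple[float, float]:
    """Lower and upper acceptance limits for a nominal value and % tolerance."""
    delta = nominal * tol_pct / 100.0
    return nominal - delta, nominal + delta

low, high = limits(100.0, 1.0)   # 99.0 V to 101.0 V

meter_a = 99.1    # Volts indicated
meter_b = 100.9   # Volts indicated

# Both readings fall inside the 99.0-101.0 V band, yet they disagree
# by 1.8 V, so treating Meter B as the "Standard" would make Meter A
# appear out of tolerance.
for name, reading in (("Meter A", meter_a), ("Meter B", meter_b)):
    print(f"{name}: {reading} V, in tolerance: {low <= reading <= high}")
print(f"Apparent error of A against B: {meter_a - meter_b:+.1f} V")

# After calibration, Meter B indicates 100.0 V with 0.1 % accuracy,
# so its worst case is 99.9-100.1 V. Against that reference, Meter A's
# roughly -0.9 V error sits just inside the +/-1.0 V tolerance.
b_low, b_high = limits(100.0, 0.1)
print(f"Calibrated Meter B worst case: {b_low} V to {b_high} V")
```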