Various specific terms describe the characteristics and quality of measuring instruments, defined as follows:
- Accuracy: The degree of agreement of the measured dimension with its true magnitude.
- Amplification: The ratio of measuring instrument output to the input dimension; it is also called magnification.
- Calibration: The adjustment or setting of a measuring instrument so that its readings agree with a reference standard.
- Drift: A gradual change in an instrument’s readings over time, that is, a loss of calibration; see also stability.
- Linearity: The accuracy of the readings of an instrument over its full working range.
- Magnification: The ratio of measuring instrument output to the input dimension; also called amplification.
- Precision: The degree to which an instrument gives repeated readings of the same standard that agree with one another (see the sketch after this list).
- Repeat accuracy: The accuracy of an instrument sustained over many repeated measurements of the same dimension.
- Resolution: The smallest dimension that can be read on an instrument.
- Rule of 10 (gage maker’s rule): An instrument or gage should be ten times more accurate than the dimensional tolerance of the part being measured; a factor of 4 is known as the mil standard (military standard) rule. A quick numerical check appears after the selection factors below.
- Sensitivity: Smallest difference in a dimension that an instrument can distinguish or detect.
- Speed of response: How rapidly an instrument indicates a measurement, particularly when parts are measured in rapid succession.
- Stability: An instrument’s capability to maintain its calibration over a period of time; an instrument with low drift has high stability.
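The difference between accuracy and precision in the list above is easiest to see numerically. Below is a minimal Python sketch with made-up readings of a 25 mm reference gage block: the mean offset from the true value stands in for the accuracy error, and the scatter of the repeated readings for the precision (repeat accuracy). All values are illustrative only.

```python
import statistics

# Illustrative (made-up) repeated readings, in mm, of a gage block
# whose true length is taken to be 25.000 mm.
TRUE_VALUE = 25.000
readings = [25.002, 25.003, 25.001, 25.004, 25.002, 25.003]

mean_reading = statistics.mean(readings)

# Accuracy: agreement of the measured dimension with its true magnitude,
# taken here as the systematic offset of the mean from the true value.
accuracy_error = mean_reading - TRUE_VALUE

# Precision (repeat accuracy): agreement among repeated readings of the
# same standard, taken here as the sample standard deviation.
precision = statistics.stdev(readings)

print(f"Mean reading:   {mean_reading:.4f} mm")
print(f"Accuracy error: {accuracy_error:+.4f} mm (systematic offset)")
print(f"Precision:      {precision:.4f} mm (scatter of repeats)")
```

An instrument can thus be precise but not accurate (small scatter, large offset) or accurate but not precise (small offset, large scatter); good metrology requires both.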
The selection of an appropriate measuring instrument for a particular application also depends on four factors:
- The size and type of parts to be measured,
- The environment (temperature, humidity, dust, pressure, and so on),
- The skills required by the operator, and
- The cost of equipment.
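When weighing these factors, the rule of 10 mentioned above gives a quick screen on whether a candidate instrument is adequate for a given part tolerance. The sketch below is one way to express it; the helper name `satisfies_gage_makers_rule` and the tolerance values are hypothetical.

```python
def satisfies_gage_makers_rule(part_tolerance: float,
                               instrument_accuracy: float,
                               factor: float = 10.0) -> bool:
    """True if the instrument is at least `factor` times more accurate
    than the part tolerance (factor=10 for the rule of 10, factor=4 for
    the mil standard rule)."""
    return instrument_accuracy <= part_tolerance / factor

# Hypothetical example: a tolerance of 0.01 mm demands an instrument
# accurate to 0.001 mm under the rule of 10.
print(satisfies_gage_makers_rule(0.01, 0.001))              # True
print(satisfies_gage_makers_rule(0.01, 0.005))              # False
print(satisfies_gage_makers_rule(0.01, 0.0025, factor=4))   # True (mil standard)
```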