When an object is placed on a scale, the scale displays a measurement of the object's weight. How close that displayed weight is to the object's actual weight can only be understood by considering the scale's accuracy, precision, and resolution.
ACCURACY
The accuracy of a scale is a measure of how close the average value of an object's displayed weight is to the object's actual weight.
- If, on average, a scale indicates that a 200 lb. certified test weight weighs 200.20 lbs., the scale is accurate to within 0.20 lbs. in 200 lbs., or 0.1%.
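To make the arithmetic explicit, the accuracy figure can be computed as the difference between the average displayed weight and the known test weight, expressed as a percentage of that test weight. The following is a minimal, hypothetical Python sketch, assuming repeated readings of the certified test weight have already been collected into a list.

```python
# Hypothetical repeated readings of a 200 lb. certified test weight
readings = [200.20, 200.30, 200.15, 200.10, 200.25]
true_weight = 200.0

average = sum(readings) / len(readings)        # 200.20 lbs.
error = average - true_weight                  # 0.20 lbs.
accuracy_pct = (error / true_weight) * 100     # 0.1%

print(f"Average displayed weight: {average:.2f} lbs.")
print(f"Accurate within {error:.2f} lbs. in {true_weight:.0f} lbs. ({accuracy_pct:.1f}%)")
```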
PRECISION
The precision of a scale is a measure of the repeatability of the displayed weight across multiple weighings of the same object.
- If the displayed weights of an object that weighs 200 lbs. are 200.20, 200.30, 200.15, 200.10, and 200.25, the average weight is still 200.20, but individual readings deviate by as much as 0.10 lbs. from this average. The precision would be stated as +/- 0.10 lbs., meaning the fluctuations are limited to 0.10 lbs. in either direction.
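Using the same hypothetical readings, precision can be estimated as the largest deviation of any single reading from the average. This sketch is illustrative only; it is not a formal repeatability test.

```python
# Same hypothetical readings of the 200 lb. object
readings = [200.20, 200.30, 200.15, 200.10, 200.25]

average = sum(readings) / len(readings)                  # 200.20 lbs.
max_deviation = max(abs(r - average) for r in readings)  # 0.10 lbs.

print(f"Precision: +/- {max_deviation:.2f} lbs. around an average of {average:.2f} lbs.")
```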
RESOLUTION
The resolution of a scale is the smallest increment of applied weight that the scale can detect or display.
- In all scales, this is the quantity most closely associated with the digits displayed when an object is weighed.
In the precision example above, the readout appears to show changes in 0.01 lb. increments, but the digits actually change only in steps of 0.05 lbs. Even though the readout appears to provide weight to the nearest 0.01 lbs., the RESOLUTION of the readout is actually 0.05 lbs. This is a common misconception regarding the "exactness" of a scale's readings. Additionally, resolution can be affected by other factors such as friction, wear, and scale design.
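The difference between displayed digits and true resolution can be made concrete by simulating a readout that shows two decimal places but only changes in 0.05 lb. steps. The function below is a hypothetical illustration of that quantization, not a model of any particular scale.

```python
def displayed_weight(applied_lbs, resolution_lbs=0.05):
    """Round the applied weight to the readout's smallest increment."""
    steps = round(applied_lbs / resolution_lbs)
    return steps * resolution_lbs

# The display shows 0.01 lb. digits, but values only land on 0.05 lb. steps
for applied in (200.11, 200.12, 200.13, 200.14, 200.16):
    print(f"applied {applied:.2f} lbs. -> displayed {displayed_weight(applied):.2f} lbs.")
# e.g. 200.11 and 200.12 both display as 200.10; 200.13 and 200.14 display as 200.15
```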
It is important to apply the terms accuracy, precision, and resolution separately when determining the exactness of a scale's output.