Systems of Measurement


Systems of measurement define units of measurement and specify how these units should be stated, used and realised. Such systems are regulated and defined for use in science and commerce. Systems of measurement in current use include the Imperial (English) system, the CGS system, the metric system and the International System of Units (SI).

The English system:

It is also known as the British system, and it uses the yard as its standard unit of length. This system of weights and measures was used officially in Great Britain from 1824 until the adoption of the metric system began in 1965. The various standard systems of measurement are explained below.

The Metric system:

In the metric system, the metre replaces the yard (of the English system). The metric system is essentially decimal, all multiples and submultiples of the basic units being related by factors of 10. So we can define the multiples and fractions of a metre, for example, as follows:

The metre is defined as the length of the path travelled by light in a vacuum in 1/299,792,458 of a second. The metre was originally defined in 1793 as one ten-millionth of the distance from the equator to the North Pole along a great circle, which makes the Earth's circumference approximately 40,000 km.
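
As a quick arithmetic check of the two definitions above (the speed-of-light constant and the historical equator-to-pole figure), in Python:

```python
# Speed of light in vacuum (exact, by the current definition of the metre)
c = 299_792_458  # m/s

# One metre is the distance light travels in 1/299,792,458 of a second
metre = c * (1 / 299_792_458)

# Historical 1793 definition: 1 m = 1/10,000,000 of the equator-to-pole
# distance, so the full meridian circumference is 4 x 10,000 km = 40,000 km
quarter_meridian_m = 10_000_000  # metres, by the 1793 definition
circumference_km = 4 * quarter_meridian_m / 1000
print(circumference_km)  # 40000.0
```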

millimetre          =        1/1000 of a metre

centimetre          =        10 millimetres

decimetre           =        1/10 of a metre

micrometre          =        1/1,000,000 of a metre, or 1/1000 of a millimetre
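
Because every relation above is a power of ten, conversions reduce to a single multiplication. The `PREFIXES` table and `convert` helper below are a minimal illustrative sketch, not a standard API:

```python
# Factors relating each unit to the metre (decimal metric system)
PREFIXES = {
    "km": 1e3,
    "m": 1.0,
    "dm": 1e-1,
    "cm": 1e-2,
    "mm": 1e-3,
    "um": 1e-6,  # micrometre
}

def convert(value: float, from_unit: str, to_unit: str) -> float:
    """Convert a length between metric units by passing through metres."""
    return value * PREFIXES[from_unit] / PREFIXES[to_unit]

print(convert(1, "m", "mm"))   # 1000.0
print(convert(25, "mm", "cm"))
```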

Note that unit names are written in lower case; only the symbols of units named after people, such as K for the kelvin or T for the tesla, are capitalised.

The International System of Units (SI):

Earlier, the CGS (centimetre, gram, second) system was used for scientific work, and the MKS (metre, kilogram, second) system followed it. The MKS system is convenient to use and provides a rational, coherent and comprehensive set of units. Today, the most popular and widely used system of measurement is the SI system.

The main features of the SI system are that the metre and kilogram supersede the centimetre and gram of the old system. The newton, the unit of force, is independent of the Earth's gravitation, so g need no longer be introduced into equations. The joule (newton × metre) is the unit of energy, and the joule per second (watt) is the unit of power. Units such as the metre, kilogram, second and ampere are of greatest interest to engineers.
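
A short sketch of how these derived units build on the base units; the numeric values (mass, acceleration, distance, time) are assumed example data:

```python
# Assumed example data in SI base units
mass = 2.0           # kg
acceleration = 9.81  # m/s^2 (local gravity appears only as data, not in the unit system)
distance = 3.0       # m
time = 1.5           # s

force = mass * acceleration  # newton: kg * m/s^2
energy = force * distance    # joule: newton * metre
power = energy / time        # watt: joule per second

print(f"force  = {force} N")
print(f"energy = {energy} J")
print(f"power  = {power} W")
```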

Standards of Measurement:

The standards of measurement of length are:

Fundamental or primary standards:

The primary standard for length, defined in terms of the speed of light, has already been given above.

Secondary or working standards:

These standards are derived from primary standards. In a secondary standard, length is the distance between two flat, parallel end faces that have been calibrated against the primary standard; they are therefore end standards. They include slip gauges, micrometers, vernier callipers, etc. End measurement is more accurate because much smaller variations can be detected by feel than can be seen with the unaided eye, but it is time-consuming in use, and the measuring faces are subject to wear.
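
As an illustration of how end standards are used, here is a sketch that searches for a combination of slip gauges that wring together to a required size. The gauge values are an assumed subset for illustration, not a real catalogue set:

```python
from itertools import combinations

# Assumed subset of a metric slip-gauge set, in mm
GAUGES = [1.005, 1.08, 1.5, 2.0, 5.0, 10.0, 20.0, 30.0, 40.0]

def build_size(target_mm, gauges, max_pieces=4):
    """Find the smallest combination of gauges whose sum equals the target.

    Fewer gauges are preferred, since every wrung joint adds a little error.
    """
    for n in range(1, max_pieces + 1):
        for combo in combinations(gauges, n):
            if abs(sum(combo) - target_mm) < 1e-6:
                return list(combo)
    return None  # target cannot be built from this set

print(build_size(43.585, GAUGES))
```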

Wavelength standards:

In these standards, the wavelength of monochromatic light is used as the unit of length. Specifically, the wavelength of the monochromatic orange-red radiation from the krypton-86 isotope served as the unit of length.
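
Under the 1960 SI definition (not stated above), the metre was fixed as 1,650,763.73 wavelengths of this krypton-86 radiation; a quick sketch recovers the wavelength itself:

```python
# 1960 definition: 1 metre = 1,650,763.73 wavelengths of the
# orange-red krypton-86 line in vacuum
WAVELENGTHS_PER_METRE = 1_650_763.73

wavelength_m = 1 / WAVELENGTHS_PER_METRE
print(f"{wavelength_m * 1e9:.3f} nm")  # roughly 605.780 nm
```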

Systems of measurement – Errors in measurement

No measurement is exact. All measurements are subject to some error, whether human error or error due to the environment. It is therefore necessary to state not only the measured dimension but also the accuracy of its determination. The errors inherent in the method of measurement should be kept to a minimum; having minimized them, their probable magnitude, or the accuracy of determination, should be stated.

Along with the actual gauge block size, details of the measurement error in the block should be given, together with the accuracy of determination. The accuracy of determination can be improved by repeating the measurement several times and stating the mean value.
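
A minimal sketch of this procedure, using Python's statistics module with assumed example readings:

```python
from statistics import mean, stdev
from math import sqrt

# Assumed example: repeated readings of the same gauge block, in mm
readings = [25.001, 24.999, 25.002, 25.000, 24.998, 25.001]

best_estimate = mean(readings)
# The standard error of the mean shrinks as 1/sqrt(n):
# more repeated readings give a better-determined value
std_error = stdev(readings) / sqrt(len(readings))

print(f"mean = {best_estimate:.4f} mm, standard error = {std_error:.4f} mm")
```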

Types of Errors

There are two types of errors –
  1. Those which we can eliminate by careful work and attention to detail, such as misreading an instrument, arithmetic errors, alignment errors and parallax errors.
  2. Those which are inherent in the measuring process, such as errors due to temperature.

Errors of the first kind can be eliminated by proper procedural handling of the measuring system.

When Can We Truly Believe a Measurement?

We can never have 100% confidence in a measurement; no measurement is ever 100% correct. There is always some non-zero difference, however small, between a measured value and the corresponding true value. Most instruments have specified or implied tolerance limits, and if the instrument is functioning correctly, the true value should lie within these limits. However, one can never really be one hundred per cent sure that an instrument is operating within its given tolerance limits.
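
The tolerance-limit idea can be sketched as a simple check; note that passing the check tests only the reading, not the instrument's own error:

```python
def within_tolerance(measured: float, nominal: float, tol: float) -> bool:
    """Flag whether a reading lies inside symmetric tolerance limits.

    This only tests the stated reading; it cannot prove the instrument
    itself is operating within its specified limits.
    """
    return abs(measured - nominal) <= tol

print(within_tolerance(10.02, 10.00, 0.05))  # True
print(within_tolerance(10.08, 10.00, 0.05))  # False
```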

Minimize Errors

There are steps we can take to minimize the possibility of a measurement falling outside the specified tolerance. In particular, regular traceable calibration is a method of gaining quantifiable confidence in a measurement system.

To appreciate the fundamental aspects of calibration and its importance, a standard to govern the calibration process is necessary. Consider, for example, a pressure transducer. Its mechanism, electronic circuitry and digital display can fail or malfunction in several modes, and most faults and malfunctions may not be visible to an operator. It is therefore impossible to verify the absence of faults and electronic drift by simple inspection.

Also, we cannot tell by inspection if the instrument has recently been dropped, subjected to an over-range pressure or otherwise mistreated.


When we make a measurement in the field, we have no choice but to trust the instrument. The only way we can gain confidence in it is by regularly comparing its response with that of a similar, or preferably superior, instrument in which we have a high level of confidence. Calibration, then, is a quantitative comparison or verification of the performance of an instrument against a fundamental standard.
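
Such a comparison can be sketched as follows; the test points, readings and tolerance below are assumed example data:

```python
# Hypothetical calibration sketch: compare a field instrument against a
# reference instrument of higher accuracy at several test points (e.g. kPa)
reference  = [0.0, 25.0, 50.0, 75.0, 100.0]  # reference readings
instrument = [0.1, 25.2, 49.8, 75.3, 99.7]   # instrument under test

errors = [m - r for m, r in zip(instrument, reference)]
worst = round(max(abs(e) for e in errors), 6)

TOLERANCE = 0.5  # assumed specification limit
verdict = "PASS" if worst <= TOLERANCE else "FAIL"
print(f"worst error = {worst} -> {verdict}")
```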
