
METROLOGY – SYSTEMS OF MEASUREMENT

A system of measurement is a collection of units of measurement. Systems of measurement have been regulated and defined for the purposes of science and commerce. Systems in modern use include the metric system, the imperial (English) system, and the International System of Units (SI).

LINEAR MEASUREMENTS

Linear measurements are basically comparisons of a given dimension with a particular standard of length. The different standard systems of measurement are described below:

The English system:

It is also known as the British system. It uses the yard as the standard unit of length. This system of weights and measures was used officially in Great Britain from 1824 until the adoption of the metric system beginning in 1965.

The Metric system:

In the metric system, the meter replaces the yard of the English system. The metric system is essentially decimal, all multiples and submultiples of the basic units being related by factors of 10: the decimeter, centimeter, millimeter, micrometer, and so on.
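
To illustrate these decimal relationships, here is a minimal sketch; the prefix factors are standard, while the function and unit symbols are our own illustrative choices.

```python
# Decimal multiples and submultiples of the meter; all factors of 10.
# The unit symbols and helper are illustrative, not a library API.
FACTORS_TO_METERS = {
    "km": 1e3,   # kilometer
    "m":  1.0,   # meter (base unit)
    "dm": 1e-1,  # decimeter
    "cm": 1e-2,  # centimeter
    "mm": 1e-3,  # millimeter
    "um": 1e-6,  # micrometer
}

def convert_length(value, from_unit, to_unit):
    """Convert a length between metric units via the base meter."""
    return value * FACTORS_TO_METERS[from_unit] / FACTORS_TO_METERS[to_unit]

print(convert_length(25.4, "mm", "cm"))  # ~2.54
```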

The International System of units (SI):

Earlier, the CGS (centimeter, gram, second) system was in use for scientific work. The MKS (meter, kilogram, second) system followed it and proved convenient to use, being rational, coherent, and comprehensive. The most popular and widely used system of measurement today is the SI.

The main features of the SI are that the meter and kilogram supersede the centimeter and gram of the old system. The newton, the unit of force, is independent of the earth's gravitation, so g need no longer be introduced in equations. The joule (newton × meter) is the unit of energy, and the joule per second (watt) is the unit of power. Units such as the meter, kilogram, second, and ampere are of maximum interest to engineers.
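
As a worked illustration of these unit relationships (the helper functions and numbers are ours; the physics is standard SI):

```python
def force_newtons(mass_kg, accel_m_per_s2):
    """F = m * a; the newton is defined without reference to local g."""
    return mass_kg * accel_m_per_s2

def energy_joules(force_n, distance_m):
    """1 joule = 1 newton * 1 meter."""
    return force_n * distance_m

def power_watts(energy_j, time_s):
    """1 watt = 1 joule per second."""
    return energy_j / time_s

f = force_newtons(10.0, 9.81)  # 10 kg accelerated at 9.81 m/s^2 -> ~98.1 N
e = energy_joules(f, 2.0)      # moved through 2 m -> ~196.2 J
p = power_watts(e, 4.0)        # in 4 s -> ~49.05 W
print(f, e, p)
```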

STANDARDS OF MEASUREMENTS:

Standards of measurement of length are:

Fundamental or primary standards:

The length is defined as the distance between two engraved lines, and hence these standards are called line standards. The length of the workpiece is compared with a scale, which may be a yard or a meter. Scales are subject to parallax errors, and they are not convenient for close-tolerance length measurement with the unaided eye.

Secondary or working standards:

These standards are derived from primary standards. Here the length is the distance between two flat, parallel end faces; thus they are end standards. They consist of slip gauges, micrometers, vernier callipers, etc. End measurement is more accurate because much smaller variations can be detected by feel than can be seen with the unaided eye, but end standards are time-consuming to use and are subject to wear on their measuring faces.
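
In practice, slip gauges are wrung together in stacks to build up a required dimension, using as few pieces as possible. The sketch below assumes a hypothetical, simplified metric gauge set and simply searches for the smallest exact combination; real gauge sets and selection procedures vary.

```python
from itertools import combinations

# Hypothetical simplified metric slip gauge set (sizes in mm); real sets
# such as 87- or 112-piece sets differ in detail.
GAUGE_SET = (
    [round(1.001 + 0.001 * i, 3) for i in range(9)]    # 1.001 .. 1.009
    + [round(1.01 + 0.01 * i, 2) for i in range(49)]   # 1.01 .. 1.49
    + [round(0.5 * i, 1) for i in range(1, 20)]        # 0.5 .. 9.5
    + [10.0 * i for i in range(1, 11)]                 # 10 .. 100
)

def build_stack(target_mm, max_pieces=4):
    """Find the smallest exact combination of gauges for a dimension.
    Fewer wrung pieces mean fewer joints and less accumulated error."""
    target = round(target_mm, 3)
    for n in range(1, max_pieces + 1):
        for combo in combinations(GAUGE_SET, n):
            if round(sum(combo), 3) == target:
                return sorted(combo)
    return None

print(build_stack(41.125))  # e.g. [1.005, 1.12, 9.0, 30.0]
```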

Wavelength standards:

In these standards, the wavelength of monochromatic light serves as the unit of length.
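
For example, the 1960 SI definition fixed the meter as 1,650,763.73 wavelengths of the orange-red line of krypton-86, so a length is, in effect, a count of wavelengths. A minimal sketch (the function name is ours):

```python
# 1960 definition: 1 m = 1,650,763.73 wavelengths (in vacuum) of the
# orange-red line of krypton-86, i.e. one wavelength is ~605.78 nm.
WAVELENGTHS_PER_METER = 1_650_763.73
KR86_WAVELENGTH_M = 1.0 / WAVELENGTHS_PER_METER

def length_from_wavelength_count(count):
    """Length in meters corresponding to a count of Kr-86 wavelengths."""
    return count * KR86_WAVELENGTH_M

print(length_from_wavelength_count(1_650_763.73))  # 1.0 m
```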

ERRORS IN MEASUREMENTS:

The fact is that no measurement is exact: all measurements are subject to some error. It is therefore necessary to state not only the measured dimension but also the accuracy of determination. As far as possible, the errors inherent in the method of measurement should be kept to a minimum; having minimized the error, its probable magnitude, or the accuracy of determination, should be stated.

Along with the nominal size of a gauge block, for example, there should be details of the measured error in the block and the accuracy of its determination. The accuracy of determination can be improved by repeating the measurement a number of times and stating the mean value.
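
A minimal sketch of this practice, using made-up readings: report the mean, and use the scatter of the readings to state the accuracy of determination.

```python
import statistics

# Hypothetical repeated readings of a 25 mm gauge block, in mm.
readings = [25.0012, 25.0009, 25.0011, 25.0013, 25.0010]

mean = statistics.mean(readings)
spread = statistics.stdev(readings)          # sample standard deviation
std_error = spread / len(readings) ** 0.5    # uncertainty of the mean

print(f"size = {mean:.4f} mm, accuracy of determination ~ +/-{std_error:.4f} mm")
```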

TYPES OF ERRORS:

There are two types of errors:
  • Those which should not occur and can be eliminated by careful work and attention. Misreading an instrument, arithmetic errors, alignment errors, parallax errors, and errors due to temperature fall into this class; proper procedural handling of the system eliminates them (a temperature-correction sketch follows this list).
  • Those which are inherent in the measuring process and can therefore only be minimized.
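
Temperature is a convenient example of a correctable error: lengths are referred to the standard temperature of 20 °C, and a reading taken at another temperature can be corrected for thermal expansion. A minimal sketch, assuming a typical expansion coefficient for steel:

```python
REFERENCE_TEMP_C = 20.0  # standard reference temperature for length metrology
ALPHA_STEEL = 11.5e-6    # typical linear expansion coefficient, per deg C

def correct_to_reference(length_mm, temp_c, alpha=ALPHA_STEEL):
    """Remove thermal expansion: L20 = L / (1 + alpha * (T - 20))."""
    return length_mm / (1 + alpha * (temp_c - REFERENCE_TEMP_C))

# A 100 mm steel part measured at 26 deg C reads ~0.0069 mm long.
print(correct_to_reference(100.0069, 26.0))  # ~100.0000 mm
```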

WHEN CAN WE TRULY BELIEVE A MEASUREMENT?

We can never have 100% confidence in a measurement; no measurement is ever exactly correct. There is always an unknown, finite, non-zero difference between a measured value and the corresponding true value. Most instruments have specified or implied tolerance limits, and the true value should lie within these limits if the instrument is functioning correctly. However, one can never be 100% sure that an instrument is operating within its specified tolerance limits.
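
In practice, this means a reading should be trusted only together with its tolerance band. A small illustrative helper (names and numbers are hypothetical):

```python
def tolerance_band(reading, tolerance):
    """Interval in which the true value should lie, provided the
    instrument is operating within its specified tolerance limits."""
    return reading - tolerance, reading + tolerance

lo, hi = tolerance_band(101.325, 0.05)  # kPa; hypothetical numbers
print(f"true value believed to lie in [{lo:.3f}, {hi:.3f}] kPa")
```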

MINIMIZE ERRORS

There are steps we can take to minimise the probability of a measurement falling outside specified tolerance or uncertainty bands. In particular, regular traceable calibration is a method for gaining quantifiable confidence in a measurement system.

Understanding the fundamental aspects of calibration, and its importance, requires a standard to govern the calibration process. Consider, for example, a pressure transducer. There are a large number of modes in which its electronic circuitry and digital display can fail or malfunction, and most of these faults would not be visible to an operator. It is therefore impossible to verify the absence of faults and electronic drift by simple inspection.

Nor can we tell by inspection whether the instrument has recently been dropped, subjected to an over-range pressure, or otherwise mistreated.

CALIBRATION

When we make a measurement in the field, we have no choice but to trust the instrument. The only way we can gain confidence in an electronic manometer, say, is by regular comparison: compare its response with that of another similar, or preferably superior, instrument in which we have a high level of confidence. Calibration, then, is a quantitative comparison or verification of the performance of an instrument.
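
A minimal sketch of such a comparison, assuming paired readings from a trusted reference and the instrument under test (all names, numbers, and the acceptance limit are illustrative):

```python
# Paired readings: (trusted reference, instrument under test), in kPa.
PAIRS = [(100.0, 100.4), (200.0, 200.3), (300.0, 300.7), (400.0, 400.5)]
MAX_PERMISSIBLE_ERROR = 0.6  # hypothetical acceptance limit, kPa

def verify(pairs, limit):
    """Print each deviation and report whether the unit passes."""
    ok = True
    for ref, test in pairs:
        err = test - ref
        ok = ok and abs(err) <= limit
        print(f"ref {ref:6.1f}  test {test:6.1f}  error {err:+.2f}")
    return ok

print("PASS" if verify(PAIRS, MAX_PERMISSIBLE_ERROR) else "FAIL")
```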

Please visit us at https://gmtmachinetools.blogspot.in, a new blog dedicated to all GMT customers, where you can interact with our technical experts.
