Precision Measurement Tools for Manufacturing
Since 1938, Mitutoyo has been committed to producing high-quality, affordable measurement tools for manufacturing applications of all types. Today, Mitutoyo is the benchmark brand for calipers and other measuring tools.
Manufacturing tolerances are continually shrinking, so the ability to measure accurately is increasingly important and a vital part of the effort to maintain and improve product quality. Continuous reduction in variability – a key concept in quality improvement – is achieved by minimizing bias and variance in data. Learning how to measure correctly minimizes any personal bias that could affect measurement data. It should never be forgotten that untrustworthy data yields poor quality. How to improve the trustworthiness of measurement data is one of the central issues addressed by Mitutoyo's calipers.
Calipers and Units of Length
Down through the ages, the common units of length were derived from the features of the human body. Apart from the foot, the fathom (the span of the outstretched arms) is still used today, and the cubit (from the bent elbow to the tip of the middle finger) was the standard unit of length in ancient Egypt. Although not very accurate, these units were readily available to everyone, rich or poor, and they worked well enough until the advent of the industrial age demanded accurate, precise and consistent measurement.
By the time of the French Revolution in 1789 – one of the most turbulent and chaotic periods in recent history – many scientists and philosophers in France had become convinced that the fundamental unit of length must be derived from a physical constant so that its magnitude would be invariant for all time.
When the metre was finally defined, after an epic seven-year journey by two French astronomers who measured the distance from Dunkerque to Barcelona by triangulation, the modern unit of length was found to be, quite ironically, only about twice as long as the ancient Egyptian cubit. Whether metre or yard, a man-made standard of length reflects the human scale. The issue, however, is not the magnitude of the length itself but universal acceptance.
In today’s terms, when the proclamation À tous les temps, à tous les peuples (for all time, for all peoples) was made it would be called a mission statement designed to promote the metre as the universal unit of length. Yet the advantages of Imperial units should not be disregarded. Fractions such as 1/4 and 1/8 are useful, as is the physical size of 1 inch and its multiple 12 inches.
The Search for a Measurement Standard
During the eighteenth century, after the passing of Sir Isaac Newton (1642-1727), the momentum of progress in mathematics, geometry, philosophy and other scientific inquiries moved from England to France. Long before the French Revolution, intellectuals and scientists in France were talking about the need for a new standard of length based on a physical constant.
One serious proposal was to define the new unit as the length of a standard pendulum. But the period of a pendulum depends on gravity, and so its length would vary slightly from one place on the earth to another, especially with latitude. Therefore, this idea was rejected, although it was suggested that 45°N latitude could be used as a standard location. Instead, a proposal to define the new unit as one ten-millionth of the length of a quadrant of the earth lying between the north pole and the equator, measured along the Paris meridian, became the favored plan. This extraordinary astronomical-scale definition was duly approved by the French Academy of Sciences, whose members also suggested a name for it, Mètre, and decreed that it would use the decimal system.
The name chosen by the Academy was derived from the Greek metron and the Latin metrum, both meaning measure. Determining its magnitude implied measuring one quadrant of the earth’s surface from the frozen wastes of the Arctic to a point in the tropical Atlantic Ocean. The obvious difficulties this posed were avoided by calculating the distance based on measurement of only a segment of this quadrant – in fact between Dunkerque and Barcelona. Both points are at sea level and the distance between them is the longest land-based stretch of the meridian through Paris. In fact, Cassini de Thury and La Caille had, in 1739-40, during construction of the first accurate maps of France, already made a similar measurement (from Dunkerque to Collioure), but the Academy hoped for even better accuracy this time by using the latest instruments.
Once this segment was surveyed, the data were extrapolated to calculate the so-called Great Arc and corrected for the earth’s slightly oblate-spheroidal form, giving a result of 5,132,430 toises (a toise being a little more than six feet). This was then divided into 10,000,000 equal parts to define the new unit. Whether the public would accept it, however, was another matter: Napoleon brought back the old units, leaving the metre on the brink of extinction before it had hardly got started.
Birth of a Measurement Standard: The Egyptian Cubit
The length called the fathom, a generic unit used in many cultures since ancient times, probably has its origin in the length of rope a man can hold between his outstretched hands. Another unit called the shaku, which originated in ancient China and migrated to Japan, was the length of an outstretched palm from the edge of the thumb to the tip of the middle finger. These examples illustrate the point that, for all length standards of the past, it was only natural to base measures on the size of the human body, because everyone possessed a body of roughly the same size and could use it whenever needed.
The ancient cubit was the distance between the bent elbow and the tip of the middle finger of a powerful Pharaoh. Even without being able to read hieroglyphics, it is clear that certain characters and symbols represent this unit. Dividing the cubit, which is approximately 500 mm long, into 28 parts gives a unit of approximately 18 mm, which had its own distinct symbol; this in turn was divided into two, three, four and sixteen parts, the last being the smallest graduation.
Strange but true: many centuries later, when the metre was defined from a natural constant by two astronomers surveying the land to estimate the circumference of the earth, the new unit of length turned out to be about twice as long as the ancient cubit. Had they divided the Great Arc into twenty million parts instead of ten million, the cubit could have become today’s standard of length.
What Is a Caliper and How Is It Used?
Calipers of all kinds – be they vernier, dial or digital – are general-purpose tools. They measure inside, outside, depth and even steps. Users of calipers include dentists, scientists, archaeologists, mechanics, machinists, chemists and anthropologists: in short, anyone who must take measurements. Having such diverse users and the broadest of applications, calipers sometimes receive harsh treatment. For this reason most caliper jaws are heat-treated, usually hardened to 62 HRC or better.
Almost all modern calipers are made of flame-hardened stainless steel, which is sufficient for the rigours of normal use. However, when measuring very hard or abrasive workpieces such as grinding wheels and cemented carbide cutting tools, these calipers may still suffer wear. Tungsten-carbide inserts in the jaws greatly extend the useful life of calipers by providing maximum-hardness measuring faces.
Carbide inserts may also be incorporated in the inside jaws. If the outside jaws (without carbide inserts) are bent, they can be restored to their original condition by inserting a disc-shaped lapping stone between the jaws and removing material until parallelism is restored. Inside jaws may be heated and bent back into shape.
Calipers and the Vernier Scale: The Most Versatile of all Gauges
Because of its several measuring modes, ease of operation, durability, wide measuring range and relatively low cost, a caliper is possibly the best general-purpose tool to have in the toolkit. However, due to its design, a caliper does not comply with Abbe’s Principle, so care is needed when using it if accuracy is to be maximized. The essence of Abbe’s Principle is that if the axis of measurement is not coaxial with the measuring scale axis then there is potential for error. The effects of this principle, however, can be largely avoided by observing a few simple practices.
How to Read a Vernier Caliper
The vernier scale is a device that allows one to read an evenly divided straight (or circular) measurement scale to a far greater resolution than is provided by that scale’s smallest divisions. It works by dividing these up using an auxiliary scale (the vernier scale) that slides against the main scale. The modern form of this invention was developed by the French mathematician Pierre Vernier (1580–1637).
Reading the vernier caliper, especially the metric type, is fairly straightforward although one does need good eyesight, or a magnifying glass, because there is no mechanical magnification as on a dial caliper. Parallax error is also a factor that needs to be guarded against as the scales are on slightly different levels.
The vernier scale is attached to the caliper’s slider and, on a typical metric model, each division on this scale is 0,02 mm shorter than one main scale division of 1 mm. This means that, as the caliper jaws open, each movement of 0,02 mm brings the succeeding vernier scale line into coincidence with a main scale line and hence indicates the fraction, in 0,02 mm units, of the main scale division to be counted. On the inch vernier, the main scale has .025 inch divisions with the divisions on the vernier scale .001 inch shorter than two divisions of the main scale. This feature makes the scale easier to read by doubling the spacing of the graduations, but the principle is still the same, giving a resolution of .001 inch.
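The arithmetic behind the metric vernier reading described above can be sketched in a few lines. This is an illustrative helper (the function name and arguments are assumptions, not anything from a real instrument library): the whole millimetres come from the main scale, and the coinciding vernier line index contributes multiples of 0,02 mm.

```python
def vernier_reading_mm(main_divisions, coinciding_line, vernier_step=0.02):
    """Combine main-scale and vernier-scale readings for a metric caliper.

    main_divisions:  number of whole 1 mm main-scale divisions to the left
                     of the vernier zero line.
    coinciding_line: index (0-49) of the vernier line that lines up exactly
                     with a main-scale line.
    vernier_step:    vernier resolution, 0.02 mm on a typical metric model.
    """
    return main_divisions * 1.0 + coinciding_line * vernier_step

# Example: vernier zero just past 34 mm, 17th vernier line coincides:
# reading = 34 + 17 * 0.02 = 34.34 mm
```

The inch vernier works the same way with a resolution of .001 inch; only the division sizes differ.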
How to Read Dial Calipers
Dial calipers are just as versatile as their vernier counterparts, with the added benefit of being easier to read because of the considerable mechanical scale magnification, which can be as great as 100 to 1. However, due to the complexity of the moving parts needed, this type is generally more expensive than the vernier caliper and is vulnerable to swarf and dust contamination.
How to Read Digital Calipers
Conventional digital calipers make use of a basic binary system: they have a series of light and dark bands under the slider and count these as the slider moves along the track. The system cannot tell where the slider is from the patterns on the track; it depends purely on storing the number of bands passed over. Because of this, with most digital calipers the jaws must first be closed and the display zeroed immediately after switching on, before making a measurement, so that the counter starts from a known reference.
As the vernier caliper can read any point within its range without having to reset zero, this digital reading system seemed cumbersome. That changed with the introduction of Mitutoyo’s ABSOLUTE digital caliper, which can read the slider location at any position and at any time, even after powering off, without needing to reset zero. According to one school of thought the best caliper is still the old-fashioned vernier caliper, which is simple to use, inexpensive, and just as accurate as the latest digital calipers. However, it is undeniable that vernier calipers are hard to read at times (particularly the inch versions), cannot convert seamlessly between inch and metric, and cannot switch between absolute and incremental measuring modes.
The ABSOLUTE type digital caliper takes advantage of the best of both worlds: analogue and digital. The ABSOLUTE digital caliper makes use of three sensors within the slider and three corresponding precision tracks embedded in the main beam. As the slider moves it reads the position of the tracks under these sensors and calculates its current absolute position. This eliminates the need for having to reset the caliper first, and thus avoids the hassle associated with conventional digital calipers.
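The idea of combining several tracks to pin down an absolute position can be illustrated with a toy model. This is not Mitutoyo's actual encoding scheme (the real sensor design is proprietary); it is a simplified sketch in which a coarse track identifies a 10 mm segment, a medium track identifies a 1 mm segment within it, and a fine interpolated fraction supplies the remainder, so no zero reset is ever needed.

```python
def absolute_position(coarse, medium, fine,
                      coarse_pitch=10.0, medium_pitch=1.0):
    """Toy model of an absolute readout: each track narrows the position
    down at a different pitch, so the slider's location can be computed
    directly instead of counted from a zero reference.

    coarse: index of the 10 mm segment the slider sits in
    medium: index of the 1 mm segment within that coarse segment
    fine:   interpolated fraction (0.0-1.0) of a 1 mm segment
    """
    return coarse * coarse_pitch + medium * medium_pitch + fine * medium_pitch

# Example: coarse segment 8, medium segment 4, fine fraction 0.73
# gives 80 + 4 + 0.73 = 84.73 mm, readable at power-on with no zeroing.
```

An incremental caliper, by contrast, would have to accumulate band counts from a zeroed starting point, which is exactly the step the ABSOLUTE design eliminates.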
Measuring Technique for Calipers
Open the jaws slightly and then close them. Repeat this process a few times, making sure the display indicates zero every time, spending no more than a few seconds on this check. If the caliper is of the ABSOLUTE type, zero setting is not required. Nevertheless, to make certain it is operating correctly, close the jaws and check zero.
Now you are ready to take measurements. Take more than one measurement, because the first tends to be poor, and continue measuring until the data start to repeat. In the example shown here, the digital caliper starts to read the same value (73,88) after the third trial, so the first and second readings can be disregarded as incorrect.
Measuring with a caliper should take only three to four seconds. The caliper must be wiggled or aligned to find the right orientation against the work surface. Force must be lightly applied: touch the workpiece and back off, touch again and back off.
Theoretically speaking, if the jaws and the workpiece are oriented correctly, the correct result is the smallest reading among repeated trials; larger readings are due to misaligned jaws. Assuming that 84,73 is the correct result, an operator should be able to validate it by repeating the measurement. They will soon be convinced that 84,73 is the right answer and 84,75 is not, as reading values other than 84,73 becomes difficult after a few trials. If the caliper does not repeat, the operator is not applying a consistent measuring force. For a hand tool such as this, it is the handling technique that produces the correct result. Personal bias between operators may amount to as much as 50 μm (.002 inch); with practice it should come down to zero.
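The repeat-until-stable rule of thumb above can be expressed as a short routine. This is only a sketch of the procedure described in the text (the function name is an assumption): keep recording trials, and accept the first value that comes up twice.

```python
def stable_reading(readings):
    """Return the first caliper value that repeats across trials,
    following the rule that early readings are unreliable and the
    measurement is trusted once the same value recurs.

    readings: sequence of trial values in the order they were taken.
    Returns None if no value has repeated yet (keep measuring).
    """
    seen = set()
    for value in readings:
        if value in seen:
            return value       # first repeated value: accept it
        seen.add(value)
    return None                # no repeat yet

# trials = [73.90, 73.92, 73.88, 73.88]
# stable_reading(trials) -> 73.88; the first two readings are discarded.
```

In practice the smallest repeated value is the one to trust, since misaligned jaws only ever make the reading larger.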
The measurement accuracy of the caliper, sometimes termed instrumental error, may be expressed in steps. The first step for the 0-200 mm (0-8 inch) range shows a measurement uncertainty of ±0,02 mm (±.000 8 inch) over this range (when set to zero with the jaws closed) which increases by around ±0,01 mm (±.000 4 inch) for each additional 200 mm in range thereafter. This trend continues all the way up to 1000 mm.
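The stepwise instrumental error just described can be written as a small function. This is a sketch of the rule stated in the text, not a manufacturer's specification table: ±0,02 mm over 0-200 mm, growing by about ±0,01 mm for each further 200 mm of range, up to 1000 mm.

```python
import math

def caliper_uncertainty_mm(length_mm):
    """Stepwise instrumental error per the rule in the text:
    +/-0.02 mm for 0-200 mm, plus +/-0.01 mm for each additional
    200 mm step, valid up to 1000 mm."""
    if not 0 <= length_mm <= 1000:
        raise ValueError("this model only covers 0-1000 mm")
    # number of whole 200 mm steps beyond the first range
    extra_steps = 0 if length_mm <= 200 else math.ceil(length_mm / 200) - 1
    return 0.02 + 0.01 * extra_steps

# 150 mm  -> +/-0.02 mm (first step)
# 500 mm  -> +/-0.04 mm (two extra steps)
# 1000 mm -> +/-0.06 mm (four extra steps)
```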
However, this reduction in accuracy can be avoided if gauge blocks are used to set a dial or digital caliper to the value (or close to the value) of the dimension being measured, such as 150 mm, as shown in the above example. By setting to gauge blocks in this way the inaccuracy of the caliper up to that point is, in effect, cancelled out: the caliper is no longer off by 0,03 mm but now reads 150,00 at 150 mm, so you would be justified in quoting an uncertainty (k=2) of ±0,01 mm for a measurement around this value. In the case of the vernier caliper, which cannot normally be adjusted, a calibration correction value would be noted instead and applied to subsequent measurements to achieve the same result.
For setting to larger values, the setting standard bars for use with micrometers can be used if long gauge blocks are not readily available. For example, a 300 mm standard bar is guaranteed to be within ±0,007 mm (±.000 25 inch for a 12 inch bar) of nominal size as bought, which is accurate enough to calibrate any 300 mm caliper. These bars also carry a calibration correction value (marked on one of the insulators) to the nearest micrometre for adding to the nominal length when maximum accuracy is required.