Paul Sagar
President
Albion Devices Inc.
Solana Beach, Calif.

Automatic temperature-compensation systems are cost-effective ways to minimize thermal errors in precision dimensional measurements. They sense the temperatures of the key elements in a measurement system (the workpiece, gage, and setting master), calculate the resulting thermal error, and output a correction signal. Applying the correction automatically to a precision gage lets measurements reflect true dimensions, as if all the elements were constantly held at the international reference temperature of 68°F (20°C).

By international convention, all dimensions are measured at 68°F, unless otherwise specified. This was the subject of the very first ISO standard ever published: ISO 1. It is supported by every dimensional standards organization in the world, including ANSI and NIST.

Any dimension measured at a temperature other than 68°F is affected by thermal expansion or contraction of the setting master (originally calibrated at 68°F), the workpiece, and/or the gage itself. Sometimes thermal variations of the measuring system’s elements offset one another, but usually they do not; it depends on the elements’ materials and geometry. If a measurement system must produce repeatable readings for critical dimensions, temperature effects are probably significant and should be minimized, and temperature compensation is usually the most cost-effective way to do so.

Temperature compensation is needed when a measurement system is required to control a tolerance which is small compared to an overall dimension. The tighter the tolerance, the more significant thermal variations will be on long-term shop-floor gage repeatability and reproducibility (GR&R). For example, a thermal variation of 0.0004 in. in a diameter with a tolerance of ±0.005 in. represents only 4% of total tolerance and might be disregarded. If the part tolerance is ±0.0005 in., however, a thermal variation of 0.0004 in. represents 40% of tolerance. If gages are expected to repeat within 10%, this degree of variation becomes a matter of concern.
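
The arithmetic above is simple to sketch. The following helper is purely illustrative (its name and signature are not from any vendor's software); it expresses a thermal variation as a fraction of the total tolerance band:

```python
def percent_of_tolerance(thermal_variation_in, tolerance_plus_minus_in):
    """Express a thermal variation as a percentage of the total
    tolerance band (twice the +/- tolerance), both in inches."""
    total_tolerance_in = 2.0 * tolerance_plus_minus_in
    return 100.0 * thermal_variation_in / total_tolerance_in

# 0.0004 in. of variation against a +/-0.005 in. tolerance -> 4%
# the same 0.0004 in. against +/-0.0005 in. -> 40%
```

At 40% of tolerance, the thermal variation alone consumes four times the 10% budget typically allowed for gage repeatability.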

The amount of thermal variation is affected by the expansion coefficient of the material in use, geometry, and the overall dimension being measured. The more material there is in the part, gage, or master, the more material there will be to expand or contract.

Aluminum and its alloys have a coefficient of thermal expansion roughly twice that of steels, whose coefficients are slightly greater than that of iron. The coefficients are roughly 13.0 parts per million per °F (23.4 ppm/°C) for aluminum and its alloys; 6.8 ppm/°F (12.2 ppm/°C) for steel and its alloys; and 5.6 ppm/°F (10.0 ppm/°C) for iron.

As a rule of thumb, temperature fluctuations significantly affect shop floor GR&R when total tolerances are specified to less than 1,000 ppm (or 1/1000th of dimension) for iron or steel workpieces, and 2,000 ppm (or 1/500th of dimension) for aluminum workpieces.

For example, a steel component measuring 3 in. in diameter with a tolerance of ±0.0005 in. has a total tolerance of 0.001 in., or 333 ppm, well under the 1,000-ppm threshold referred to earlier. A typical shop-floor temperature range over a year might be 30°F, say from 65 to 95°F. Over this temperature range the measured dimension will vary by 0.0006 in., or 60% of tolerance. If the part were made of aluminum, thermal expansion would be twice as much and would account for more than 100% of the total tolerance.
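
The example above is just the linear-expansion formula, ΔL = L × α × ΔT. A minimal sketch, with the function name and unit conventions chosen for illustration:

```python
def thermal_expansion_in(length_in, coeff_ppm_per_degf, delta_t_degf):
    """Linear thermal expansion: delta-L = L * alpha * delta-T,
    with alpha given in parts per million per degree F."""
    return length_in * coeff_ppm_per_degf * 1e-6 * delta_t_degf

# Steel, 3 in. diameter, 30 F shop-floor swing:
# 3 * 6.8 ppm/F * 30 F -> ~0.000612 in., about 61% of a 0.001 in.
# total tolerance
```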

Temperature-compensation systems sense the temperature of the workpiece and gage during measurement, and determine the temperature of the master when it is used to zero the gage. The system takes this information, along with coefficients and nominal dimensions, which are programmed into the systems during set up, to calculate a correction. The correction is output in real time to the host column or gaging system.
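
The correction described above can be sketched as follows. The simple two-term error model (workpiece and master only, linear coefficients, reading from a comparative gage zeroed on the master) is an assumption made for clarity; a real system also models the gage structure's own expansion:

```python
REF_TEMP_F = 68.0  # international reference temperature (ISO 1)

def correction_in(nominal_in,
                  alpha_work_ppm_f, temp_work_f,
                  alpha_master_ppm_f, temp_master_f):
    """Correction to add to a comparative gage reading zeroed on a
    setting master, so results read as if everything were at 68 F.
    Sketch only; gage-structure expansion is ignored here."""
    grow_work = nominal_in * alpha_work_ppm_f * 1e-6 * (temp_work_f - REF_TEMP_F)
    grow_master = nominal_in * alpha_master_ppm_f * 1e-6 * (temp_master_f - REF_TEMP_F)
    # The raw deviation reading is inflated by workpiece growth and
    # deflated by whatever growth the master had when the gage was
    # zeroed; the correction removes both.
    return -(grow_work - grow_master)
```

For instance, a 3-in. steel diameter measured at 98°F against a steel master that was used to zero the gage at 68°F needs a correction of about -0.000612 in.; if part and master are the same material at the same temperature, the two terms cancel and no correction is needed.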

Operators select which coefficients to use during system setup. The device’s user manual, or technical and scientific handbooks, contain values for these coefficients. (Coefficients are assumed to be linear, a safe assumption given the relatively narrow range of temperatures found on most shop floors.) Users who want more accurate results can have the system “characterized,” either themselves or through their vendor, to customize correction coefficients for specific applications.

Other approaches to eliminating the effects of temperature are available, but none is more cost-effective or accurate than automatic temperature compensation. Such systems can sense the temperature of each measuring-system element to within ±0.5°F (±0.3°C) and can correct for over 90% of thermal error, often as much as 95%.

Using the previous example, if temperature caused a 3-in. diameter to vary by 0.0006 in., a gage with temperature compensation would show thermally induced variation of less than 50 millionths of an inch (approximately 1.3 microns) over the entire temperature range. (Systems that compensate for only one or two of the elements will not reach this level of repeatability.)
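
That residual is consistent with the ±0.5°F sensing accuracy quoted above: each sensor's uncertainty leaves a small uncorrected expansion. A rough worst-case bound, under the simplifying assumption that the errors from the three sensors (workpiece, gage, master) simply add over the same nominal length:

```python
def residual_error_in(length_in, coeff_ppm_per_degf, sensor_err_degf,
                      n_sensors=3):
    """Worst-case uncorrected expansion left by temperature-sensor
    error, assuming all n_sensors errors add in the same direction."""
    per_sensor_in = length_in * coeff_ppm_per_degf * 1e-6 * sensor_err_degf
    return n_sensors * per_sensor_in

# 3 in. steel, +/-0.5 F sensors, three elements:
# ~0.0000306 in. (about 31 millionths), comfortably inside the
# <50 millionths figure quoted above
```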

An alternative to temperature compensation is temperature control, but it is more expensive to install and operate and usually less accurate. Only the most expensive air conditioning systems, such as those NASA or NIST might use, can maintain ±0.5°F control.

Benefits of temperature compensation
Gages display results as if workpiece, master, and gage were continuously at 68°F, the international reference temperature, regardless of their true temperatures.

Gages maintain repeatability and reproducibility while temperatures fluctuate.

Cpks of production equipment can be improved.

Gage correlation is enhanced, so measurements made at different times and places are in agreement.

Quality improvements reduce costs of rework, scrap, and warranty.

© 2010 Penton Media, Inc.