Don Day
Tec-Ease Inc.
Cherry Creek, N.Y.

Martin Raines,
Ken Swift
CapraTechnology Ltd. U.K.

A PCT-based program called Tolerance Capability Expert helped engineers redesign this bobbin used in a fuel injector so it could be eliminated from the tolerance stack. The design changes slashed scrap rate, saving the manufacturer over $5 million annually.

Quality costs may consume 25% of total revenues in manufacturing businesses, mostly from rework, scrap, warranty, product-liability claims, and recalls. The Six Sigma quality initiative helps companies compare design tolerances with process variation based on historical data. In fact, many tolerance-analysis software packages include this feature, though most are not user friendly and need an expert to operate.

But a methodology called process-capable tolerancing (PCT) simplifies the job. PCT helps engineers design components and products that are robust to process variation. It sets process-capability targets early in the design phase and predicts the success at reaching them, minimizing quality-failure costs later on. PCT uses a process-capability index, Cp, to compare allowable tolerance deviation with process variation, which is a measure of process precision:

Cp = T/(6σ)

where T = design spec width and σ = the standard deviation of a population. Standard deviation is a measure of the dispersion or variation in a distribution. An example should help clarify the concept.

A machine-shop manager wants to quantify the capability of a CNC turret lathe at cutting external diameters of various sizes. The lathe typically does short runs, so he decides to measure critical, external diameters of parts in the range from 20 to 30 mm, and chart the deviation.

This OD turning process, like most manufacturing processes, approximates a normal distribution or bell-shaped curve. Here, about 68.3% of parts fall within ±σ of the average, 95.4% within ±2σ, and 99.73% within ±3σ. In this case, σ = 0.005 mm, so:

Cp = 0.01 / (6 x 0.005) = 0.33

In other words, roughly 68% of parts would be in tolerance, 16% would be oversize (and could possibly be reworked), and about 16% would be undersize (scrap). Had the design engineer relaxed tolerances to ±0.01 mm (±2σ), then Cp = 0.67, and about 95% of the parts would be in tolerance. Taking it a step further, setting tolerance to ±0.015 mm (±3σ, Cp = 1) would yield roughly 99.73% in-spec parts, or about three bad parts per 1,000.
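The Cp arithmetic above can be sketched in a few lines of Python. This is a minimal illustration: the 0.005-mm standard deviation and the candidate tolerances come from the turret-lathe example, and `math.erf` supplies the in-spec share of a centered normal distribution.

```python
import math

def cp(spec_width, sigma):
    """Process-capability index: design spec width over the 6-sigma spread."""
    return spec_width / (6 * sigma)

def in_spec_fraction(half_tolerance, sigma):
    """Share of a centered normal distribution inside +/- half_tolerance."""
    return math.erf(half_tolerance / (sigma * math.sqrt(2)))

sigma = 0.005  # mm, from the turret-lathe example

for tol in (0.005, 0.010, 0.015):  # +/- tolerances of 1, 2 and 3 sigma
    print(f"+/-{tol} mm: Cp = {cp(2 * tol, sigma):.2f}, "
          f"in spec = {100 * in_spec_fraction(tol, sigma):.2f}%")
```

Run as written, this reproduces the article's numbers: Cp of 0.33, 0.67, and 1.00, with roughly 68%, 95%, and 99.7% of parts in spec.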

This raises the question: When is a design considered robust enough? Historically, Cp = 1 (±3σ) was an acceptable target, though most manufacturers today aim for ±6σ. For the design in the above example to meet ±6σ standards, it would need to function properly when critical diameters are held to within ±0.03 mm, or ±6× the standard deviation. This, of course, assumes the process remains centered about the average, which isn't always the case. Tools wear, machines heat up and cool down, and raw material may not be consistent from batch to batch.

An index called Cpk quantifies these variations. It describes the float or drift of the distribution of part dimensions relative to design specs. Cpk is an indicator of how close the process mean is to the nearest spec limit, and compares that distance to the dispersion of parts about the mean, which is a measure of process accuracy. In statistical terms:

Cpk = x/(3σ)

where x = the difference between the nearest spec limit and process mean. The numerator is controlled by the tolerance limits of the design, while drift and variation in the manufacturing process influence the denominator. As such, Cpk may be used to predict the probability of defects.
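As a sketch, Cpk can be computed directly from the spec limits and the process mean. The 25-mm nominal below is hypothetical; the ±0.03-mm tolerance and 0.005-mm standard deviation follow the turret-lathe example.

```python
def cpk(lower, upper, mean, sigma):
    """Cpk: distance from the process mean to the nearest spec limit,
    expressed in units of 3 sigma."""
    return min(upper - mean, mean - lower) / (3 * sigma)

# Hypothetical 25-mm nominal with a +/-0.03-mm (6-sigma) tolerance:
print(round(cpk(24.97, 25.03, 25.0, 0.005), 2))     # centered process -> 2.0
print(round(cpk(24.97, 25.03, 25.0075, 0.005), 2))  # mean drifted 1.5 sigma -> 1.5
```

Note how Cpk equals Cp when the process is centered and falls as the mean drifts toward a spec limit.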

Some design engineers claim the function of a given design sets tolerances. Actually, it is the selected design alternative that sets tolerances. Some alternatives are simply more robust than others. In general, designs should use the largest possible tolerances consistent with proper function.

In the turret-lathe example, had the tolerance been specified as ±0.03 mm (±6σ), and had the mean of the process stayed within 0.0075 mm (1.5σ) of center, then:

Cpk = (0.03 - 0.0075)/(3 x 0.005) = 1.5

Converting Cpk to a Z score (Z = 3 × Cpk = 4.5 here) and a quick check of probability tables show the likelihood of producing defective parts would be about 3.4 ppm or less.
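The table lookup can also be done numerically. A small sketch using the complementary error function for the one-sided normal tail reproduces the roughly 3.4-ppm figure for Cpk = 1.5:

```python
import math

def defect_ppm(cpk_value):
    """Defects per million from the one-sided normal tail beyond
    Z = 3 * Cpk standard deviations."""
    z = 3 * cpk_value
    return 0.5 * math.erfc(z / math.sqrt(2)) * 1e6

print(f"{defect_ppm(1.5):.1f} ppm")  # Cpk = 1.5 -> about 3.4 ppm
print(f"{defect_ppm(1.0):.0f} ppm")  # Cpk = 1.0 -> about 1,350 ppm
```

The Cpk = 1.0 case corresponds to the historical ±3σ target with a 1.5σ mean shift, which is why modern programs consider it inadequate.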

In practice, the selection of tolerances mostly relies on handbook data, guesswork, and designer experience. But none of these methods mesh with quality initiatives at today's world-class companies. Tolerance data found in handbooks, for example, are frequently based on ±3σ standards and have not been revised to reflect modern quality initiatives.

Overly tight tolerances associated with seat-of-the-pants engineering design and outmoded standards can cause headaches for manufacturing that designers may never hear about. One reason: Quality and manufacturing departments seldom share process data with design departments. Crunching process data into a form useful for designers can be a daunting task. The problem is compounded when suppliers get involved. Suppliers typically submit preproduction samples, though these parts are rarely representative of production versions.

PCT assumes each process needed to complete an "ideally designed" part, whether done in-house or outsourced, has a level of inherent variability consistent with good manufacturing practice. Process-capability maps plot tolerances for an ideal design against characteristic dimensions, and are key to quantifying Cpk.

PCT now incorporates some 70 maps covering a variety of processes, from sand casting to honing. In addition, the analysis shows how factors including part geometry and material selection influence process variability. Machinability or formability ratings of materials, and the location of parting lines or long unsupported sections in castings, are just some of the data employed in an assessment.

Designers should also consider the severity of potential failures. Here, a technique called Failure Mode and Effects Analysis (FMEA) assigns a Severity Rating and maps it with occurrence probability and Cpk. These so-called conformability maps serve as a risk-assessment tool.

PCT IN ACTION
A program called Tolerance Capability Expert (TCE) incorporates PCT principles and lets designers allocate process-capable limits to design dimensions. The expert system runs on a Windows PC, either stand-alone or within a suite of engineering tools. The expert itself is coded in C++, and its object-oriented architecture eases expansion of the manufacturing-process "library." An analysis begins with the selection of a process and design characteristic and the determination of the "ideal" Cpk value for a given tolerance. A wizard guides users through checks of the effects of design geometry and material type on process capability. The TCE analysis of the tolerance stack for a fuel-injector solenoid assembly provides an example.

In the original design, the bobbin touches the solenoid body and is part of the tolerance stack, which comprises six dimensions and tolerances. A plunger-displacement spec of 0.8 ± 0.2 mm means the ±0.2 mm must be budgeted, or shared, among six dimensions. Five of the dimensions and tolerances are under the control of the solenoid maker. The sixth is associated with the fuel port to which the solenoid mounts, and therefore is fixed by the supplier.

In the redesign, the magnetic pole is enlarged. It is molded into and passes through the molded-over bobbin. The magnetic pole touches the body directly and the bobbin floats (clears the body), so it is no longer in the tolerance stack, cutting the number of dimensions to five. In other words, the ±0.2 mm plunger displacement tolerance is now budgeted or shared by just five dimensions, allowing more generous tolerances to those dimensions whose predicted Cpk(P) was not up to their target Cpk(T) in the original design. The redesigned tube did not need the ±0.025 mm tolerance to meet its target Cpk(T), so it had its tolerance lowered to ±0.015 mm, making available the remaining ±0.01 mm tolerance for other parts in the stack.
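The reallocation can be checked with a quick worst-case sum of the in-house tolerances from the TCE analysis table. This is a sketch only: the fixed fuel-port tolerance (the sixth dimension) is not given in the article, so the code computes just the budget left over for it.

```python
# Worst-case tolerance stack against the +/-0.2-mm plunger-displacement budget.
# +/- tolerances (mm) are taken from the TCE analysis table.

original = {
    "bobbin": 0.035,
    "body": 0.02,
    "plunger": 0.05,
    "magnetic pole": 0.02,
    "tube": 0.025,
}

redesign = {  # bobbin floats, so it drops out of the stack
    "body": 0.035,
    "plunger": 0.08,
    "magnetic pole": 0.02,
    "tube": 0.015,
}

budget = 0.2  # mm, the +/- plunger-displacement spec

for name, stack in (("original", original), ("redesign", redesign)):
    used = sum(stack.values())
    print(f"{name}: {len(stack)} in-house dimensions use +/-{used:.3f} mm, "
          f"leaving +/-{budget - used:.3f} mm for the fuel-port dimension")
```

Both stacks consume ±0.15 mm in-house, showing the redesign's gain: the same budget now covers one fewer dimension, so the surviving dimensions each get more room.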

The bottom line: Low Cpk values of the original design meant high failure (scrap) rates, the cost of which was calculated at over $6 million annually against product sales revenues of around $16 million. The redesigned assembly has a calculated failure cost of just $6,000 annually. The additional machining needed for the redesigned body was not significant in the calculations.

 

TCE ANALYSIS OF FUEL-INJECTOR SOLENOID ASSEMBLY
(Dimensions in mm. * = not in tolerance stack. For each numbered part, the first row is the original design and the second row is the redesign.)

NO. | DESCRIPTION   | PROCESS (MATERIAL)                              | DIMENSION | ±TOLERANCE | CPK(T) | CPK(P) | COMMENT
1   | Bobbin        | Injection mold (PBT plastic with insert)        | 22        | 0.035      | 1.38   | 0.05   | Not capable
    |               | Same                                            |           |            |        |        | No longer in tolerance stack
2   | *O-ring       | (Rubber)                                        |           |            |        |        |
3   | Body          | Impact extrusion (forming steel)                | 3         | 0.02       | 1.38   | 0.05   | Not capable
    |               | Impact extrusion (forming steel, then machined) | 23        | 0.035      | 1.3    | 3.47   | Capable
4   | Plunger       | Silicone rubber molded onto steel               | 28        | 0.05       | 1.38   | 0.19   | Not capable
    |               | Same                                            | 28        | 0.08       | 1.38   | 1.44   | Capable
5   | Magnetic pole | Impact extrusion (forming steel)                | 8         | 0.02       | 1.38   | 0.05   | Not capable
    |               | Machined (free-cutting steel)                   | 6         | 0.02       | 1.38   | 3.4    | Capable
6   | *Spring       | (Steel)                                         |           |            |        |        |
7   | Tube          | Deep drawn (brass)                              | 0.2       | 0.025      | 1.38   | 3.62   | Capable
    |               | Same                                            | 0.2       | 0.015      | 1.38   | 3.13   | Capable
8   | *Coil         |                                                 |           |            |        |        |