A recent simulation program lets designers account for the randomness in engineering values that affects product performance.
Important variables for a particular analysis appear on the diagonal of a decision map. Despite many input variables, only five are most influential. Yellow circles represent inputs, while blue and red are outputs. Red connectors mark strong correlations and blue weaker ones, but both are statistically significant. A line through the second yellow circle at top left runs right and down to output 2 (a red dot), meaning input 1038 controls output 2.
Each run produces one point and represents results derived using input values selected by the Monte Carlo kernel. The most likely behavior occurs where points are densest. Outliers, the points farthest from the line, indicate potentially dangerous designs.
MSC.Robust Design from MSC.Software, Santa Ana, Calif., shows designers what factors or values most affect performance of a design and identifies combinations of variables that can lead to failures.
For example, material properties, loading conditions, thicknesses, dimensions, and more all vary randomly, affecting the performance and function of a design. However, most engineers perform a computer test on a single model with few load conditions to obtain a single result. "In the real world, every value and feature of the model and its environment have combinations of variability and uncertainty that must be considered to understand possible outcomes," says Jacek Marczyk, chief scientist, stochastic simulation with MSC.Software. "Results from the software are derived from randomly varying values to consider uncertainty and give a more realistic picture of how products will perform in the real world. The software pinpoints the most influential variables and, equally important, tells what conditions are unimportant and can be ignored."
Traditional design methods handle uncertainty in models with safety factors. But this has led to overengineering and excessive costs. Also, safety factors often leave something overlooked or unmodeled. It could be an unfortunate combination of factors and circumstances that leads to a catastrophic event. "The problem with safety factors is they provide no measure of the real safety levels in the structure," says Marczyk. "However, uncertainty can be taken into account the same way it manifests itself in nature, without relying upon safety factors. We can simulate reality and manage uncertainty within product development with stochastic analysis. The advantage of doing this hinges on a simple point -- models incorporating uncertainty are realistic and, as engineers, it is our duty to get as close as possible to reality before anything is brought to market."
Marczyk adds that optimal designs and robust designs are mutually exclusive concepts. "The optimal, safety-factor-laden designs may be conceived to be correct and manufactured, but they're not too healthy," he says. "They have a tendency to drift to states of lower performance. In practice, you will almost always get less than what you think you'll get when designing an optimal system. This is why it's a good idea to favor robustness and not optimality in complex engineering systems."
The quest for accuracy has also led some to think large models are better models. "Engineers often assume they are building in accuracy when they build a million-element model," says Marczyk. "But a Principle of Complexity suggests that the more interacting components in a system, the less precise statements can be made as to the system's performance and behavior. For example, uncertainty in materials or variations in loading conditions can exceed 20 or 30%. Adding more decimals or more computing horsepower won't significantly change the outcome."
The software does not restrict users to a small number of variables, as design-of-experiment (DOE) methods do. DOE users need three to four solver runs per variable, so with 40 design variables they would run the solver 120 to 160 times to get a response surface. Stochastic simulations, on the other hand, handle thousands of design variables yet still need only about 100 iterations.
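The run-count comparison can be sketched in a few lines. The 3-to-4-runs-per-variable rule of thumb and the roughly constant 100-iteration Monte Carlo budget come from the article; the function names are illustrative, not anything from MSC's software:

```python
# Rough comparison of solver-run counts: DOE-style response-surface methods
# scale with the number of design variables, while the stochastic approach
# described in the article uses a fixed budget of about 100 iterations.

def doe_runs(n_vars: int, runs_per_var: int = 3) -> int:
    """DOE rule of thumb: three to four solver runs per design variable."""
    return n_vars * runs_per_var

MONTE_CARLO_RUNS = 100  # roughly constant regardless of variable count

for n in (10, 40, 1000):
    print(f"{n:>5} variables: DOE ~{doe_runs(n)}-{doe_runs(n, 4)} runs, "
          f"Monte Carlo ~{MONTE_CARLO_RUNS} runs")
```

The crossover comes quickly: past a few dozen variables, the fixed Monte Carlo budget already undercuts the DOE run count.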
The stochastic software currently interfaces with MSC.Nastran and can perform analyses of structural systems. Soon the software will also be interfaced with all MSC.Software's solvers including MSC.Dytran, for short-duration dynamic events involving the interaction of fluids and structures, and other third-party software.
Users set up models almost the same way as in standard FEA studies but need to specify the statistical spread in engineering values of the variables under consideration. Monte Carlo techniques in the software randomly select properties and values from within prescribed ranges for each of the variables. Users can introduce tolerances and uncertainties on every entity in the model. For example, if there are hundreds of different materials in an assembly, common in a car, each could have stochastic properties. Each run produces one data point.
The method, however, rarely needs more than 100 analyses to calculate useful information. The noise in engineering data is usually so high that 100 iterations give sufficient precision and confidence in the output. One advantage of the Monte Carlo method is that the cost of running the simulation is independent of the number of variables.
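A minimal Monte Carlo loop in the spirit described here might look like the following sketch. Everything in it is illustrative: the input spreads, the toy cantilever-deflection "analysis" standing in for a real FEA run, and the 100-iteration budget are assumptions, not MSC's implementation:

```python
import random

random.seed(0)

# Hypothetical inputs, each with a nominal value and a standard deviation --
# the "statistical spread in engineering values" the user must specify.
inputs = {
    "youngs_modulus": (210e9, 10e9),   # Pa
    "thickness":      (0.02, 0.001),   # m
    "load":           (1500.0, 150.0), # N
}

def analysis(E, t, F, L=1.0, w=0.1):
    """Toy stand-in for a solver run: tip deflection of a cantilever beam."""
    I = w * t**3 / 12.0          # second moment of area
    return F * L**3 / (3.0 * E * I)

# Each iteration samples every input from its spread and runs one analysis;
# each run produces one data point, as the article describes.
results = []
for _ in range(100):
    sample = {k: random.gauss(mu, sigma) for k, (mu, sigma) in inputs.items()}
    results.append(analysis(sample["youngs_modulus"],
                            sample["thickness"],
                            sample["load"]))

print(f"deflection spread: {min(results):.4f} to {max(results):.4f} m")
```

The spread between the minimum and maximum calculated deflections is exactly the kind of quality measure the software reports in its spreadsheet columns.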
Rather than optimizing designs as some programs do, the module can be used to set goals. The software takes this approach when the user specifies required targets for calculated values.
The software draws up spreadsheets of calculated values, charts, a decision map, and a design-improvement feature that lets users design a system to target. Two of the spreadsheet columns are for maximum and minimum calculated values. "These describe the spread or quality of the performance. The narrower the spread, the better the design in terms of quality. An important result of a stochastic simulation is the most probable value of each output variable," says Marczyk.
Users can impose limits on these values. For instance, it might be necessary to reject designs producing a first frequency below a certain value. The results will then show what combinations of engineering values produce the unwanted value, and designers can work to avoid them. A decision map, another important output, presents the big picture. It pinpoints variables with the most influence, those with lesser influence, and those with none.
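The influence ranking behind a decision map can be approximated with plain correlation coefficients between sampled inputs and a computed output. This is a hedged sketch: the three inputs, the output model, and the strong/weak thresholds are invented for illustration, and the article does not describe MSC's actual significance test:

```python
import random, math

random.seed(1)

n = 100
# Three hypothetical inputs; by construction only two actually drive the output.
x1 = [random.gauss(0, 1) for _ in range(n)]
x2 = [random.gauss(0, 1) for _ in range(n)]
x3 = [random.gauss(0, 1) for _ in range(n)]
y  = [3*a + 1.5*b + random.gauss(0, 0.3) for a, b in zip(x1, x2)]  # x3 unused

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    cov = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
    sx = math.sqrt(sum((a - mx) ** 2 for a in xs))
    sy = math.sqrt(sum((b - my) ** 2 for b in ys))
    return cov / (sx * sy)

# Classify each input's influence -- the thresholds are arbitrary stand-ins
# for the red (strong) and blue (weaker) connectors on a decision map.
for name, xs in [("x1", x1), ("x2", x2), ("x3", x3)]:
    r = pearson(xs, y)
    label = ("strong" if abs(r) > 0.7 else
             "weak" if abs(r) > 0.35 else "negligible")
    print(f"{name}: r = {r:+.2f} ({label})")
```

Inputs classed as negligible correspond to the conditions the software flags as unimportant and safe to ignore.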
A two-day course should be sufficient to learn the software, says Marczyk, "although I have trained colleagues over the phone."
In addition, the software runs on almost any computer, from laptops to Linux clusters. "One hundred runs may take a weekend for small jobs on a laptop, or a matter of minutes on a large cluster," says Marczyk. Early adopters include major automotive OEMs in Europe and Asia. One reportedly trimmed as much as 16 kg off the mass of a car model through use of stochastic techniques. The reductions stem from minor adjustments to material thicknesses, not topology changes.