Who, what, where
By Nathan Naveh
Nathan is President of Compucraft Ltd., a company based in Israel, compucraftltd.com
Edited by Leslie Gordon leslie.gordon@penton.com

Automation in the sense of computer-driven machinery has long meant programmable controllers or PCs built into machines, usually with dedicated displays. Yet manufacturers often regard the king of automation — the industrial robot — as a “costly and complex” solution. Why?

One reason is that many plant managers consider an in-house robot engineer too costly, so the general feeling seems to be “robotics is not for us.” In a way, this makes sense: even programming machines such as CNCs requires trained, dedicated, high-level workers, and most plants can’t afford many employees like this, if any. For their part, robot manufacturers have failed to provide simple ways to direct robots. Robot programming has remained obscure, complex, and costly, and is thus thought to belong only to big-name OEMs.

Several robotics-software programs have tried to overcome these limitations by providing offline programming and simulation tools. There are even programs that link to a real robot controller and provide realistic robot visualization. However, these programs fall victim to the same difficulties: they are often too hard to understand, require long training, and cost tens to hundreds of thousands of dollars. While the software does help automobile makers and shipyards, little of it has targeted medium and small companies.

It’s easy to see why robotics is thought of as too complex. Users must deal with positions of objects in space, expressed numerically in an unfamiliar way. For example, most individuals would say, “A coffee mug is on my desk” and not “The mug is located at X = 2,100, Y = 1,450, Z = 770, RX = 0, RY = 0, RZ = 34.” Even the trivial task of picking the mug up from the desk and bringing it to your lips involves motion through space that passes through many point locations. Close to the desk, the cup undergoes mostly a location change because it gets raised with just a slight tilt. Closer to the mouth, in contrast, the cup undergoes mostly an orientation change because it gets tilted toward the lips with almost no raising.
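
To make the six-number notation concrete, here is a minimal Python sketch. The desk pose is taken from the example above; the pose at the lips and the naive straight-line blend between them are invented for illustration (real controllers interpolate orientation more carefully):

```python
from dataclasses import dataclass

@dataclass
class Pose:
    x: float   # location, mm
    y: float
    z: float
    rx: float  # orientation, degrees of rotation about X, Y, Z
    ry: float
    rz: float

def lerp(a: Pose, b: Pose, t: float) -> Pose:
    """Naively blend two poses, field by field, for 0 <= t <= 1."""
    return Pose(*(getattr(a, f) + t * (getattr(b, f) - getattr(a, f))
                  for f in ("x", "y", "z", "rx", "ry", "rz")))

desk  = Pose(2100, 1450, 770, 0, 0, 34)     # mug on the desk (from the text)
mouth = Pose(2100, 1300, 1200, 0, -60, 34)  # mug at the lips (made-up values)

# Ten of the many waypoints the "raise the mug" move passes through
path = [lerp(desk, mouth, i / 9) for i in range(10)]
```

Printing the intermediate poses shows Z changing steadily while RY tilts the mug over the whole move, which is exactly the kind of bookkeeping that feels unnatural to people but is routine for a controller.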

Fortunately, software such as RobotWorks that runs inside SolidWorks lets designers see what the robot will do and also determine where and how the part should be presented or fixed for manufacturing. This lets engineers design for manufacturing in CAD with robotics in mind.

A practical example comes from an oil pan for a large diesel engine, made of deep-drawn sheet metal. During production, silicone must be dispensed along its entire rim to later become a seal. This is done manually by a worker holding a dispenser, who moves the nozzle along the rim. Because the path is 3D and the nozzle angle relative to the metal must stay fixed, doing the job well requires a steady hand and good hand-eye coordination.

Many quality problems crop up in this scenario. For example, silicone-bead thickness is a function of dispensing pressure, nozzle angle, height above the metal, and hand speed, so it is not feasible to maintain consistent results across operators, shifts, and training levels. Also, the job is boring, repetitive, and tiring, so quality drops with time. This job has robotics written all over it.

Teaching a robot manually
One way to tell a robot how to move is by “teaching points on the robot,” or programming the robot by hand. This means moving the robot in its six axes (or fewer) and, on reaching the needed posture, recording that point. A brief discussion of robot axes will help make this process clear.

The position of any object in space has six dimensions, or degrees of freedom (DOF), measured for a reference point on the object relative to another coordinate system (for instance, the world). For example, to describe the location of a pencil numerically, the answer is made up of location (for instance, the pencil tip is at 500, 600, 200 in world XYZ) and orientation (for instance, the pencil body is rotated around world X by m degrees and around world Y by q degrees). The sixth dimension comes about because an object located at a fixed point in space and inclined at a known angle to two principal axes can still turn around its own axis.
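
As a sketch of the pencil example (angle values invented for illustration), the six numbers can be held as three translations plus three rotation angles, with the angles composed into a rotation matrix that pins down the orientation completely:

```python
import math

def rot(axis: str, deg: float):
    """3x3 rotation matrix about a principal axis, angle in degrees."""
    c, s = math.cos(math.radians(deg)), math.sin(math.radians(deg))
    return {
        "x": [[1, 0, 0], [0, c, -s], [0, s, c]],
        "y": [[c, 0, s], [0, 1, 0], [-s, 0, c]],
        "z": [[c, -s, 0], [s, c, 0], [0, 0, 1]],
    }[axis]

def matmul(a, b):
    """Multiply two 3x3 matrices."""
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

# Pencil tip at (500, 600, 200) in world XYZ; body rotated m deg about
# world X and q deg about world Y, plus a spin about its own axis
# (the sixth DOF). The angle values here are made up.
m, q, spin = 30.0, 45.0, 10.0
tip = (500.0, 600.0, 200.0)
R = matmul(rot("y", q), matmul(rot("x", m), rot("z", spin)))
# tip (3 numbers) and R (built from 3 angles) fix all six DOF.
```

Dropping the spin term leaves the pencil's tip and tilt unchanged but its rotation about its own axis undefined, which is why five numbers are not enough.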

Teaching the robot path around the oil pan might take an experienced robot engineer anywhere from 3 hr to two days. At each point, the engineer must stick his head under the robot, eyeball the distance between the nozzle and the part, and estimate the straight line between the current location and the previous point. But once the robot moves away from the previous point, there is nothing to show where it was.

Worse yet, a robot has at least three different coordinate systems. Think of the robot world as having a coordinate system like a SolidWorks assembly. The robot tool, like a part in an assembly, also has its own coordinate system. In addition, each robot joint or part of the arm can rotate around its own axis. An engineer under the arm trying to move the tool in World Z might make a mistake and instead move the tool in Tool Z. Such a mistake could break expensive tools such as cameras, lasers, or probes; damage parts and fixtures; or, worst of all, cause bodily injury. Also, the robot is out of production for as long as it takes to collect the points.
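
A short sketch shows why the World Z versus Tool Z mix-up matters. With the tool tilted (the 45-deg tilt and jog distance here are invented), the same "+10 mm in Z" command lands in two very different places depending on the frame:

```python
import math

def rot_y(deg):
    """3x3 rotation matrix about the Y axis, angle in degrees."""
    c, s = math.cos(math.radians(deg)), math.sin(math.radians(deg))
    return [[c, 0, s], [0, 1, 0], [-s, 0, c]]

def apply(m, v):
    """Apply a 3x3 matrix to a 3-vector."""
    return [sum(m[i][j] * v[j] for j in range(3)) for i in range(3)]

tool_tilt = rot_y(45)          # tool tilted 45 deg about world Y
pos = [1000.0, 0.0, 500.0]     # current tool position, mm

# Jog +10 mm in World Z: straight up, regardless of tool orientation.
world_jog = [pos[0], pos[1], pos[2] + 10]

# Jog +10 mm in Tool Z: along the tool's own tilted Z axis, so the
# tool also drifts sideways in world X.
tool_z_in_world = apply(tool_tilt, [0, 0, 10])
tool_jog = [p + d for p, d in zip(pos, tool_z_in_world)]
```

With the tool tilted 45 deg, the Tool Z jog moves about 7 mm up and 7 mm sideways instead of 10 mm straight up. Near a fixture or a camera, that 7-mm sideways surprise is exactly the kind of error that breaks hardware.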

Defining robot paths in CAD
The solution can be summed up in a nutshell: Let the part drive the robot. In other words, a part designed in SolidWorks can include features used to create robot paths. For example, during part design, the engineer might add a chamfer along the oil pan rim, not necessary for the design per se, but for the sole purpose of becoming a robot path. With the robotics software running inside SolidWorks, this operation is a matter of selecting the chamfer with three clicks. The entire calculation of the path in this case takes about 10 sec and is as accurate as the part itself.
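
The article does not show how RobotWorks computes the path internally, but the idea of turning a CAD edge into pose targets can be sketched roughly. Here a circular rim stands in for the chamfer edge (the radius, spacing, and tangent-following nozzle angle are all illustrative assumptions):

```python
import math

def rim_path(radius_mm: float, step_deg: int):
    """Sample six-number pose targets along a circular rim, a crude
    stand-in for the precise chamfer curve a CAD model would supply."""
    targets = []
    for deg in range(0, 360, step_deg):
        a = math.radians(deg)
        x = radius_mm * math.cos(a)
        y = radius_mm * math.sin(a)
        # Keep the nozzle vertical (RX = RY = 0) and point RZ along
        # the direction of travel, tangent to the rim.
        rz = (deg + 90) % 360
        targets.append((round(x, 2), round(y, 2), 0.0, 0.0, 0.0, float(rz)))
    return targets

path = rim_path(radius_mm=250, step_deg=10)  # 36 targets around the rim
```

The point of the comparison is the bookkeeping: every target the engineer would otherwise teach by eyeballing under the arm falls out of the geometry in one pass, at the accuracy of the model rather than of a human estimate.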

A caveat is that this maneuver might not constitute the whole job. Users might need to add application-specific information such as pressure and speed, but the main task is always to get the points. In fact, this is usually 70 to 80% of the whole job.

The generated path can now run in SolidWorks so designers can see potential collisions, verify robot reach, and add or modify fixtures. In fact, engineers can design an entire work cell in the software. In the oil pan example, the software shows the robot and oil pan without a workbench or any fixtures because the idea is to concentrate on the application: first find where the part should sit and, perhaps, which robot should do the job. The robotics program then dictates the work-cell elements and the best fixture location.

A chamfer along the edge of the oil pan (shown in purple) was added for the sole purpose of becoming a robot path.