56.2 Literature Survey and Commercial Solutions 

The presence of errors and uncertainties in the robotic work cell poses a great challenge to the effective implementation of off-line robot programming. These errors and uncertainties stem from a number of different sources, such as manufacturing defects, inertial loading, encoder errors, calibration errors, and mechanical wear within the robotic manipulator. Together, these error sources produce a deviation between the nominal base-tool transformation ^1T_t^N, obtained from the nominal kinematic model of the robot, and the actual base-tool transformation ^1T_t^A, as illustrated in Fig. 1. The errors in Fig. 1 have been deliberately exaggerated to highlight the difference in kinematic loops between the nominal and actual work cells. Another major source of error is alignment and datuming issues between the robot base and the work object, which can be exacerbated by additional motion systems within the work cell, such as turntables that move the work object or tracks and gantry systems that move the robot. This results in a deviation between the nominal and actual robot base-work object transformations, represented by ^1T_0^N and ^1T_0^A, respectively, as illustrated in Fig. 1.



Fig. 1 Nominal and actual transformation matrices
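The deviation between the nominal and actual base-tool transformations can be sketched numerically with homogeneous matrices. The following illustration uses entirely hypothetical numbers (a small rotational and translational error) to show how the deviation ^1T_t^N vs. ^1T_t^A would be quantified:

```python
import numpy as np

def transform(rotation_z_deg, translation):
    """Homogeneous 4x4 transform: rotation about z followed by a translation."""
    c, s = np.cos(np.radians(rotation_z_deg)), np.sin(np.radians(rotation_z_deg))
    T = np.eye(4)
    T[:3, :3] = [[c, -s, 0], [s, c, 0], [0, 0, 1]]
    T[:3, 3] = translation
    return T

# Hypothetical nominal and actual base-tool transforms ^1T_t^N and ^1T_t^A.
T_nominal = transform(30.0, [0.80, 0.20, 0.50])     # from the nominal kinematic model
T_actual = transform(30.4, [0.802, 0.197, 0.503])   # what the physical robot achieves

# Deviation expressed in the nominal tool frame: dT = (T_N)^-1 * T_A.
dT = np.linalg.inv(T_nominal) @ T_actual
position_error = np.linalg.norm(dT[:3, 3])          # metres
print(f"tool position error: {position_error * 1e3:.2f} mm")
```

If dT were the identity, the nominal and actual kinematic loops in Fig. 1 would coincide; in practice the translational and rotational parts of dT are exactly what calibration and compensation methods try to drive toward zero.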

One approach to error compensation is to improve the kinematic model by accurately modeling the physical and dynamic properties of these error sources. The parameters within the model can then be identified through calibration methods, which utilize measurement pairs of robot end-effector poses and robot joint angles. This approach has been proposed and developed both in academia (Low 2006; Veitschegger and Wu 1988; Zhuang et al. 1994; Yang et al. 2002; Zhao et al. 2006; Chen et al. 2008; Mustafa et al. 2010; He et al. 2010) and commercially, in the form of Calibware™ from ABB, ROCAL™ from Nikon Metrology, and DynaCal™ from Dynalog.
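As a toy illustration of identification from pose-joint measurement pairs, the sketch below calibrates the link lengths of a planar two-link arm by least squares. All numbers are hypothetical, and real systems identify far richer models (dozens of parameters, often nonlinearly); the planar case is chosen because tip position is linear in the link lengths:

```python
import numpy as np

def tip_position(lengths, q):
    """Forward kinematics of a planar 2-link arm (a stand-in for a full robot model)."""
    l1, l2 = lengths
    return np.array([l1 * np.cos(q[0]) + l2 * np.cos(q[0] + q[1]),
                     l1 * np.sin(q[0]) + l2 * np.sin(q[0] + q[1])])

true_lengths = np.array([0.502, 0.398])   # actual (unknown) link lengths
nominal_lengths = np.array([0.500, 0.400])  # nominal model parameters

# Measurement pairs: joint angles and the measured tip position (e.g. from a tracker).
rng = np.random.default_rng(0)
joints = rng.uniform(-np.pi / 2, np.pi / 2, size=(20, 2))
measured = np.array([tip_position(true_lengths, q) for q in joints])

# Tip position is linear in the link lengths, so least squares identifies them directly.
A = np.array([[[np.cos(q[0]), np.cos(q[0] + q[1])],
               [np.sin(q[0]), np.sin(q[0] + q[1])]] for q in joints]).reshape(-1, 2)
identified, *_ = np.linalg.lstsq(A, measured.reshape(-1), rcond=None)
print("identified link lengths:", identified)
```

With noisy measurements the same least-squares fit averages out the sensor error, which is why commercial calibration packages collect many poses distributed across the workspace.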
Recently, a number of related works in the academia have focused on cost-effective methods to obtain data for robot calibration. In (Ha 2008), measurements from a laser point distance sensor mounted on the robot end effector and a grid plate placed in the robot work cell are used to determine the position of the robot end effector for calibration purposes. A laser line scanner attached to the end effector and an artifact positioned in the robot work cell are used to conduct robot calibration in (Wang et al. 2009). In (Gatla et al. 2007), robot calibration is performed using cameras and a laser pointer attached to the robot end effector to obtain the required measurement information.
Robot calibration solutions are also available commercially including, but not limited to, Absolute Accuracy from ABB (Low 2006; Gunnarsson et al. 2006), MotoCal (2011; MotoCalV EG 2011) from Yaskawa Motoman Robotics, and DynaCal™ (Cheng 2007) from Dynalog.
Absolute Accuracy from ABB is a robot option that is only available for ABB robots. A unique kinematic model is identified for each robot using PC-based software, Calibware™. The software is used with a 3D measurement system to obtain up to 100 robot end-effector poses distributed throughout the robot workspace. This measurement information is then used in Calibware™ to fine-tune up to 40 parameters within the robot model (Low 2006; Gunnarsson et al. 2006) to better reflect the actual robot behavior. Yaskawa Motoman Robotics offers a similar calibration solution for its robots, known as MotoCal (2011; MotoCalV EG 2011), with the purpose of increasing robot accuracy and facilitating the replication of robot work cells.
A similar approach is adopted in the DynaCal™ (Cheng 2007) system from Dynalog. In addition to the calibration software, the company also produces a measurement system, the CompuGauge™ Robot Measurement and Performance Analysis System. This measurement system consists of four extensible measurement cables that are attached to a single point on the robot end effector. The end-effector position of the robot is then determined using optical encoders to measure the cable lengths. Unlike the ABB Absolute Accuracy robot option, which is only available for ABB robots, DynaCal™ can be used with industrial robots from different robot manufacturers.
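The principle behind a cable-based measurement system of this kind can be sketched as a trilateration problem: given the anchor positions of the cables and the encoder-measured cable lengths, the attachment point is recovered by least squares. The anchor layout and coordinates below are hypothetical, not CompuGauge™ specifications:

```python
import numpy as np

# Hypothetical anchor positions of the four measurement cables (metres).
anchors = np.array([[0.0, 0.0, 0.0], [2.0, 0.0, 0.0],
                    [0.0, 2.0, 0.0], [2.0, 2.0, 0.1]])
tool = np.array([0.9, 1.1, 0.7])                  # ground truth for the demo
lengths = np.linalg.norm(anchors - tool, axis=1)  # encoder-measured cable lengths

# Subtracting the first sphere equation |p - a_i|^2 = l_i^2 from the others
# linearises the problem:
#   2 (a_i - a_0) . p = |a_i|^2 - |a_0|^2 - (l_i^2 - l_0^2)
A = 2.0 * (anchors[1:] - anchors[0])
b = (np.sum(anchors[1:] ** 2, axis=1) - np.sum(anchors[0] ** 2)
     - (lengths[1:] ** 2 - lengths[0] ** 2))
position, *_ = np.linalg.lstsq(A, b, rcond=None)
print("recovered tool position:", position)
```

Four cables give one redundant measurement over the three position unknowns, which a real system can exploit to reject outliers or average out encoder noise.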
Instead of modeling and compensating for all the errors directly, an alternative solution is to use measurement systems to identify the relative poses of the robot tool center and the work object in the sensor frame. If this measurement is accurate, the end-effector tool center point can be guided to follow the desired path on the work object, effectively reproducing the results of the virtual work cell and indirectly compensating for all the error sources within the robotic manipulator and the work cell. Commercial solutions adopting this approach include the Adaptive Robot Control system from Nikon Metrology (Adaptive robot control), as well as vision (True View 5.12™ Vision Guided Robotics 2009) and probing systems.
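The idea of compensating indirectly through measured relative poses can be sketched as follows: the measurement system observes both the tool and the work object in its own sensor frame S, and the correction is derived from the measured relative transform rather than from the robot's (possibly inaccurate) kinematic model. All poses below are hypothetical illustrations:

```python
import numpy as np

def transform(tx, ty, tz, yaw_deg=0.0):
    """Homogeneous transform: yaw about z plus a translation (demo helper)."""
    c, s = np.cos(np.radians(yaw_deg)), np.sin(np.radians(yaw_deg))
    T = np.eye(4)
    T[:3, :3] = [[c, -s, 0], [s, c, 0], [0, 0, 1]]
    T[:3, 3] = [tx, ty, tz]
    return T

# Poses of the tool and the work object as seen by the measurement system (hypothetical).
S_T_tool = transform(1.0, 0.5, 0.8, yaw_deg=10)
S_T_obj = transform(1.6, 0.4, 0.3, yaw_deg=-5)

# Desired waypoint on the work object, defined off-line in the object frame.
obj_T_waypoint = transform(0.05, 0.10, 0.02)

# Relative pose of the object w.r.t. the tool, measured rather than modelled, so
# all robot and fixturing errors are folded into it.
tool_T_obj = np.linalg.inv(S_T_tool) @ S_T_obj

# Motion the tool must make, expressed in its own frame, to reach the waypoint.
tool_T_target = tool_T_obj @ obj_T_waypoint
print("required tool-frame displacement:", tool_T_target[:3, 3])
```

Because every quantity in the correction chain comes from the sensor, the computed displacement is valid even when the robot's nominal base-tool model is wrong, which is the essence of this family of solutions.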
In the Adaptive Robot Control system, an optical coordinate-measuring machine (CMM) consisting of three cameras is used to capture LEDs mounted on the work object and the robot end effector. The pose of the work object relative to the tool frame is established from the positional information of the LEDs obtained by the optical CMM. With this information, the system guides the robot to move the tool through the desired waypoints on the work object. According to the manufacturer, this system can achieve an accuracy of up to 200 μm.
Vision systems are commonly used with robotic systems to establish the pose of the work objects relative to the robot system. These systems are generally limited to pick-and-place operations that do not require high levels of accuracy. For example, ABB's vision systems, such as TrueView (2009) and PickMaster, integrated with ABB robots, achieve an accuracy of 0.5–1 mm.
Adept Technology provides a machine-vision processor, known as Adept SmartVision™ EX (Adept SmartVision™ EX), that can be used for tracking and robot guidance applications. The machine-vision system uses a camera for data acquisition and can be integrated with Adept’s systems or act as a standalone vision inspection system.
FANUC Robotics has developed its own vision system, known as integrated Robot Vision (iRVision introduction; iRVision), which can be integrated with its robot controllers. The vision system can be used as a two-dimensional (2D) vision system incorporating a single camera for pick-and-place operations on a planar surface. Alternatively, it can be used to capture 3D information by augmenting the camera system with a laser line scanner (iRVision introduction).
Kawasaki Heavy Industries also offers similar vision systems (2012) that can be integrated with its robots. Similar to iRVision from FANUC, the Kawasaki vision system can capture and process both 2D information, using a single camera, and 3D information, which can be obtained from a stereo vision system, a 3D laser sensor, or a laser slit scan camera. In addition to vision systems developed by the robot manufacturers, third-party vision systems (Cognex vision systems in automation 2014) can also be integrated with robots to provide visual guidance.
As an alternative to vision systems, contact-based sensors (White paper: Survival of the fittest – the process control imperative 2011) such as touch probes can be mounted on the robot end effector to establish the relative pose of the tool and work object. In this approach, the robot moves the probe toward a few defined points around the work object and continues moving the probe until contact is registered. This allows the robot to identify the relative positions of identified features on the work object, thus establishing the pose of the work object relative to the robot base frame. This approach relies on using the robot itself as a measurement device, so the resulting accuracy is limited by the accuracy of the robot. In addition, this approach requires the work object to have distinct surfaces or features that can be probed in order to establish the relative poses between the work object and the robot base or tool frames.
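A common way to turn a handful of probed contact points into a work-object frame is the three-point method: one point defines the origin, a second the x-axis direction, and a third pins down the plane. The coordinates below are hypothetical; a real probing cycle would compensate for the probe tip radius as well:

```python
import numpy as np

# Three probed contact points on the work object (hypothetical, in robot base frame):
# a corner, a point along one edge, and a third point on the top face.
p_origin = np.array([0.60, 0.20, 0.10])
p_xaxis = np.array([0.75, 0.22, 0.10])
p_plane = np.array([0.62, 0.35, 0.10])

# Build an orthonormal work-object frame from the probed points.
x = p_xaxis - p_origin
x /= np.linalg.norm(x)
z = np.cross(x, p_plane - p_origin)
z /= np.linalg.norm(z)
y = np.cross(z, x)

# Homogeneous transform from the robot base frame to the work-object frame.
base_T_obj = np.eye(4)
base_T_obj[:3, 0], base_T_obj[:3, 1], base_T_obj[:3, 2] = x, y, z
base_T_obj[:3, 3] = p_origin
print(base_T_obj)
```

Since the probed coordinates are reported by the robot itself, any error in the robot's kinematic model propagates directly into base_T_obj, which is exactly the accuracy limitation noted above.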
A similar solution was developed by ABB specifically for arc welding operations and was reported in a white paper (Sensor based adaptive arc welding ABB AB). Instead of using a touch sensor, the tip of the weld torch is used to register contact between the robot and the work object. A similar feature, known as touch sensing, is also available for FANUC robots to compensate for errors in work object placement. As with contact-based sensors, these solutions are limited to work objects with distinct features, such as planar surfaces, that can be used for localization. In addition to touch sensing, FANUC robots can also be equipped with the ability to compensate for work object warping under the high heat generated during welding. This feature, known as through arc seam tracking (Through arc seam tracking 2005), tracks the welding current during the weaving motion along the weld seam to determine the center of the weld seam. The system then moves the robot accordingly so as to maintain the weaving motion about the center of the weld seam.
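The control logic behind through-arc seam tracking can be sketched as follows; this is an assumed, simplified behaviour for illustration, not FANUC's actual algorithm. In a V-groove, the contact-tip-to-work distance, and hence the arc current, differs between the two weave extremes whenever the torch is off-center, so comparing the currents yields a lateral correction:

```python
def lateral_correction(current_left, current_right, gain=0.1):
    """Lateral seam correction from arc currents sampled at the weave extremes.

    Sign convention (assumed for this sketch): a positive return value moves the
    torch left, a negative value moves it right. A shorter stick-out gives a
    higher current, so the torch shifts away from the higher-current side.
    """
    # When the torch is centered, the currents at both extremes match: no shift.
    return gain * (current_right - current_left)

# Torch offset toward the left wall: shorter stick-out -> higher current on the left,
# so the correction is negative (move right).
print(lateral_correction(current_left=215.0, current_right=205.0))
```

Repeating this comparison on every weave cycle lets the torch follow a seam that drifts as the work object warps under welding heat.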
In addition to sensor-based systems, ABB has developed a calibration procedure that combines the human-in-the-loop compensation of online programming with the path generation of off-line programming (Machining PowerPac 2010). Five distinct points on the work object are defined off-line, and the operator moves the robot to the same five points in the actual work cell, establishing the relationship between the virtual and actual work cells and compensating for errors in the local region around the five points. This process is similar to probing and suffers from the same disadvantages; in addition, it is a manual and time-consuming process.
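The relationship between the virtual and actual cells established from such corresponding points is, in effect, a least-squares rigid transform. One standard way to compute it is the Kabsch (SVD-based) algorithm, sketched below with five hypothetical points; the specific method used by the commercial software is not documented here:

```python
import numpy as np

def fit_rigid_transform(virtual_pts, actual_pts):
    """Least-squares rigid transform mapping virtual-cell points to actual-cell
    points (Kabsch algorithm via SVD)."""
    cv, ca = virtual_pts.mean(axis=0), actual_pts.mean(axis=0)
    H = (virtual_pts - cv).T @ (actual_pts - ca)   # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # avoid reflections
    R = Vt.T @ D @ U.T
    t = ca - R @ cv
    return R, t

# Five hypothetical points defined off-line and the same points as touched up by
# the operator in the actual cell (here: a known small rotation plus an offset).
virtual = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [1, 1, 0], [0.5, 0.5, 0.3]], float)
angle = np.radians(2.0)
R_true = np.array([[np.cos(angle), -np.sin(angle), 0],
                   [np.sin(angle),  np.cos(angle), 0],
                   [0, 0, 1]])
actual = virtual @ R_true.T + np.array([0.01, -0.02, 0.005])

R, t = fit_rigid_transform(virtual, actual)
print("residual:", np.linalg.norm(virtual @ R.T + t - actual))
```

A single rigid transform can only capture a global misalignment; it cannot absorb joint-dependent robot errors, which is one reason the compensation is only valid in the local region around the touched-up points.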