Robotics Overview

What is a robot and what does it do?

Robots are automated machines designed to complete tasks with minimal human interaction. Building and using robots involves knowledge from many different fields, including industrial design, mechatronics, computer science, electrical engineering, and more. Robots have a wide range of designs depending on the use case.


Modern Robots

  • Parallel Manipulator- A motion-base parallel robot with a multi-axis system, used in flight simulators, mechanical bulls, and satellite positioning

  • Robotic Arm- Robot arm used mainly in factories for assembling intricate parts and preparing dangerous chemicals

  • Collaborative (Cobot)- Robot arm designed to work safely alongside humans, typically completing repetitive tasks

  • Medical- Assists surgeons in manipulating instruments inside patients' bodies more easily and safely

  • Mobile- Less common for industrial use; assists with tasks such as organization, transportation, and loading and unloading


End Effectors


The "hand" of a robot, mounted at the end of the manipulator, that interacts with the environment to execute tasks

  • Gripper- The most common end effector used to grasp objects

  • Subtractive- Overarching term for end effectors that remove material from an object, such as cutting or drilling tools

  • Welding Torches- Controlled by the robot for precise, accurate welding

  • Collision Sensors- Detect the force of collisions to prevent damage to the robot

  • Tool Changers- A coupling that lets the robot swap between end effectors, enabling various tasks

  • Vacuum- Uses vacuum suction to pick and place objects

Robots in Manufacturing

The majority of robots in the manufacturing industry are stationary robotic arms used for assembly, welding, bin picking, labelling, picking and placing, and much more. These robots are designed to make labor cheaper, safer, less repetitive, and more efficient.

The first appearance of robotics in manufacturing was the UNIMATE, developed by Unimation and installed on a General Motors assembly line in 1961 to assist in vehicle assembly. Ever since, this industrial tool has grown in popularity and performance.



Kinematics & Dynamics

Kinematics is the study of motion, while dynamics describes the forces that produce it. A robot's motion is produced by a system of links and joints that determines its range of motion, quantified by its degrees of freedom (DOF). The majority of today's robots have six DOF, allowing them to control pitch, yaw, roll, surge, heave, and sway. A seventh DOF gives a robot redundancy comparable to a human arm, with an extra axis of rotation.


Joint Types

  • Revolute- A rotary bearing for pivoting

  • Prismatic- Provides linear movement through sliding

  • Screw- Enables single-axis translation using a threaded rod

  • Spherical- A ball-and-socket system giving full rotational capability

  • Planar- Constrains rigid movement to a plane; one part remains still while the other moves through the plane

  • Cylindrical- A revolute joint with a prismatic joint for rotation and translation along an axis


Denavit-Hartenberg(DH) Parameters

DH parameters quantify the relationship between link and joint variables and the position and orientation of each link. These four parameters (link length, link twist, link offset, and joint angle) are used in the equations of motion:

  • Link Length a- distance along the x-axis between the initial and final z-axes

  • Link Twist α- angle around the x-axis between the initial and final z-axes

  • Link Offset d- distance along the z-axis between the initial and final x-axes

  • Joint Angle θ- angle around the z-axis between the initial and final x-axes
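As a sketch of how the four parameters are used, the standard DH convention combines them into a 4×4 homogeneous transform from one link frame to the next (NumPy assumed; the link lengths and joint angles below are arbitrary example values):

```python
import numpy as np

def dh_transform(a, alpha, d, theta):
    """Homogeneous transform from frame i-1 to frame i using the four DH
    parameters: link length a, link twist alpha, link offset d, joint angle theta."""
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ])

# Chain the per-joint transforms for a two-joint planar arm
# (alpha = d = 0, unit link lengths):
T = dh_transform(1.0, 0.0, 0.0, np.pi / 2) @ dh_transform(1.0, 0.0, 0.0, -np.pi / 2)
print(T[:3, 3])  # end-effector position
```

Multiplying the per-joint transforms in order yields the end-effector pose, which is the basis of forward kinematics.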


Inverse kinematics determines the joint motions necessary to complete a desired task, using the DH parameters. The calculation proceeds backward, beginning with the desired end-effector pose and ending with the complete joint motion. Calculating the Jacobian matrix of the motion defines the transformation from joint space to task space; inverting it maps a desired task-space motion back into joint space.

(Figure: task space shown on the left, joint space on the right)


These definitions of motion reveal the dynamic relationship between the joints and specific movements. The Jacobian is formed from the partial derivatives of the end-effector pose with respect to each joint variable, acting as a transform that shows which joint angle velocities produce which end-effector speeds.
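A minimal numerical illustration for a hypothetical two-link planar arm (unit link lengths assumed): each Jacobian column is the partial derivative of the end-effector position with respect to one joint angle, so multiplying by joint velocities gives the end-effector velocity.

```python
import numpy as np

def fk(q, l1=1.0, l2=1.0):
    """Forward kinematics of a two-link planar arm: joint space -> task space."""
    x = l1 * np.cos(q[0]) + l2 * np.cos(q[0] + q[1])
    y = l1 * np.sin(q[0]) + l2 * np.sin(q[0] + q[1])
    return np.array([x, y])

def jacobian(q, eps=1e-6):
    """Numerical Jacobian: partial derivatives of end-effector
    position with respect to each joint angle (central differences)."""
    J = np.zeros((2, 2))
    for i in range(2):
        dq = np.zeros(2)
        dq[i] = eps
        J[:, i] = (fk(q + dq) - fk(q - dq)) / (2 * eps)
    return J

q = np.array([np.pi / 4, np.pi / 4])
qdot = np.array([0.1, 0.0])   # joint velocities
xdot = jacobian(q) @ qdot     # resulting end-effector velocity
```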


Control

Operational space control is a method that plans trajectories for a specific task directly in task coordinates rather than in generalized joint coordinates. Model-based design (MBD) is used hand in hand with operational space control to establish a framework for communication throughout an entire system.


Steps for MBD:

1. System modelling- Raw data is processed to identify a model that captures the system's behaviour

2. Controller analysis and synthesis- The necessary motion is determined from the model and the controller is synthesized accordingly

3. Simulation- Uses virtual and real-world simulation to modify and refine the motion

4. Deployment- Deploys the controller to the robot and tests it, debugging and making further changes as necessary


Controllers

A controller changes the input to a system (for example, the current to a robot motor) so that the system produces a desired output. The Proportional-Integral-Derivative (PID) controller is the most commonly used; it utilizes a control loop that continuously calculates the error and updates the input accordingly using three control terms:

  • Proportion- Drives the output in proportion to the current error in the system

  • Integration- Accounts for accumulated past error by integrating it over time until it diminishes to a minimal amount

  • Derivation- Estimates the future error trend from its rate of change and applies damping as necessary
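The three terms above can be sketched as a minimal discrete PID loop (the gains and the simple integrator plant are assumed example values, not a production tuning):

```python
class PID:
    """Minimal discrete PID controller; kp, ki, kd are example gains."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measurement):
        error = setpoint - measurement
        self.integral += error * self.dt                   # accumulate past error
        derivative = (error - self.prev_error) / self.dt   # estimate error trend
        self.prev_error = error
        return (self.kp * error
                + self.ki * self.integral
                + self.kd * derivative)

# Drive a simple integrator plant (x' = u) toward setpoint 1.0:
pid, x, dt = PID(2.0, 0.5, 0.1, 0.01), 0.0, 0.01
for _ in range(2000):
    x += dt * pid.update(1.0, x)
```

After the loop, x has settled close to the setpoint; the integral term removes the steady-state offset that a proportional-only controller would leave.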



Planning

Modern robotics expects robots to plan and execute given tasks as quickly and optimally as possible. This process is called task planning and can require the analysis of dynamics, control policies, manipulation, object recognition, and human interpretation, all within milliseconds.

Task Planning Methods

  • KD Tree- Stores a grid map by recursively splitting space with binary partitions along alternating dimensions, developing a tree-like structure for multidimensional searches

  • Octree- Hierarchical top-down spatial subdivision in which each node partitions into 8 octants to store and index spatial data easily

  • Branch on Need (BON) Tree- A bottom-up variant of the octree that delays subdivision, partitioning into regions with power-of-two dimensions

  • Forward State Planning- Searches forward from the initial state for a final state that satisfies the goal

  • Backward State Planning- Works backward from the goal state through preceding sub-goal states to determine the actions that achieve it
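As an illustration of the KD-tree idea from the list above, a minimal 2-D build and nearest-neighbour search (the point data is an arbitrary example):

```python
import math

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def build_kdtree(points, depth=0):
    """Recursively split points with binary partitions along alternating axes."""
    if not points:
        return None
    axis = depth % 2                          # 2-D: alternate x and y
    points = sorted(points, key=lambda p: p[axis])
    mid = len(points) // 2
    return {"point": points[mid],
            "left": build_kdtree(points[:mid], depth + 1),
            "right": build_kdtree(points[mid + 1:], depth + 1)}

def nearest(node, target, depth=0, best=None):
    """Depth-first nearest-neighbour search with branch pruning."""
    if node is None:
        return best
    if best is None or dist(node["point"], target) < dist(best, target):
        best = node["point"]
    axis = depth % 2
    near, far = ((node["left"], node["right"])
                 if target[axis] < node["point"][axis]
                 else (node["right"], node["left"]))
    best = nearest(near, target, depth + 1, best)
    # Only descend into the far side if it could hold a closer point:
    if abs(target[axis] - node["point"][axis]) < dist(best, target):
        best = nearest(far, target, depth + 1, best)
    return best

tree = build_kdtree([(2, 3), (5, 4), (9, 6), (4, 7), (8, 1), (7, 2)])
print(nearest(tree, (9, 2)))  # → (8, 1)
```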


Path Planning

When a robot is tasked with maneuvering from one location to another along the most optimal path, various algorithms can be executed. Some commonly used path planning algorithms:


  • Dijkstra's Algorithm- Fixes a single node as the source and gradually builds a tree of lowest-cost paths to all other nodes in the graph until the target is reached

  • A*- Guided toward the goal by a heuristic estimate of the remaining cost from each node; follows the path with the lowest estimated total cost

  • D*- Based on A*, but designed for changing or partially known environments; it reuses previous search results to replan efficiently as new information is discovered

  • Rapidly Exploring Random Trees (RRT)- Explores by placing nodes randomly until the tree reaches the goal, then follows the first path found

  • RRT*- Improved RRT that rewires the tree around newly added nodes, converging toward an optimal path rather than the first one found
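A minimal sketch of A* on a small occupancy grid, with a Manhattan-distance heuristic (the grid layout is an arbitrary example):

```python
import heapq

def astar(grid, start, goal):
    """A* on a 4-connected grid: repeatedly expand the node with the
    lowest cost-so-far plus heuristic estimate of the remaining cost."""
    def h(p):                                   # Manhattan-distance heuristic
        return abs(p[0] - goal[0]) + abs(p[1] - goal[1])
    rows, cols = len(grid), len(grid[0])
    open_set = [(h(start), 0, start, [start])]  # (f, g, position, path)
    seen = set()
    while open_set:
        _, g, pos, path = heapq.heappop(open_set)
        if pos == goal:
            return path
        if pos in seen:
            continue
        seen.add(pos)
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            r, c = pos[0] + dr, pos[1] + dc
            if 0 <= r < rows and 0 <= c < cols and grid[r][c] == 0:
                heapq.heappush(open_set,
                               (g + 1 + h((r, c)), g + 1, (r, c), path + [(r, c)]))
    return None

grid = [[0, 0, 0],
        [1, 1, 0],   # 1 = obstacle
        [0, 0, 0]]
path = astar(grid, (0, 0), (2, 0))
```

Replacing the heuristic with a constant zero turns this into Dijkstra's algorithm, since every node is then expanded purely by cost-so-far.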

Sensing & Perception

Calibration


Calibration is imperative to an accurate and functional relationship between an end effector and its environment. The following must be calibrated for a functional relationship:

  • Joint Level- relationship between the input signal sent to the joint and the actual joint movement output

  • Entire Robot Kinematics Model- ideal kinematic structure of links and angles of joints

  • Non-Kinematic- positioning the end effector accounting for outside forces


Each calibration is then applied at each of the following steps:

1. Modeling- Choose the type of functional relationship

2. Measurement- Collect raw data from the robot to relate input to output

3. Identification- Use the raw data to calculate coefficients and expected error of the model

4. Correction- Execute the model with software and troubleshoot
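A toy sketch of steps 2-4 at the joint level, assuming a simple linear model; the gain and offset values below are synthetic illustration data, not real measurements:

```python
import numpy as np

# Step 2 (Measurement): commanded joint angles vs. measured output,
# with a hypothetical gain error and offset (synthetic data).
commanded = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
measured = 0.98 * commanded + 0.02

# Step 3 (Identification): fit measured = gain * commanded + offset
# by linear least squares.
A = np.column_stack([commanded, np.ones_like(commanded)])
(gain, offset), *_ = np.linalg.lstsq(A, measured, rcond=None)

# Step 4 (Correction): invert the identified model so future
# commands compensate for the joint's gain error and offset.
def corrected_command(target):
    return (target - offset) / gain
```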


Sensors

Robotic sensors enable robots to perceive their environment and the objects within it, but each sensor can interpret only certain data types. Image registration is the process of transforming data from one coordinate system into another. This process permits data from various collection devices to be interpreted by any system. Point set registration executes this coordinate transformation by merging multiple data sets into a single consistent model.
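As one concrete instance of point set registration, a minimal sketch of the Kabsch method for rigidly aligning two 2-D point sets with known correspondences (NumPy assumed; the rotation and translation below are synthetic example values):

```python
import numpy as np

def register_point_sets(source, target):
    """Rigid point-set registration (Kabsch method, known correspondences):
    find rotation R and translation t minimizing ||R @ p + t - q||."""
    src_c = source.mean(axis=0)
    tgt_c = target.mean(axis=0)
    H = (source - src_c).T @ (target - tgt_c)   # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                    # guard against reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = tgt_c - R @ src_c
    return R, t

# Points seen in one sensor frame, then rotated/translated into another:
source = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
theta = np.pi / 6
R_true = np.array([[np.cos(theta), -np.sin(theta)],
                   [np.sin(theta),  np.cos(theta)]])
target = source @ R_true.T + np.array([0.5, -0.2])
R, t = register_point_sets(source, target)      # recovers R_true and the shift
```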

Sensor Types


1. Encoder


A feedback device that measures the motion of a joint, converting joint position and speed into an electrical signal to be read by a control device. Robotic arm encoders can also determine position.

  • Linear Encoder- Detects motion along a path

  • Rotary Encoder- Detects rotational motion
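A minimal sketch of how a controller might turn accumulated rotary encoder counts into joint angle and speed (the resolution value is an assumed example):

```python
import math

COUNTS_PER_REV = 4096  # assumed example encoder resolution (counts per revolution)

def counts_to_angle(counts):
    """Convert an accumulated encoder count into a joint angle in radians."""
    return 2 * math.pi * counts / COUNTS_PER_REV

def counts_to_speed(prev_counts, counts, dt):
    """Estimate joint angular velocity (rad/s) from two successive readings."""
    return counts_to_angle(counts - prev_counts) / dt

angle = counts_to_angle(1024)          # a quarter revolution
speed = counts_to_speed(0, 1024, 0.5)  # a quarter revolution in half a second
```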

2. Inertial

Measures acceleration and angular velocity along the x, y, and z axes.

  • Inertial Measurement Unit(IMU)- Calculates and reports angular rate, gravity, direction, and specific force

  • Accelerometer- Measures acceleration of real-time motion directly; AC-response types capture only dynamic motion, while DC-response types also capture static acceleration such as gravity

  • Gyroscope- Senses angular velocity to control direction and balance within a mobile device
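One common way to fuse these two sensors is a complementary filter: the gyroscope is integrated for smooth short-term tracking, while the accelerometer's gravity-based tilt corrects its long-term drift. A minimal sketch, with an assumed blend factor and synthetic stationary readings:

```python
import math

def complementary_filter(angle, gyro_rate, accel_angle, dt, alpha=0.98):
    """Blend integrated gyroscope rate (smooth but drifting) with the
    accelerometer's gravity-based angle (noisy but drift-free).
    alpha is an assumed example blend factor."""
    return alpha * (angle + gyro_rate * dt) + (1 - alpha) * accel_angle

def accel_tilt(ax, az):
    """Tilt angle from the accelerometer's measured gravity direction."""
    return math.atan2(ax, az)

angle = 0.5   # stale initial estimate (the true tilt here is 0)
for _ in range(100):
    # hypothetical stationary readings: no rotation, gravity along z
    angle = complementary_filter(angle, gyro_rate=0.0,
                                 accel_angle=accel_tilt(0.0, 9.81), dt=0.01)
```

With each step the accelerometer term pulls the stale estimate toward the true tilt, so the error decays geometrically by the factor alpha.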


3. Force Based


Designed to detect forces applied to the robot's body

  • Force & Torque Sensor: Measures forces and torques along all axes

  • Capacitive Fingertip Sensors: Recognize changes in an electric field to detect objects within proximity and provide a sense of touch

  • Load Cell: Measures pressure to calculate present load


4. Vision


Interprets digitized images of the environment to determine objects' shape, location, and even texture. Visual servoing can be conducted by reading light waves, sound waves, or beacons to guide the robot with visual feedback. Many industrial robotic grasping scenarios involve visual servoing to develop feature maps and grasping locations.

  • Depth Cameras: Project infrared light onto the scene and measure its return to detect texture, size, and shape

  • Lidar Cameras: Project a series of light-wave pulses and interpret the reflected waves as 3D objects

  • Stereo Cameras: Extract depth information from two offset cameras to recognize 3D objects (similar to human eyes)
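For stereo cameras, depth follows from the disparity between the two images via Z = f * B / d; a minimal sketch with assumed example focal length and baseline values:

```python
def stereo_depth(disparity_px, focal_px=700.0, baseline_m=0.06):
    """Depth from stereo disparity: Z = f * B / d, where f is the focal
    length in pixels, B the camera baseline in metres, and d the disparity
    in pixels. The defaults are assumed example values."""
    return focal_px * baseline_m / disparity_px

depth = stereo_depth(30.0)  # a 30-pixel disparity
```

Because disparity sits in the denominator, nearby objects (large disparity) are measured much more precisely than distant ones.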



Modern Software, Tools, and Robots

*not an endorsement

Resources

Image Sources

Rotary Encoder Diagram. (n.d.). https://www.encoder.com/hs-fs/hubfs/_White-Papers/WP-2011_Basics/wp2011_parts-of-rotary-encoder_760x507.jpg?width=760&name=wp2011_parts-of-rotary-encoder_760x507.jpg.

Rrt Simulation. (n.d.). https://sites.psu.edu/zqy5086/files/2017/08/RRTsim-1nhzi69.png.

Pid Controller. (n.d.). https://mjwhite8119.github.io/Robots/assets/images/Control-Theory-Slides.006.jpeg.

Joint space vs. Task space robotic arm Gif. (n.d.). https://blogs.mathworks.com/racing-lounge/files/2019/11/cart_vs_joint.gif.

Robotic joint types. (n.d.). https://d17h27t6h515a5.cloudfront.net/topher/2017/June/5935b661_l01-03-l-joint-types-and-degrees-of-freedom-01/l01-03-l-joint-types-and-degrees-of-freedom-01.png.

Roll, pitch, yaw, sway, surge, and heave diagram. (n.d.). https://www.calqlata.com/prodimages/Vessacc%202-3.png.

Industrial robotic arms on conveyor belt. (n.d.). https://images.tmcnet.com/tmc/misc/articles/image/2019-sep/2616173980-roboticmanu-1000x536.jpeg.

Kd Tree Diagram. (n.d.). https://www.researchgate.net/profile/Yacine_Benchaib/publication/300051910/figure/fig10/AS:598222171029511@1519638688569/Data-partitioning-with-a-k-dimensional-tree.png.

Lidar Sensor Waves Map. (n.d.). https://media.automotiveworld.com/app/uploads/2020/07/21101714/LiDAR-Reuters.jpg.

OctTree Diagram. (n.d.). https://ars.els-cdn.com/content/image/1-s2.0-S0924271616000022-gr1.jpg.

Rgb-D Camera Sensor Field. (n.d.). https://docs.microsoft.com/en-us/azure/kinect-dk/media/concepts/depth-camera-depth-ir.png.

Stereo Camera Description. (n.d.). https://www.adept.net.au/news/newsletter/201211-nov/Resources/fig1.jpg.

Cobot in action. (n.d.). https://thumbs.gfycat.com/MistyDimwittedGreyhounddog-max-1mb.gif.

Collision Sensor End Effector. (n.d.). https://pushcorp.com/wp-content/uploads/2020/06/afd-parallel-motion.gif.

Gripper End Effector. (n.d.). https://i.pinimg.com/originals/2c/25/80/2c258062e9e05313ced701306dbb0307.gif.

Medical Robot. (n.d.). https://consultqd.clevelandclinic.org/wp-content/uploads/sites/2/2018/10/SP-Port-Robot-1-650x450.gif.

Mobile Robot. (n.d.). https://thumbs.gfycat.com/PointlessAggressiveBluet-max-1mb.gif.

Parallel Manipulator range of motion. (n.d.). https://thumbs.gfycat.com/AnotherTimelyHoneybadger-max-1mb.gif.

Robotic arm range of motion. (n.d.). https://www.akinrobotics.com/en/endustriyel-is-robotlari/img/robot-arm-2/industrial-work-robot-arm-2.gif.

Subtractive End Effector. (n.d.). https://thumbs.gfycat.com/GlossySpecificAlligator-size_restricted.gif.

Tool Changer End Effector. (n.d.). https://ksr-ugc.imgix.net/assets/023/395/822/447a3bd720f3793b519bbfd16974bae3_original.gif?ixlib=rb-4.0.2&w=680&fit=max&v=1543309202&auto=format&gif-q=50&q=92&s=53c9e99c0fe17036f1c65b66fcf9be7e.

Vacuum Suction End Effector. (n.d.). https://ksr-ugc.imgix.net/assets/015/710/384/684dae0358c04a07b15badd49f75f68f_original.gif?ixlib=rb-4.0.2&w=680&fit=max&v=1488442988&auto=format&gif-q=50&q=92&s=4bb2b80bcff8b6a5089c61e5d2450a11.

Welding End Effectors. (n.d.). https://i.gifer.com/MlW3.gif.