Perception & Manipulation

The purpose of this project is to use MoveIt2 to generate a basic Pick & Place task with a UR3e robotic arm. The task involves picking an object from the table and placing it in a different location. The Move Group Interface API is used to combine a series of motions into the pick & place sequence. Additionally, a perception node is integrated into the sequence so the robot can detect the position of the object to be picked.

Key Topics Learnt

  • MoveIt2.

  • Perception.

Method


First, the MoveIt Setup Assistant was used to set up the robotic arm: joint limits, maximum velocities, and so on. Then a C++ node was created to control the arm through the MoveIt API.

  • Joints configuration - joints.yaml

First of all, the joint_limits.yaml file needs to be updated. As you can see, all the gripper joints have a max_velocity parameter set to 100. MoveIt rejects this value because it expects a float type, so we need to change it to 100.0.
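To illustrate the fix, here is a hypothetical excerpt of such a joint_limits.yaml (the actual joint names depend on the gripper used; only the float-vs-integer detail comes from this project):

```yaml
joint_limits:
  gripper_finger_joint:       # hypothetical joint name
    has_velocity_limits: true
    max_velocity: 100.0       # must be a float; the integer 100 makes MoveIt fail to parse it
```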

  • Controller configuration - moveit_controllers.yaml

After running ros2 action list, we can see that the Action Server the robot actually runs differs from the one defined in our configuration. This means that when the MoveIt package tries to connect to the simulated robot to control it, it won't be able to, since the Action names don't match.
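A hypothetical excerpt of the corrected moveit_controllers.yaml could look like this; the controller name and action namespace are assumptions and must match whatever ros2 action list reports (the UR joint names below are the standard ones for a UR3e):

```yaml
moveit_simple_controller_manager:
  controller_names:
    - joint_trajectory_controller       # assumed controller name
  joint_trajectory_controller:
    action_ns: follow_joint_trajectory  # must match the running Action Server
    type: FollowJointTrajectory
    joints:
      - shoulder_pan_joint
      - shoulder_lift_joint
      - elbow_joint
      - wrist_1_joint
      - wrist_2_joint
      - wrist_3_joint
```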


In manipulation, when you want to pick an object from the environment, you don't send the end-effector directly to the pose of the object, as the gripper might end up colliding with it. Instead, you send the end-effector to a pose near the object. Then, you execute the approach motion to get close enough to the object to pick it. Once you have picked the object, you execute the retreat motion to go back to the previous position.
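The pre-grasp / approach / retreat idea can be sketched independently of MoveIt as simple Z-axis offsets (the 10 cm clearance and the straight-down approach are assumptions for illustration; in the real node these poses would be sent as Cartesian waypoints through the Move Group Interface):

```cpp
#include <cassert>
#include <cmath>

// A pose reduced to position only, enough to show the offset logic.
struct Pose { double x, y, z; };

constexpr double kClearance = 0.10;  // assumed 10 cm clearance above the object

// Pre-grasp: hover above the object instead of going straight to it.
Pose pregraspPose(const Pose& object) {
  return {object.x, object.y, object.z + kClearance};
}

// Approach: descend by the clearance to reach the grasp pose.
Pose approachPose(const Pose& pregrasp) {
  return {pregrasp.x, pregrasp.y, pregrasp.z - kClearance};
}

// Retreat: reverse the approach to lift the object clear of the table.
Pose retreatPose(const Pose& grasp) {
  return {grasp.x, grasp.y, grasp.z + kClearance};
}
```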

Full motion.

Approach position.

Retreat position.

Destination.


To detect the coordinates of the cube, a depth camera was used. A depth camera makes it possible to obtain a point cloud, which can then be processed with the PCL library to detect shapes.

Results

Two tests were performed. In the first, the position of the cube was hard-coded; in the second, the coordinates of the cube were obtained using the depth camera and the perception node.

Pick up & Place.

Pick up & Place + Perception.


Files

  • GitHub
