Written by: Albert Chen & Helena Sieh
Background:
In this section, we are going to set up an environment in Isaac Sim where we can test the robot programs with people inside the simulation.
Simple Grid:
This simple environment contains a flat ground and sides with a grid texture. Three configurations are provided; the first two have square corners, the third has curved corners.
Assets Path: omniverse://localhost/NVIDIA/Assets/Isaac/4.2/Isaac/Environments/Grid/
There are many pre-existing environments that Omniverse provides for Isaac Sim.
For our simulation, we decided to use the office environment, shown on the slide above.
Go to the files tab and select the USD file below:
Asset Path: omniverse://localhost/NVIDIA/Assets/Isaac/4.2/Isaac/Environments/Office/office.usd
Background:
In this section, we will create people in our simulation so that we can test the robot programs more efficiently than testing in real life.
Open the extension manager - Window > Extensions. In the extension manager, search for “people” and enable the omni.anim.people extension.
Omni.Anim.People is an extension for simulating human characters and their activities in environments such as retail stores, warehouses, and traffic intersections.
Load the People Simulation UI by navigating to Window > People Simulation.
In the People Simulation UI, copy and paste the following text in the Command Text Box.
Click the Load Characters button to load the characters assets and animations required for the command.
Click on Setup Characters button to attach Behavior Scripts and Animation Graph to the characters.
Turn off the Navmesh Based Navigation setting and click Play to run the simulation.
NOTE: Navmesh Based Navigation needs to be turned on for static obstacle avoidance. However, it requires a NavMesh to be built for the stage. Here, it is turned off just to show how you can get the simulation running.
How to run the LookAround Program:
LookAround makes the character stand in the same spot, while moving its head from left to right. LookAround takes a duration value and performs the action for that duration.
Command Structure:
character_name LookAround duration
Example:
Character_02 LookAround 10
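Several commands can be combined into one script in the Command Text Box, one command per line, and each character executes its commands in order. A hedged example below; the GoTo and Idle syntax is assumed from the omni.anim.people documentation (GoTo takes x y z coordinates plus a final rotation, with _ meaning "any rotation"), and the character names must match the characters loaded in your scene:

```
Character_01 GoTo 10 10 0 _
Character_01 Idle 5
Character_02 LookAround 10
```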
Stakeholders
[Name]
[Name]
[Name]
[Name]
Toyota HSR repositories:
Clone the hsr_description repository into ~/catkin_ws_hsr/src.
Clone hsr_meshes into hsr_description/robots/.
Build the workspace through catkin_make.
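The clone-and-build steps above might look like the following in a terminal. The repository URLs are placeholders, since this guide does not specify them; substitute the URLs you were given:

```shell
# Clone the HSR description package into the workspace source directory
cd ~/catkin_ws_hsr/src
git clone <hsr_description-repo-url>    # provides hsr_description/

# Clone hsr_meshes inside hsr_description/robots/
cd hsr_description/robots
git clone <hsr_meshes-repo-url>         # provides hsr_meshes/

# Build and source the workspace
cd ~/catkin_ws_hsr
catkin_make
source devel/setup.bash
```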
Enable the omni.importer.urdf extension in Omniverse Isaac Sim if it is not loaded automatically by going to Window -> Extensions and enabling omni.importer.urdf.
Access the URDF importer by going to the Isaac Utils -> Workflows -> URDF Importer menu.
Check only the box next to Create Physics Scene, and uncheck Fix Base Link.
Set Stage Units Per Meter to 1.0 so the asset is imported in meters.
Set the joint drive type to Velocity.
Set Joint Drive Strength to 10000000.0 and Joint Position Drive Damping to 100000.0.
Set the Output Directory to the place where you store your assets (Nucleus or local).
In the Input File box under the Import tab, navigate to /home/users/mavric-hsr/catkin_ws_hsr/src/hsr_description/robots and select the URDF file hsrb4s.urdf.
Click the Import button to add the robot to the stage.
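The drive strength and damping settings above are applied to each joint the importer finds in the URDF. As an illustration of what the importer reads (not the importer itself), the sketch below lists the joints declared in a URDF document; hsrb4s.urdf is large, so a minimal stand-in URDF string is used here:

```python
# Sketch: enumerate the joints a URDF declares, which the importer
# will create drives for. The URDF string is a minimal stand-in.
import xml.etree.ElementTree as ET

URDF = """
<robot name="mini_hsr">
  <link name="base_footprint"/>
  <link name="base_l_drive_wheel_link"/>
  <link name="base_r_drive_wheel_link"/>
  <joint name="base_l_drive_wheel_joint" type="continuous">
    <parent link="base_footprint"/>
    <child link="base_l_drive_wheel_link"/>
  </joint>
  <joint name="base_r_drive_wheel_joint" type="continuous">
    <parent link="base_footprint"/>
    <child link="base_r_drive_wheel_link"/>
  </joint>
</robot>
"""

def list_joints(urdf_text):
    """Return (name, type) for every joint in a URDF document."""
    root = ET.fromstring(urdf_text)
    return [(j.get("name"), j.get("type")) for j in root.findall("joint")]

for name, jtype in list_joints(URDF):
    print(name, jtype)
```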
ADD ROBOT MOTION
This section is a tutorial on how to move the robot by manipulating its joints in the NVIDIA Isaac Sim environment.
First, find the Xform named wrist_roll_link and open it.
Right click on the joint named hand_palm_joint and select Deactivate.
This removes a faulty joint that would cause errors later.
If you click play right now, the left hand will fall apart, but this is completely normal.
Next, find the Xform named base_roll_link and find the two joints base_l_drive_wheel_joint and base_r_drive_wheel_joint.
Then, for each joint, click on the joint and navigate to the Property tab at the bottom.
Scroll down to the Drive section and find the Target Velocity input box.
Set this to any value; for this example we will use 100.0.
Make sure to set the velocity for both joints!
Finally, click play and see the robot move!
If you want to change the speed or direction, simply change the values in the joints.
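Why equal target velocities drive the robot straight follows from differential-drive kinematics. The sketch below assumes the wheel radius and separation used in the ROS setup later in this guide (0.025 m and 0.16 m), and treats the target velocity as rad/s for illustration; the UI's actual units depend on the joint type:

```python
# Differential-drive forward kinematics: body motion from wheel speeds.
# Wheel radius/separation are assumed values, not measured from the HSR.
WHEEL_RADIUS = 0.025    # meters
WHEEL_DISTANCE = 0.16   # meters between the two drive wheels

def body_twist(left_rad_s, right_rad_s):
    """Return (linear m/s, angular rad/s) of the robot base."""
    v_left = left_rad_s * WHEEL_RADIUS
    v_right = right_rad_s * WHEEL_RADIUS
    linear = (v_left + v_right) / 2.0
    angular = (v_right - v_left) / WHEEL_DISTANCE
    return linear, angular

# Both wheels at 100.0 -> straight line, no rotation.
print(body_twist(100.0, 100.0))
# Opposite signs -> the robot turns in place.
print(body_twist(-100.0, 100.0))
```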
Integrate with ROS
To integrate with ROS, first open Visual Scripting: Window > Visual Scripting > Action Graph. An Action Graph window will appear at the bottom; you can dock it wherever is convenient.
Then, click on the New Action Graph icon in the middle of the Action Graph window.
Inside the Action Graph window, a panel on the left-hand side lists all the OmniGraph nodes (or OG nodes). All ROS-related OG nodes are listed under Isaac Ros. You can also search for nodes by name. To place a node into the graph, simply drag it from the node list into the graph window.
Build a graph that matches the one below. Note for the Make Array node, use the +/- buttons in the property tab to add additional inputs.
In the ROS Subscribe Twist Node: specify the topic name "/cmd_vel" in the topicName field in its Property tab.
In Differential Controller Node: Set Max Linear Speed to be 0.22, Wheel Distance to be 0.16, and Wheel Radius to be 0.025.
In Articulation Controller Node: Under the Inputs section, set targetPrim to the robot's location, or /Root/hsrb_02/base_footprint
For the two Constant Token Nodes, set one to base_l_drive_wheel_joint and the other to base_r_drive_wheel_joint.
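The Differential Controller node essentially converts an incoming Twist into the two wheel velocities fed to the Articulation Controller. A hedged sketch of that math using the parameters above; the exact Isaac Sim implementation may clamp and scale differently:

```python
# Sketch of differential-controller math: convert a Twist command
# (linear m/s, angular rad/s) into left/right wheel speeds in rad/s.
MAX_LINEAR_SPEED = 0.22  # m/s, as configured in the node
WHEEL_DISTANCE = 0.16    # m
WHEEL_RADIUS = 0.025     # m

def wheel_velocities(linear, angular):
    """Return (left, right) wheel angular velocities for a desired twist."""
    # Clamp forward speed to the configured maximum.
    linear = max(-MAX_LINEAR_SPEED, min(MAX_LINEAR_SPEED, linear))
    v_left = linear - angular * WHEEL_DISTANCE / 2.0
    v_right = linear + angular * WHEEL_DISTANCE / 2.0
    return v_left / WHEEL_RADIUS, v_right / WHEEL_RADIUS

print(wheel_velocities(0.1, 0.0))  # straight: both wheels equal
print(wheel_velocities(0.0, 1.0))  # pure rotation: wheels opposite
```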
When you publish to the /cmd_vel rostopic, you'll be able to control the robot!
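For example, from a ROS-enabled terminal (ROS 1 syntax; the speed values are arbitrary):

```shell
# Drive forward at 0.1 m/s while turning at 0.2 rad/s, publishing at 10 Hz
rostopic pub -r 10 /cmd_vel geometry_msgs/Twist \
  '{linear: {x: 0.1, y: 0.0, z: 0.0}, angular: {x: 0.0, y: 0.0, z: 0.2}}'
```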
This tutorial introduces how to use a LIDAR to sense an environment in Omniverse Isaac Sim. After this tutorial, you will know how to add a LIDAR sensor to the scene, activate it, and detect objects in the simulation.
Before you add a LIDAR to the simulation, make sure to click on the location on the robot where you want the LIDAR attached.
To do this, go to the Stage tab on the right side of the app, which lists everything that exists in the simulation.
Now click on the object that you want the LIDAR attached to. In this case, add the LIDAR to the base of the robot.
To create a LIDAR, go to the top Menu Bar and Click Create > Isaac > Sensors > RTX Lidar > Rotating. The LIDAR prim will be created as a child of the selected prim.
Rename the prim from "Rotating" to "Lidar"
CONNECT LIDAR TO ROS
In the same Action Graph made for driving, add the following nodes and connect them as shown.
For the Isaac Create Render Product Node, set the input camera target prim to the Lidar path, /Root/hsrb_02/base_range_sensor_link/Lidar
In each ROS1 RTX Lidar Helper Node: rename the frameid to "Lidar"
For one ROS1 RTX Lidar Helper Node: change the type to "point_cloud" and rename the topicName to point_cloud.
These should be the two nodes.
Open rviz through "rosrun rviz rviz" and, in rviz, set the Fixed Frame to Lidar (matching the frameId set in Isaac Sim).
Add PointCloud2 and set the topic to "/point_cloud"
Add LaserScan and set the topic to "/scan"
After hitting play, both the point clouds and laser scans should appear!
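The /scan and /point_cloud topics carry the same range data in different forms. A sketch of how a planar laser scan converts to points in the sensor frame (pure math, independent of ROS; the beam angles are illustrative):

```python
import math

def scan_to_points(ranges, angle_min, angle_increment):
    """Convert LaserScan-style ranges (meters) to (x, y) points in the
    sensor frame, skipping non-returns (inf/NaN)."""
    points = []
    for i, r in enumerate(ranges):
        if not math.isfinite(r):
            continue  # no return for this beam
        theta = angle_min + i * angle_increment
        points.append((r * math.cos(theta), r * math.sin(theta)))
    return points

# Three beams at -90, 0, +90 degrees, each hitting a surface 2 m away.
pts = scan_to_points([2.0, 2.0, 2.0], -math.pi / 2, math.pi / 2)
print(pts)
```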
RGBD Camera
First, create a camera through Create->Camera
Then, move the Camera object under head_rgbd_sensor_link and rename it RGBD_camera
Next, in the Action Graph create the following setup (ignore the ROS1 Subscribe Twist; that was from the previous driving setup):
Next, for the Isaac Create Render Product Node: Set the cameraPrim to /Root/hsrb_02/head_rgbd_sensor_link/RGBD_camera, the camera's path.
For the two ROS1 Camera Helper Nodes, set both frameIds to "hsr".
For one ROS1 Camera Helper Node, set the topicName to "rgb" and the type to "rgb" as well.
For the other, set the topicName to "depth" and the type to "depth" as well.
These should be the two camera nodes.
Next, open RQT to visualize the two pictures.
Open Plugins->Visualization->Image View twice, and for one image select the topic /rgb and the other /depth. To better visualize the depth component of the camera, you can set the color from "gray" to "Hot".
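The "Hot" color scheme is just a remapping of depth values into display intensities. A sketch of the underlying normalization, using hypothetical depth readings in meters:

```python
def normalize_depth(depth, d_min, d_max):
    """Map depth values (meters) into 0-255 display intensities."""
    span = d_max - d_min
    out = []
    for d in depth:
        d = max(d_min, min(d_max, d))  # clamp to the display range
        out.append(round(255 * (d - d_min) / span))
    return out

# Hypothetical depths between 0.5 m (dark) and 4.0 m (bright).
print(normalize_depth([0.5, 1.0, 2.0, 4.0], 0.5, 4.0))
```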
Congrats! You should have fully set up the RGBD Camera.
TF Publisher:
Assuming you’ve already gone through the ROS camera tutorial and have two cameras on the stage, add those cameras to a TF tree so that their positions can be tracked in the global frame.
In a new or existing Action Graph window, add a ROS1 Publish Transform Tree node, and connect it up with On Playback Tick and Isaac Read Simulation Time, like the image below.
In the Property tab for the ROS1 Publish Transform Tree node, add the camera and Lidar sensor prims to the targetPrims field: /World/Camera_1 and /World/turtlebot3_burger/base_scan/Lidar.
Examine the transform tree in a ROS-enabled terminal: rostopic echo /tf. Verify that both cameras are in the TF tree. Move the camera or Lidar around inside the viewport and see how the camera’s pose changes.
To get the transforms of each linkage on an articulated robot, add the robot’s articulation root to the targetPrims field. All the linkages subsequent to the articulation root are published automatically. Add /World/turtlebot3_burger to the targetPrims field. Verify that the transforms of all the links of the robot, fixed or articulated, are published on the /tf topic.
Publish Relative Transforms:
By default, the transforms are in reference to the world frame. You can check that the /base_link transform of the TurtleBot is published relative to the /World. If you want to get the transforms relative to something else, such as a camera, make sure to indicate that in the parentPrim field.
For this scene, make all prims in the TurtleBot the child frames of the base_link frame. To do so, add /World/turtlebot3_burger/base_link in the parentPrim field. Stop and Play the simulation between property changes, to verify that the /base_link is the parent frame for all other robot frames and camera frames.
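The relative transform the publisher emits is the parent's inverse composed with the child's world transform. A 2D sketch with homogeneous matrices (plain Python; the poses are illustrative, not taken from the scene):

```python
import math

def make_T(x, y, theta):
    """2D homogeneous transform: rotation by theta, then translation (x, y)."""
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s, x], [s, c, y], [0.0, 0.0, 1.0]]

def invert(T):
    """Inverse of [R t; 0 1] is [R^T  -R^T t; 0 1]."""
    c, s, x, y = T[0][0], T[1][0], T[0][2], T[1][2]
    return [[c, s, -(c * x + s * y)], [-s, c, s * x - c * y], [0.0, 0.0, 1.0]]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

# World-frame poses: parent (base_link) and child (a camera 1 m "ahead"
# of a base that faces the world +y axis).
T_world_base = make_T(1.0, 2.0, math.pi / 2)
T_world_cam = make_T(1.0, 3.0, math.pi / 2)

# Child pose expressed relative to the parent frame.
T_base_cam = matmul(invert(T_world_base), T_world_cam)
print(T_base_cam[0][2], T_base_cam[1][2])  # translation in the base frame
```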
Odometry Publisher:
To setup the Odometry publisher, compose an Action Graph that matches the following image:
In the Property tab for the Isaac Compute Odometry Node, add the TurtleBot to its Chassis Prim input field. This node calculates the position of the robot. Its output is fed into both a publisher for the /odom topic, and a Raw TF publisher that publishes the singular transform from /odom frame to /base_link frame.
Run rosrun rqt_tf_tree rqt_tf_tree to get a graph of the constructed TF tree.
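Conceptually, the odometry node integrates the chassis velocity over time into a pose in the /odom frame. A minimal dead-reckoning sketch of that idea (illustrative numbers, not the node's actual implementation):

```python
import math

def integrate_odometry(pose, linear, angular, dt):
    """Advance an (x, y, theta) pose by one timestep of a planar twist."""
    x, y, theta = pose
    x += linear * math.cos(theta) * dt
    y += linear * math.sin(theta) * dt
    theta += angular * dt
    return (x, y, theta)

pose = (0.0, 0.0, 0.0)
# Drive forward at 0.2 m/s while turning at 0.1 rad/s for 1 second,
# integrated in 100 steps of 10 ms: the robot traces a shallow arc.
for _ in range(100):
    pose = integrate_odometry(pose, 0.2, 0.1, 0.01)
print(pose)
```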
Enabling the ROS2 Bridge on Isaac Sim:
Before opening Isaac Sim, run these commands in the terminal:
export RMW_IMPLEMENTATION=rmw_fastrtps_cpp
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/home/mavric-wonse/Downloads/omniverse-launcher-linux.AppImage/exts/omni.isaac.ros2_bridge/humble/lib