Lab Development and Demonstrations

Lab Resource Development

ROS-based differential drive robots for cooperative multi-agent experiments in a motion capture environment

(In collaboration with Sri Theja Vuppala)

These are simple four-wheel differential-drive robots, each built around a Raspberry Pi 3B computer, an L298N motor driver, and four N20 90 rpm motors. Each robot is powered by a 12 V lithium-ion battery (3-cell, 2200 mAh). The robots are set up with static IP addresses and communicate over a Wi-Fi network as a multi-computer ROS network. The robots shown below can be teleoperated with a joystick or run autonomously in a motion capture arena. The controllers for each robot are written as ROS nodes and can run either onboard the robots or off-board on a central computer. The video below gives some design details of the robots along with experiments conducted in a Vicon arena.
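
At the drive layer, the controller's job reduces to mapping a commanded (v, ω) pair onto left- and right-side wheel speeds. A minimal sketch of that mapping (illustrative only: the wheel base, speed limit, and function name are assumed, not taken from the lab code):

```python
def twist_to_pwm(v, omega, wheel_base=0.14, v_max=0.5):
    """Map a linear velocity v (m/s) and yaw rate omega (rad/s) to
    left/right PWM duty cycles in [-100, 100] for an L298N driver.
    On a skid-steer four-wheel base, both wheels on a side share a duty.
    wheel_base and v_max are assumed values, not the robot's actual specs."""
    v_left = v - 0.5 * wheel_base * omega   # left-side wheel speed (m/s)
    v_right = v + 0.5 * wheel_base * omega  # right-side wheel speed (m/s)
    # Scale to percent duty cycle and saturate.
    scale = 100.0 / v_max
    clip = lambda x: max(-100.0, min(100.0, x))
    return clip(v_left * scale), clip(v_right * scale)
```

In a ROS node, a function like this would sit inside the `cmd_vel` subscriber callback, with its outputs fed to the Pi's PWM channels driving the L298N.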

Related publication:

  • A. V. Borkar and G. Chowdhary, "Multi-agent Aerial Monitoring of Moving Convoys using Elliptical Orbits," 2021 IEEE International Conference on Robotics and Automation (ICRA), 2021, pp. 8999-9005.

ROS-enabled autonomous quadrotor for experiments in a motion capture environment

(In collaboration with Vraj Parikh, Swaroop Hangal and Shoeb Ahmed)

This ROS-enabled quadrotor was built for indoor multi-agent experiments in a Vicon motion capture environment. The quadrotor has a flight time of up to 14 minutes and is equipped with a Pixhawk v1 flight controller and a Raspberry Pi 3B companion computer that runs the ROS node controlling the drone. The localisation data from the Vicon system is communicated to the drone over a Wi-Fi network. A labeled photo of the quadrotor with its components is given below, and the various stages of its development and testing are shown in the adjacent video.
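
One small but necessary step in such a pipeline is recovering the planar heading from the quaternion in the Vicon pose message before it reaches a yaw controller. An illustrative helper (not the actual flight code; the function name is ours):

```python
import math

def yaw_from_quaternion(x, y, z, w):
    """Extract the yaw angle (rotation about the vertical axis, radians)
    from a unit quaternion, e.g. the orientation field of a Vicon pose.
    Standard ZYX (yaw-pitch-roll) extraction."""
    return math.atan2(2.0 * (w * z + x * y),
                      1.0 - 2.0 * (y * y + z * z))
```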

Related publication: A. V. Borkar, S. Hangal, H. Arya, A. Sinha and L. Vachhani, "Reconfigurable formations of quadrotors on Lissajous curves for surveillance applications", European Journal of Control, Volume 56, November 2020, Pages 274-288.

Poster session: A. Borkar, S. Hangal, H. Arya and A. Sinha, "Reconfigurable Quadrotor Formations on Lissajous Curves." Poster presented at the 2nd Cyber-Physical Systems Symposium (CyPhySS), Indian Institute of Science, Bangalore, India, 11 July 2018.

ROS node for driving multiple Firebird V robots

(In collaboration with Vraj Parikh)

We developed a Python-based ROS node to control multiple Firebird V robots in a Vicon motion capture environment. The node takes the number of robots as an input argument and opens a command-velocity topic in ROS for each robot. The velocity commands published to these topics are then transmitted to the corresponding robots using XBee Pro Series 1 wireless modules (also mounted on the robots). Each robot's XBee is configured with a unique address, and the API packet format supported by the XBee firmware is used to send addressed packets to each robot.
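
For reference, a Series 1 TX request with a 16-bit destination address has a simple frame layout. The sketch below is a hand-rolled illustration of that format (the actual node's serial code may differ, and a library such as `python-xbee` would normally handle this):

```python
def xbee_tx16_frame(dest_addr, payload, frame_id=1):
    """Assemble an XBee Series 1 API-mode TX request (API ID 0x01)
    addressed to a 16-bit destination address.
    Frame: 0x7E, length MSB, length LSB, frame data, checksum,
    where checksum = 0xFF - (sum of frame-data bytes & 0xFF)."""
    data = bytes([0x01, frame_id,
                  (dest_addr >> 8) & 0xFF, dest_addr & 0xFF,
                  0x00]) + payload           # 0x00 = default TX options
    checksum = 0xFF - (sum(data) & 0xFF)
    return (bytes([0x7E, (len(data) >> 8) & 0xFF, len(data) & 0xFF])
            + data + bytes([checksum]))
```

Publishing to robot *i*'s velocity topic would then end in one `xbee_tx16_frame(addr_i, packed_command)` write to the coordinator's serial port.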

Utilization of this system:

Courses: SC 626 Systems and Control Engineering Laboratory (Spring 2017)

Research publications:

  • A. V. Borkar, V. S. Borkar and A. Sinha, "Aerial monitoring of slow moving convoys using elliptical orbits", European Journal of Control, Volume 46, March 2019, Pages 90-102.

  • J. M. Monsingh, A. Sinha, "Trochoidal Patterns Generation using Generalized Consensus Strategy for Single-integrator Kinematic Agents", European Journal of Control, Volume 47, May 2019, Pages 84-92.

  • A. V. Borkar, A. Sinha, L. Vachhani and H. Arya, "Application of Lissajous curves in trajectory planning of multiple agents", Autonomous Robots, Volume 44, No. 2, 2020, Pages 233-250.

2D Localization system based on visual feedback from cameras

This is a simple visual feedback system, developed in MATLAB, for localising ground robots in a 2D arena for both lab courses and research experiments. The initial version used a single webcam to localise robots within a 1 m × 1 m square area; the robots carried unique colored patterns for easy identification. The system was later expanded to four cameras to cover a larger area. The four cameras were eventually replaced by a single wide-angle-lens camera, and the colored patterns by unique black-and-white identification patterns recognised by the image processing code. The localisation data, comprising the (x, y) position coordinates and the heading angle, is transmitted to the robots over a wireless XBee network operating in API mode, which supports addressed communication packets. A block diagram of the system is shown below, along with a video showing its different stages of development.
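
The geometry behind the heading estimate is straightforward: if a pattern yields two distinguishable blob centroids (front and back), the planar pose follows directly. An illustrative Python sketch (the lab system itself was MATLAB-based, and the two-blob assumption is ours):

```python
import math

def pose_from_marker(front, back):
    """Estimate a robot's planar pose from the centroids of the front
    and back blobs of its identification pattern.
    front, back: (x, y) centroids in arena coordinates, i.e. after the
    camera calibration has mapped pixels to metres.
    Returns (x, y, theta) with theta in radians."""
    x = 0.5 * (front[0] + back[0])          # position = blob midpoint
    y = 0.5 * (front[1] + back[1])
    theta = math.atan2(front[1] - back[1],  # heading = back-to-front
                       front[0] - back[0])  # direction
    return x, y, theta
```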

Utilization of this system:

Courses:

  • SC 626 Systems and Control Engineering Laboratory (Spring 2015, 2016)

  • SC 634 Introduction to Wheeled Mobile Robotics (Autumn 2015 and 2016)

  • SC 635 Advanced Topics in Mobile Robotics (Spring 2015 and 2016)

Research publications:

  • S. Kadam, K. Joshi, N. Gupta, P. Katdare and R. Banavar, "Trajectory tracking using motion primitives for the Purcell's swimmer", Proceedings of IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Vancouver, Canada, September 2017, Pages 3246-3251.

  • S. Agarwal, T. Tripathy, A. Borkar and A. Sinha, "Relative Heading based Pattern Generation", Proceedings of 20th IFAC World Congress, Toulouse, France, July 2017, Pages 12759-12764.

  • T. Tripathy, A. Sinha, H. Arya and A. Borkar, "Range Based Control Law to Generate Patterns With a Unicycle", Proceedings of 55th IEEE Conference on Decision and Control (CDC), Las Vegas, USA, December 2016, Pages 4979-4984.

  • A. Borkar, A. Sinha, L. Vachhani, and H. Arya, "Collision-free trajectory planning on Lissajous curves for repeated multi-agent coverage and target detection", Proceedings of IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Daejeon, Korea, October 2016, Pages 1417-1422.

  • G. Chaudhury, A. Sinha, T. Tripathy and A. Borkar, "Conditions for Target Tracking with Range-only Information", Robotics and Autonomous Systems, Volume 75, 2016, Pages 176-186.

AR.Drone experiments for Control Systems Lab course

These are some of the experiments designed for the SC 626 Control Systems Lab course (Spring 2017) using the AR.Drone 2.0. The drone is controlled using the ardrone_autonomy package in ROS. Some simple control tasks implemented were: basic waypoint navigation with fixed heading, waypoint navigation treating the drone as a unicycle, following a virtual leader point moving on a circular orbit, elliptical orbit tracking using vector-field guidance, and tracking a moving ground target.
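
A typical starting point for such exercises is a proportional go-to-goal law for the unicycle model, whose output is published as a Twist on the drone's cmd_vel topic. The sketch below is illustrative only (the gains, tolerance, and function name are assumptions, not the lab handout's values):

```python
import math

def unicycle_waypoint_cmd(pose, waypoint, k_v=0.5, k_w=1.5, tol=0.1):
    """Proportional go-to-goal controller for a unicycle model.
    pose = (x, y, theta) in metres/radians; waypoint = (xg, yg).
    Returns (v, omega): forward speed and yaw rate commands."""
    x, y, theta = pose
    dx, dy = waypoint[0] - x, waypoint[1] - y
    dist = math.hypot(dx, dy)
    if dist < tol:
        return 0.0, 0.0                      # waypoint reached
    heading_err = math.atan2(dy, dx) - theta
    # Wrap the heading error into (-pi, pi].
    heading_err = math.atan2(math.sin(heading_err), math.cos(heading_err))
    return k_v * dist, k_w * heading_err
```

Chaining such waypoints, or replacing the fixed goal with a moving virtual leader point, covers several of the tasks listed above.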

Hardware Demonstrations

Target tracking with an AR.Drone 2.0 using the on-board camera

(In collaboration with Aniket Adsule and Swaroop Hangal)

In this demonstration, the AR.Drone 2.0 tracks a wheeled robot moving on the ground by detecting, with its bottom-facing camera, the roundel pattern mounted on top of the robot. The flight code is implemented using the ardrone_autonomy package with ROS Indigo Igloo. The differential-drive robot is controlled by a Raspberry Pi 3B computer running a ROS node under ROS Kinetic Kame.
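
A plausible sketch of the tracking loop: ardrone_autonomy reports detected-tag centroids in its navdata with coordinates normalised to [0, 1000], so a proportional controller on the offset from the image centre can generate pitch/roll commands. The gain and the sign conventions below are assumptions for illustration, not the demo's actual values:

```python
def tag_tracking_cmd(tag_x, tag_y, k=0.002):
    """Proportional controller that drives a detected tag toward the
    centre of the bottom-camera image.
    tag_x, tag_y: tag centroid in [0, 1000] (image centre at 500, 500).
    Returns (pitch, roll) commands clipped to [-1, 1] for a Twist message.
    Sign conventions assumed: image x grows rightward, image y downward."""
    ex = tag_x - 500.0   # +ve: tag right of centre
    ey = tag_y - 500.0   # +ve: tag below centre
    clip = lambda u: max(-1.0, min(1.0, u))
    return clip(-k * ey), clip(-k * ex)   # (pitch, roll) toward the tag
```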

Poster session: A. Borkar, A. Adsule, A. Sinha and A. Tewari, "Aerial Surveillance and Tracking by Remote Agents (ASTRA)", Demo presented at the Army Technology Seminar (ARTECH), Manekshaw Centre, New Delhi, India, 11 January 2019.

Pattern switching between annular regions for a unicycle agent

(In collaboration with Twinkle Tripathy)

Here we implement a control law for generating annular patterns centered about a stationary target point, capable of switching between different annular regions at run time, for an agent modeled with unicycle kinematics. The control law is implemented on a Firebird V differential-drive robot. The robot is localised in the experimentation area by a visual feedback system (developed in-house) using a calibrated wide-angle-lens camera, and the time-stamped localisation information is communicated to the robot using XBee modules. In the video, the annular region bounds turn green to indicate switching, and the active bounds are shown in red.

Related publication: T. Tripathy, A. Sinha, H. Arya and A. Borkar, "Range Based Control Law to Generate Patterns With a Unicycle", Proceedings of 55th IEEE Conference on Decision and Control (CDC), Las Vegas, USA, December 2016, Pages 4979-4984.

Relative heading based pattern generation

(In collaboration with Shashank Agarwal and Twinkle Tripathy)

Here an agent with unicycle kinematics generates patterns about a fixed target point using a simple nonlinear control law based on relative heading. The control law is implemented on a Firebird V differential-drive robot. The robot is localised in the experimentation area by a visual feedback system (developed in-house) using a calibrated wide-angle-lens camera, and the time-stamped localisation information is communicated to the robot using XBee modules. The cases considered in the experiments are shown in the adjacent video. The robot densely scans the annular region bounding the generated patterns, so the video shows only parts of the experiments.

Related publication: S. Agarwal, T. Tripathy, A. Borkar and A. Sinha, "Relative Heading based Pattern Generation", Proceedings of 20th IFAC World Congress, Toulouse, France, July 2017, Pages 12759-12764.

Target tracking with range-only information

Here we implement control strategies on differential-drive robots for tracking both stationary and moving targets using only range information, i.e., without bearing information to the target. Candidate controllers, including continuous and switching strategies, have been implemented to achieve this objective, and it is ensured that they adhere to a set of theoretical conditions for the capture of stationary and moving targets. In the experiments in the adjacent video, the Firebird V robot acts as the pursuer and the Spark V robot as the target. The robots are localised by a MATLAB-based visual feedback system that uses four calibrated web cameras to detect unique colored patterns mounted on the robots.

Related publication: G. Chaudhury, A. Sinha, T. Tripathy and A. Borkar, "Conditions for Target Tracking with Range-only Information", Robotics and Autonomous Systems, Volume 75, 2016, Pages 176-186.