My research interests are in the fields of Haptic Perception, Tactile Sensing, Machine Learning, Manipulation, and Human-Robot Interaction. I am particularly interested in the domain of assistive robotics. I have also actively worked on Teleoperation Systems, Control Systems, and Psychophysics.

1. Rapid Haptic Perception using Force and Thermal Sensing

Tactile sensing can enable a robot to infer properties of its surroundings. Recent research has focused on robots that haptically perceive the world through exploratory behaviors lasting tens of seconds. During manipulation, however, many opportunities arise for robots to gather information about the environment from brief (<= 2 seconds) contact resulting from simple motions (e.g., linear). The goal of our work was to enable robots to infer haptic properties under these conditions using force and thermal sensing.

We used a data-driven approach with various machine learning methods. Key challenges were obtaining adequate haptic data for training and developing methods that performed well on haptic data that differed from the training data due to common real-world phenomena. For example, haptic sensory signals vary significantly due to the robot, including its velocity, stiffness, and sensor temperature.

To collect suitable data, we used a variety of platforms, including simplified robots, handheld human-operated devices, and a mobile robot. We also generated synthetic data with physics-based models. Through careful empirical evaluation, we identified machine learning methods that better handled common signal variations. We also used physics-based models to characterize common perceptual ambiguities and predict the performance of data-driven methods. Overall, our research demonstrates the feasibility of robots inferring haptic properties from brief contact with objects in human environments. By using force and thermal sensing, our methods rapidly recognized materials, detected when objects moved, detected contact with people, and inferred other properties of the robot’s surroundings.
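As a deliberately simplified illustration of this kind of data-driven pipeline, the Python sketch below trains a classifier on force and heat-transfer features extracted from short synthetic contacts. The synthetic signal model, material names, feature set, and SVM classifier are all assumptions chosen for illustration; they are not the exact models or methods used in the dissertation.

```python
# Illustrative sketch only: recognize a material from a brief (<= 2 s) contact
# using simple force and heat-transfer features and an SVM.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
DT, T_CONTACT = 0.01, 2.0                        # 100 Hz sampling, 2-second contact
t = np.arange(0.0, T_CONTACT, DT)

def synth_trial(material):
    """Toy physics-inspired signals: the warm sensor cools toward the object at a
    material-dependent rate; contact force follows a noisy ramp-and-hold."""
    rate = {"metal": 1.5, "wood": 0.4, "fabric": 0.15}[material]
    temp = 30.0 - 5.0 * (1 - np.exp(-rate * t)) + 0.05 * rng.standard_normal(t.size)
    force = 3.0 * np.clip(t / 0.5, 0.0, 1.0) + 0.1 * rng.standard_normal(t.size)
    return temp, force

def features(temp, force):
    """Summarize one brief contact: thermal transient plus simple force statistics."""
    dT = temp - temp[0]
    slope = np.polyfit(t, dT, 1)[0]
    return [dT[-1], slope, force.max(), force.mean(), force.std()]

materials = ["metal", "wood", "fabric"]
X = np.array([features(*synth_trial(m)) for m in materials for _ in range(30)])
y = np.array([m for m in materials for _ in range(30)])

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0)).fit(X, y)
print(clf.predict([features(*synth_trial("wood"))]))     # expect ['wood']
```

In practice, our methods also had to handle variation in robot velocity, stiffness, and initial sensor temperature, which a toy model like this ignores.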

Link to Ph.D. Dissertation coming soon!

Some relevant videos below:

  • The video shows real-time sparse haptic map generation using whole-arm tactile skin while the robot reaches into a cluttered environment to grab a set of keys. Initially, the robot has no knowledge of the environment and uses model-predictive control to reach its goal while keeping contact forces below a threshold. During the initial reach attempt, it contacts trunks and other rigid objects, which it identifies using HMMs and marks with brown dots in the haptic map (right). After this attempt fails, the robot updates its knowledge of the environment with the mapped locations of the rigid objects. Using the updated map, the robot then avoids them with a motion planner and successfully reaches the goal and grabs the keys.
  • The video is a collection of slides from my talk at RSS 2015 on heat-transfer-based recognition of materials for short-duration contact and varying initial conditions.
  • The video shows the performance of a stretchable fabric-based force and thermal sensing skin for haptic perception during common non-prehensile manipulation tasks.
  • The video shows a mobile robot with a linear actuator touching various objects relevant to ADLs (activities of daily living) and IADLs (instrumental activities of daily living) in the bathroom, bedroom, and kitchen of a household. A multimodal sensor attached at the end of the linear actuator collects force and thermal signals during contact with each object, and an encoder at the other end of the actuator records the kinematic motion signals. The video shows examples of the robot collecting data at varying velocities and times of day over a period of three days. The experiments were performed on the objects as they were found (in situ). These signals were used for multimodal tactile perception of objects in the home.
  • The video shows an overhead view of our robot Cody with a stretchable, flexible tactile sensor array on its forearm and end-effector. The robot reaches through clutter instrumented with force-torque sensors (for ground-truth contact forces) to a pre-specified goal location. Using tactile sensing, including sensing that covers the articulated joints, the robot succeeds; without tactile sensing, it fails to reach the goal.


  • The video shows rapid categorization using HMMs while the robot reaches into clutter made of trunks and leaves. The robot uses data from the forearm tactile skin for online categorization. A taxel (tactile pixel) is marked with a green dot if it is categorized as a leaf and with a brown dot if it is categorized as a trunk. A minimal sketch of this kind of per-taxel HMM categorization appears after this list.


  • The video shows the performance of online haptic classification using information from incidental contact during goal-directed motion. The classification uses features such as maximum force, contact area, and contact motion, obtained from the artificial skin attached to the forearm of the robot Cody. The right side of the video shows an image representation of the unrolled taxel array, in which darker pixels indicate higher forces.


  • The video shows the performance of a PR2 robot equipped with a fabric-based tactile skin during goal-directed motion in unknown cluttered environments, using online haptic classification and a bidirectional RRT planner. The RRT planner initially has no knowledge of the environment. If the robot contacts an obstacle during the motion and the classification algorithm determines it to be fixed, the planner returns the robot to the initial position and re-plans with the updated knowledge of the environment. The robot thus builds a haptic map of the environment while reaching for the goal; obstacles classified as movable are ignored by the planner. The video includes two cases with the same goal but different starting positions to show the PR2's performance.
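The per-taxel HMM categorization referenced in the trunk/leaf videos above can be sketched roughly as follows: train one HMM per category on labeled contact-force sequences and categorize a new sequence by comparing log-likelihoods. The synthetic signal shapes, the hmmlearn library choice, and the model sizes below are illustrative assumptions, not the exact implementation used on the robot.

```python
# Illustrative sketch: categorize a taxel's force sequence as "trunk" (rigid)
# or "leaf" (compliant) by comparing log-likelihoods under two trained HMMs.
import numpy as np
from hmmlearn.hmm import GaussianHMM

rng = np.random.default_rng(1)
t = np.linspace(0, 1, 50)

def synth_seq(kind):
    """Toy contact-force sequence for a single taxel (1 feature per time step)."""
    if kind == "trunk":                        # rigid object: force ramps up and holds
        f = 4.0 * np.clip(t / 0.3, 0, 1)
    else:                                      # leaf: brief, low, decaying force
        f = 0.8 * np.exp(-5 * t)
    return (f + 0.1 * rng.standard_normal(t.size)).reshape(-1, 1)

def fit_hmm(kind, n_seq=20):
    seqs = [synth_seq(kind) for _ in range(n_seq)]
    X, lengths = np.vstack(seqs), [len(s) for s in seqs]
    return GaussianHMM(n_components=3, covariance_type="diag", n_iter=50).fit(X, lengths)

models = {kind: fit_hmm(kind) for kind in ("trunk", "leaf")}

def categorize(seq):
    # Pick the category whose HMM assigns the higher log-likelihood.
    return max(models, key=lambda k: models[k].score(seq))

print(categorize(synth_seq("leaf")))           # expect 'leaf'
```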


2. Combining Tactile Sensing and Vision for Haptic Mapping

We consider the problem of enabling a robot to efficiently obtain a dense haptic map of its visible surroundings using the complementary properties of vision and tactile sensing. Our approach assumes that visible surfaces that look similar to one another are likely to have similar haptic properties.
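The sketch below illustrates only this core assumption, in a deliberately simplified form: sparse haptic labels observed at a few contacted pixels are spread to visually similar pixels using a color-similarity weight. The actual system (see the videos below) uses a dense conditional random field on data from a tactile sleeve and a Kinect; the function and image here are hypothetical.

```python
# Minimal sketch of "visually similar surfaces likely share haptic properties",
# not the dense-CRF formulation used on the robot.
import numpy as np

def propagate_haptic_labels(image, contact_px, contact_labels, n_labels, sigma=0.1):
    """image: (H, W, 3) float RGB in [0, 1]; contact_px: list of (row, col);
    contact_labels: haptic class index observed at each contacted pixel.
    Returns per-pixel class probabilities of shape (H, W, n_labels)."""
    H, W, _ = image.shape
    probs = np.full((H, W, n_labels), 1e-6)
    for (r, c), label in zip(contact_px, contact_labels):
        # Color distance of every pixel to the touched pixel -> similarity weight.
        d2 = np.sum((image - image[r, c]) ** 2, axis=-1)
        probs[:, :, label] += np.exp(-d2 / (2 * sigma ** 2))
    return probs / probs.sum(axis=-1, keepdims=True)

# Tiny example: left half "soft" (green-ish), right half "rigid" (gray).
img = np.zeros((4, 8, 3)); img[:, :4] = [0.1, 0.6, 0.1]; img[:, 4:] = [0.5, 0.5, 0.5]
p = propagate_haptic_labels(img, [(0, 1), (0, 6)], [0, 1], n_labels=2)
print(p.argmax(axis=-1))    # left columns -> 0 (soft), right columns -> 1 (rigid)
```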

Some relevant videos below:

  • The robot 'DARCI' uses a tactile sleeve and a Kinect to create a dense haptic map while reaching into a cluttered environment. As the robot comes in incidental contact with the objects in the environment, it acquires local haptic information using the sleeve and propagates the local information to update its estimate of the haptic properties of the visible surface using the Kinect.
  • The video summarizes the basic idea, i.e., combining vision and touch using a dense conditional random field (CRF). It also shows a short clip of the robot DARCI generating a dense haptic map while reaching into a cluttered environment. Our method combines the data from a tactile-sensing sleeve and a Kinect using the dense CRF.


3. Antagonistic Muscle-based Robot Control for Physical Interactions

Robots are ever more present in human environments, and effective physical human-robot interaction is essential to many applications. To a person, however, these interactions rarely feel biological or equivalent to human-human interaction. Our goal is to make robots feel more human-like, in the hope of enabling more natural human-robot interaction. In this work, we examine a novel biologically inspired control method that emulates antagonistic muscle pairs based on a nonlinear Hill model. The controller captures the muscle properties and dynamics and is driven solely by muscle activation levels. A human-robot experiment compares this approach to PD and PID controllers with equivalent impedances, as well as to direct human-human interaction. The results show the promise of driving motors like muscles and allowing users to experience robots much as they experience humans.
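A rough sketch of the kind of antagonistic Hill-type muscle pair the controller emulates is given below: each muscle's force is its activation times a maximum force scaled by force-length and force-velocity curves, plus a passive term, and the joint torque is the difference of the two muscle forces acting through a moment arm. The particular curves, parameter values, and moment-arm model are illustrative assumptions, not the paper's nonlinear Hill model.

```python
# Illustrative antagonistic Hill-type muscle pair driving a single joint.
import numpy as np

def hill_force(a, l, v, f_max=100.0, l_opt=1.0, v_max=1.0):
    """Hill-type muscle force: activation * F_max * f_l(length) * f_v(velocity),
    plus a simple passive elastic term when stretched beyond optimal length.
    v is the shortening velocity (positive when the muscle contracts)."""
    f_l = np.exp(-((l - l_opt) / 0.45) ** 2)               # force-length curve (Gaussian)
    if v >= 0.0:                                            # shortening (concentric)
        f_v = max(0.0, (v_max - v) / (v_max + 3.0 * v))
    else:                                                   # lengthening (eccentric)
        f_v = 1.5 - 0.5 * v_max / (v_max - 7.5 * v)
    passive = 20.0 * max(l - l_opt, 0.0) ** 2
    return a * f_max * f_l * f_v + passive

def joint_torque(a_flex, a_ext, q, dq, r=0.03):
    """Net torque from an antagonistic pair acting through moment arm r [m];
    muscle length and shortening velocity are linear functions of the joint state."""
    l_flex, l_ext = 1.0 - r * q, 1.0 + r * q
    v_flex, v_ext = r * dq, -r * dq
    return r * (hill_force(a_flex, l_flex, v_flex) - hill_force(a_ext, l_ext, v_ext))

print(joint_torque(a_flex=0.6, a_ext=0.2, q=0.3, dq=0.0))   # net flexion torque [N*m]
```

In the actual controller, the muscle activation levels are the only control inputs, which is what the sketch's a_flex and a_ext arguments stand in for.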

4. Haptic Interaction during Partner Dance-based Exercise for Physical Assistance

Our long-term goal in this work is to enable a robot to engage in partner dance for use in rehabilitation therapy, assessment, diagnosis, and scientific investigations of two-person whole-body motor coordination. Partner dance has been shown to improve balance and gait in people with Parkinson's disease and in older adults, which motivates our work. During partner dance, couples rely heavily on haptic interaction to convey motor intent such as speed and direction. We conducted human-subject studies to evaluate the feasibility of a wheeled mobile robot as a partnered stepping platform and to analyze its acceptance among older adults for physical assistance.

Some relevant videos below:

  • Here, we investigate the potential for a wheeled mobile robot with a human-like upper-body to perform partnered stepping with people based on the forces applied to its end effectors. Blindfolded expert dancers performed a forward/backward walking step to a recorded drum beat while holding the robot's end effectors. We varied the admittance gain of the robot's mobile base controller and the stiffness of the robot's arms. High admittance gain and high arm stiffness conditions resulted in significantly improved performance with respect to subjective and objective measures. Biomechanical measures such as the human hand to human sternum distance, center-of-mass of leader to center-of-mass of follower (CoM-CoM) distance, and interaction forces correlated with the expert dancers' subjective ratings of their interactions with the robot. In response to a final questionnaire, 1/10 expert dancers strongly agreed, 5/10 agreed, and 1/10 disagreed with the statement "The robot was a good follower." 2/10 strongly agreed, 3/10 agreed, and 2/10 disagreed with the statement "The robot was fun to dance with." The remaining participants were neutral with respect to these two questions.
  • To potentially facilitate healthy aging by engaging older adults in partner dance-based exercise, older adults would need to be accepting of partner dancing with a robot. Using methods from the technology acceptance literature, we conducted a study with healthy older adults to investigate their acceptance of robots for partner dance-based exercise. Participants were generally accepting of the robot, tending to perceive it as useful, easy to use, and enjoyable. Through a qualitative analysis of structured interview data, we also identified facilitators of and barriers to acceptance. Throughout the study, our robot used admittance control to successfully dance with older adults, demonstrating the feasibility of this method; a minimal sketch of this kind of controller follows below. Overall, our results suggest that robots could successfully engage older adults in partner dance-based exercise.
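The sketch below shows a minimal one-dimensional admittance controller of the kind used to drive the robot's mobile base from the forces a human partner applies at its end effectors. The virtual mass and damping values (and the single-axis simplification) are assumptions for illustration, not the gains used on the actual robot.

```python
# Minimal 1-D admittance-control sketch: the follower's base velocity is driven
# by the force the human leader applies at the robot's end effectors.
def admittance_step(v, f_hand, dt, m_virtual=20.0, d_virtual=40.0):
    """Virtual mass-damper admittance: m*dv/dt + d*v = f  ->  commanded base velocity."""
    dv = (f_hand - d_virtual * v) / m_virtual
    return v + dv * dt

# Leader pushes forward with 15 N for 1 s, then releases.
dt, v = 0.01, 0.0
trace = []
for k in range(200):
    f = 15.0 if k < 100 else 0.0
    v = admittance_step(v, f, dt)
    trace.append(v)
print(f"peak velocity {max(trace):.3f} m/s, final velocity {trace[-1]:.3f} m/s")
```

Increasing the admittance gain (here, lowering the virtual damping) makes the base respond more readily to the leader's forces, which corresponds to the high-admittance-gain condition described above.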

5. Bio-inspired Robot Arm and Hand Coordinated Grasping/Manipulation Control

We developed a novel control law that exhibits human-motion characteristics in redundant robot arm systems for reaching tasks. The method eliminates the need to compute the pseudo-inverse of the Jacobian and does not require formulating or optimizing an artificial performance index. The proposed control law models the time-varying properties of muscle stiffness and damping as well as the low-pass filter characteristics of human muscles. It uses a time-varying damping shaping matrix and a bijective joint-muscle mapping function to describe human reaching characteristics such as a quasi-straight-line end-effector trajectory and a symmetric bell-shaped velocity profile. Self-motion and repeatability, which are inherent in human motion, are also analyzed and successfully modeled with the proposed method. Experimental results show the efficacy of the algorithm in describing human-motion characteristics. We also performed extensive simulations that extend the control law to hand-arm coordination in reach-to-grasp tasks for objects of different shapes and sizes.
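One common way to avoid the Jacobian pseudo-inverse in redundant-arm reaching is a Jacobian-transpose law with joint-space damping; the sketch below uses that structure together with a simple time-varying damping matrix to suggest the flavor of the approach. The 3-link planar arm, the gains, and the linear damping schedule are assumptions; the paper's damping shaping matrix and bijective joint-muscle mapping are not reproduced here.

```python
# Hedged sketch: Jacobian-transpose reaching with time-varying joint damping
# on a redundant 3-link planar arm (unit joint inertia for simplicity).
import numpy as np

L = np.array([0.4, 0.3, 0.2])                       # link lengths [m]

def fk(q):
    angles = np.cumsum(q)
    return np.array([np.sum(L * np.cos(angles)), np.sum(L * np.sin(angles))])

def jacobian(q):
    angles = np.cumsum(q)
    J = np.zeros((2, 3))
    for i in range(3):
        J[0, i] = -np.sum(L[i:] * np.sin(angles[i:]))
        J[1, i] = np.sum(L[i:] * np.cos(angles[i:]))
    return J

def reach(q, x_goal, T=3.0, dt=0.001, kp=60.0):
    dq = np.zeros(3)
    for k in range(int(T / dt)):
        s = min(k * dt / T, 1.0)
        D = np.eye(3) * (2.0 + 8.0 * s)             # time-varying damping, increasing over the reach
        tau = jacobian(q).T @ (kp * (x_goal - fk(q))) - D @ dq   # no pseudo-inverse needed
        dq += tau * dt
        q = q + dq * dt
    return q

q_final = reach(np.array([0.3, 0.3, 0.3]), x_goal=np.array([0.5, 0.4]))
print(np.round(fk(q_final), 3))                     # end-effector should be near [0.5, 0.4]
```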

Some relevant videos below:

  • The video shows the performance of a hand-arm system coordinating to reach and grasp objects of different shapes. The arm is a 7-DOF redundant system and the hand is a 4-fingered, 12-DOF hand. The reach-to-grasp task is carried out using my newly developed control law as described here. The simulation is carried out using the RoboticsLab software.
  • The video shows the performance of the same 7-DOF robot arm in executing human-like reaching motion with compliance. It exhibits a quasi-straight-line end-effector trajectory and a symmetric bell-shaped velocity profile.


  • The video shows the performance of a 7-DOF redundant robot arm in maintaining its end-effector position while external disturbances (the red lines in the video correspond to forces applied with the mouse) perturb the robot's motion. I implemented a task-space disturbance observer to analyze its effect on the null-space motion of the 7-DOF arm.


  • The video shows the system characterization of a 7-DOF robot arm at KIST, South Korea. The arm's performance is compared without compensation and with friction and gravity compensation using my method.


6. Analysis of Stability and Performance of Telesurgical Systems

Telerobotic surgical systems, although researched extensively, still have stability and performance issues, largely because previous stability and performance analyses gave little consideration to hardware and environment characteristics or to practical surgical constraints. This work examines stability in greater depth by considering the effect of surgical soft-tissue environments and the damping of the master device, and hence employs a passivity approach that relates the haptic device damping to the environment characteristics. Several models have been proposed to describe the visco-elastic behavior of soft tissues, including the popular Maxwell and Voigt models, but the choice of the best soft-tissue model remains unresolved. This research provides a passivity analysis of teleoperation with soft tissues based on the Kelvin model, which overcomes the disadvantages of the Maxwell and Voigt models. The analysis yields a new criterion for the design of the haptic interface as well as the controller, and the effect of the discretization method on the passivity criterion is also discussed. Simulation results show that this criterion enlarges the domain of environments with which the haptic master can interact passively. It is therefore less conservative than the criterion derived from the popular Voigt environment model, meaning that less device damping is required to maintain passivity of the teleoperation system, which in turn improves performance.
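For context, the classical sampled-data passivity condition for a haptic device rendering a Voigt-type (spring-damper) environment requires the physical device damping b to satisfy roughly b >= K*T/2 + B, where K and B are the environment stiffness and damping and T is the sampling period. The small numeric example below uses this well-known Voigt-based condition, with illustrative numbers, as the baseline that a less conservative Kelvin-based criterion would improve upon; the thesis's Kelvin-model criterion itself is not reproduced here.

```python
# Worked example of the classical Voigt-based passivity condition (Colgate-style),
# used here only as the baseline for comparison; numbers are illustrative.
K = 500.0    # environment stiffness [N/m]
B = 2.0      # environment damping   [N*s/m]
T = 0.001    # sampling period       [s]

b_required = K * T / 2 + B   # minimum physical device damping for passivity
print(f"device damping must satisfy b >= {b_required:.2f} N*s/m")
```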

Another important issue in the performance of telesurgical systems is the practical constraint on force feedback: it is difficult to use force sensors in surgical environments because of inherent noise and limitations on where sensors can be deployed. A scheme that replaces the direct use of force sensors on the slave side may alleviate this problem. This work therefore develops a method for environment force estimation based on a disturbance observer. Simulation results show the efficacy of the proposed method: the slave device successfully tracks the position of the master device, and the estimation error quickly becomes negligible. Experimental results validate the theory and the simulation outcomes.
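A minimal sketch of sensorless force estimation with a disturbance observer is shown below for a one-degree-of-freedom slave, using a momentum-observer form with assumed parameters (it is not the thesis's exact design). The estimate converges to the unknown environment force using only the commanded actuator force and the measured velocity.

```python
# Toy 1-DOF disturbance observer: estimate the environment force without a force sensor.
import numpy as np

m, dt, k_obs = 0.5, 0.001, 200.0     # slave mass [kg], time step [s], observer gain [1/s]
v, p_hat, f_hat = 0.0, 0.0, 0.0      # velocity, estimated momentum, estimated force

for k in range(2000):
    t = k * dt
    u = 2.0 * np.sin(2 * np.pi * t)            # commanded actuator force [N]
    f_env = -3.0 if t > 0.5 else 0.0           # true (unknown) environment force [N]
    v += (u + f_env) / m * dt                  # true slave dynamics: m*dv/dt = u + f_env
    p_hat += (u + f_hat) * dt                  # observer integrates command + current estimate
    f_hat = k_obs * (m * v - p_hat)            # estimate = gain * momentum residual

print(f"estimated environment force: {f_hat:.2f} N (true: -3.00 N)")
```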

Link to M.S. Thesis