
Project Page

Constructive Learning for Robots
Amarjot Singh, Srikrishna Karanam, Devinder Kumar


The details of the paper "Constructive Learning for Human-Robot Interaction," published in IEEE Potentials (available at: [Link]), are provided below. The paper received the Best Paper Award (3rd Position, Undergraduate) in the IEEE Region 10 Paper Contest.

- Proposed an algorithm to improve constructive learning for human-robot interaction.

- The learning state (positive or negative) of a subject taught by a tutor is determined by analyzing facial expressions.

- Facial expressions are classified using a Tree Augmented Naive (TAN) Bayes classifier.

- The corrective actions taken by the tutor to improve learning are mimicked by a biped robot.

- The system achieves an average emotion recognition accuracy of 92%.

- The system successfully improved the learning of the human subjects.
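The feedback loop summarized in the bullets above can be sketched as a simple mapping from a recognized expression to a learning state and then to a corrective action for the robot to mimic. This is an illustrative sketch only; the emotion labels, state names, and actions below are hypothetical placeholders, not the ones used in the paper.

```python
# Hypothetical emotion -> learning-state -> corrective-action loop.
# All label and action names here are illustrative assumptions.

POSITIVE_EMOTIONS = {"happy", "surprised"}
NEGATIVE_EMOTIONS = {"sad", "angry", "confused"}

def learning_state(emotion):
    """Map a recognized facial expression to a learning state."""
    if emotion in POSITIVE_EMOTIONS:
        return "positive"
    if emotion in NEGATIVE_EMOTIONS:
        return "negative"
    return "neutral"

def corrective_action(state):
    """Pick a tutor-style corrective action for the robot to mimic."""
    return {
        "negative": "slow down and re-explain",
        "neutral": "continue lesson",
        "positive": "advance to next topic",
    }[state]
```

For example, a subject classified as "confused" would be assigned a negative learning state, prompting the robot to slow down and re-explain the material.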

Algorithm: The proposed algorithm identifies facial expressions using the TAN classifier. This information is then passed to the constructive learning system.
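To give a feel for the classification step, here is a minimal sketch of a discrete Bayes classifier over facial-feature codes. It uses plain Naive Bayes as a simplified stand-in: the TAN classifier used in the paper additionally links each feature to one other feature via a maximum-spanning tree, an augmentation omitted here for brevity. Feature encodings and class labels are assumptions for illustration.

```python
import math
from collections import Counter, defaultdict

class NaiveBayes:
    """Plain Naive Bayes over discrete feature vectors.

    Simplified stand-in for TAN: the tree-augmented feature
    dependencies are omitted; features are treated as conditionally
    independent given the class.
    """

    def fit(self, X, y):
        self.classes = sorted(set(y))
        self.prior = Counter(y)          # class frequencies
        self.n = len(y)
        # (class, feature index) -> Counter of observed feature values
        self.counts = defaultdict(Counter)
        for xs, c in zip(X, y):
            for i, v in enumerate(xs):
                self.counts[(c, i)][v] += 1
        return self

    def predict(self, xs):
        def logp(c):
            lp = math.log(self.prior[c] / self.n)
            for i, v in enumerate(xs):
                cnt = self.counts[(c, i)]
                # add-one smoothing so unseen values get nonzero mass
                lp += math.log((cnt[v] + 1) / (sum(cnt.values()) + len(cnt) + 1))
            return lp
        return max(self.classes, key=logp)
```

Trained on labeled feature vectors, `predict` returns the class with the highest posterior; in the paper's setting the classes would be emotion labels inferred from facial feature points.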

Figure 1: The graphs show (a) the TAN classifier and (b) the graphical relationship between emotion and learning.

Qualitative Results:

The proposed algorithm was tested on videos of three human subjects, recorded in various environments and exhibiting different emotional expressions. The expressions are identified using the TAN classifier, and this information is then passed to the constructive learning system, which uses it to improve the learning of the human subjects.


Figure 2: The figure shows a human subject with different facial expressions. 

Quantitative Results:


Figure 3: The graphs show (a) the learning rate of the subject with respect to the quadrants and (b) the learning rate of the subject with respect to time.

Matlab Code:

- The code can be downloaded from this link: [code].

Terms of Use:

If you choose to use our work in your research, please cite the following paper:

	@article{singh2013constructive,
		author = {Amarjot Singh and Srikrishna Karanam and Devinder Kumar},
		title = {Constructive Learning for Human-Robot Interaction},
		journal = {IEEE Potentials},
		volume = {32},
		number = {4},
		year = {2013},
	}

