CNIT-AST: Research Project



Team: Peter, Jak

Instructor: Dr. Min

[Code] [Video] [Paper]


Blind Guiding Robot


Project Introduction

The project we propose is a robot system that assists visually impaired users with indoor navigation. To achieve this, the robot leads the user by acting as a moving waypoint or checkpoint, giving audio cues that help the user follow the desired path.


Project Motivation and Approach

For the visually impaired, navigating a new environment is extremely challenging. The difficulty of gathering information about the environment can make the experience daunting and discourage visually impaired individuals from going out by themselves. This is where assistive technologies such as robots can provide value, helping visually impaired users navigate their environment. The use of robots for indoor navigation is heavily researched within the fields of robotics and assistive technology. However, the most common approach is to have the robot walk alongside the user, similar to a trained guide dog. One problem with this approach is that it attracts unwanted attention to the visually impaired user and makes the experience uncomfortable.


To avoid attracting unnecessary attention to the user, we decided on a different interaction between the user and the robot. The robot guides visually impaired users through indoor navigation by acting as a series of waypoints along the path. During use, the robot moves to a waypoint a short distance ahead of the user. As the user approaches, the robot advances to the next waypoint, repeating this process until the user reaches the destination. Because the robot stays detached from the user, it avoids drawing extra attention while delivering a semi-assisted navigation experience in which the user can also become more familiar with the environment. A minimal sketch of this interaction loop follows.
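To make this interaction concrete, the following minimal Python sketch walks through the waypoint-leading loop. The Robot and User classes, the approach radius, and the audio-cue phrases are illustrative assumptions standing in for the real robot's subsystems, not the project's actual implementation.

    import math

    APPROACH_RADIUS_M = 1.5  # assumed distance at which the user "reaches" the robot

    class Robot:
        """Toy stand-in: teleports between waypoints and prints audio cues."""
        def __init__(self):
            self.pos = (0.0, 0.0)
        def drive_to(self, waypoint):
            self.pos = waypoint
        def say(self, text):
            print(f"[robot audio] {text}")

    class User:
        """Toy stand-in: steps a fixed distance toward a target each tick."""
        def __init__(self):
            self.pos = (0.0, 0.0)
        def step_toward(self, target, step=0.5):
            d = math.dist(self.pos, target)
            t = min(step / d, 1.0) if d > 0 else 1.0
            self.pos = (self.pos[0] + t * (target[0] - self.pos[0]),
                        self.pos[1] + t * (target[1] - self.pos[1]))

    def guide_user(robot, user, waypoints):
        """Lead the user along (x, y) waypoints, one short hop at a time."""
        for waypoint in waypoints:
            robot.drive_to(waypoint)                   # hop a short distance ahead
            robot.say("Please walk toward my voice.")  # audio cue for the user
            # Hold position until the user closes in, then advance.
            while math.dist(robot.pos, user.pos) > APPROACH_RADIUS_M:
                robot.say("Over here.")                # periodic audio cue
                user.step_toward(robot.pos)            # simulated user movement
        robot.say("You have arrived at your destination.")

    guide_user(Robot(), User(), [(2.0, 0.0), (4.0, 1.0), (6.0, 3.0)])

The essential design choice shows in the inner loop: the robot is never tethered to the user; it only monitors the user's distance and keeps emitting audio cues until the user catches up.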

Team Members

Peter


Peter is a graduate student in the Department of Computer and Information Technology at Purdue University, supervised by Dr. Min. He is focusing on developing robotics and human-robot interaction technology.

Jak

Jak is a graduate student in User Experience Design at Purdue University with an undergraduate degree in Mechanical Engineering, also from Purdue. He is a UX designer who is passionate about emerging technologies and making a positive impact for users and customers. He transitioned from Mechanical Engineering to UX Design because he aspires to merge product design with his engineering background, going beyond the data sheet to design with purpose and compassion for the user.

Methodology

Literature Review (Blog Post 2)

We conducted a literature review of related work (see References [1]–[8]).

We developed several navigation algorithms (such as Rapidly-exploring Random Trees, or RRT) in the simulator; a minimal sketch follows.
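As a concrete illustration of this planner family, below is a minimal 2D RRT sketch in Python. The step size, goal tolerance, sampling bounds, and the always-true collision check are assumptions for illustration; our simulator uses its own map representation and collision checking.

    import math
    import random

    STEP = 0.5         # assumed extension step size (meters)
    GOAL_RADIUS = 0.5  # assumed goal tolerance (meters)

    def collision_free(p, q):
        # Placeholder: a real check would test the segment p-q against the map.
        return True

    def rrt(start, goal, bounds, max_iters=5000):
        """Grow a tree from start until a node lands within GOAL_RADIUS of goal."""
        nodes = [start]
        parent = {start: None}
        for _ in range(max_iters):
            sample = (random.uniform(*bounds[0]), random.uniform(*bounds[1]))
            nearest = min(nodes, key=lambda n: math.dist(n, sample))
            d = math.dist(nearest, sample)
            if d == 0:
                continue
            # Steer from the nearest node toward the sample by at most STEP.
            t = min(STEP / d, 1.0)
            new = (nearest[0] + t * (sample[0] - nearest[0]),
                   nearest[1] + t * (sample[1] - nearest[1]))
            if not collision_free(nearest, new):
                continue
            nodes.append(new)
            parent[new] = nearest
            if math.dist(new, goal) <= GOAL_RADIUS:
                # Walk back up the tree to recover the path.
                path = [new]
                while parent[path[-1]] is not None:
                    path.append(parent[path[-1]])
                return list(reversed(path))
        return None  # no path found within the iteration budget

    path = rrt((0.0, 0.0), (8.0, 8.0), bounds=((0.0, 10.0), (0.0, 10.0)))
    print(f"Found a path with {len(path)} waypoints" if path else "No path found")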

We also worked on training the model for the robot's user recognition system, as well as a VR environment for testing human-robot interaction.

User Detection
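As a sketch of the person-detection step underlying user recognition, the snippet below runs a pretrained object detector on a camera frame and keeps only boxes of the COCO "person" class. The ultralytics package, the yolov8n.pt weights, the camera index, and the confidence threshold are stand-ins chosen for illustration (our references include YOLOv7 [7]); this is not the trained system itself.

    import cv2
    from ultralytics import YOLO

    PERSON_CLASS_ID = 0   # "person" in the COCO label set
    CONF_THRESHOLD = 0.5  # assumed minimum detection confidence

    def detect_people(frame, model):
        """Return (x1, y1, x2, y2, confidence) for each detected person."""
        results = model(frame, verbose=False)[0]
        people = []
        for box in results.boxes:
            if int(box.cls) == PERSON_CLASS_ID and float(box.conf) >= CONF_THRESHOLD:
                x1, y1, x2, y2 = box.xyxy[0].tolist()
                people.append((x1, y1, x2, y2, float(box.conf)))
        return people

    model = YOLO("yolov8n.pt")  # assumed pretrained COCO weights
    cap = cv2.VideoCapture(0)   # assumed camera index
    ok, frame = cap.read()
    if ok:
        for x1, y1, x2, y2, conf in detect_people(frame, model):
            print(f"person at ({x1:.0f}, {y1:.0f})-({x2:.0f}, {y2:.0f}), conf={conf:.2f}")
    cap.release()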


Plan

Project Schedule

Week 6:

  • Literature Review


Week 7 (This week):

  • Proposal Presentation

  • Project Proposal Report


Week 8–10:

  • Develop navigation algorithm

  • Test navigation algorithm within a simulation


Week 11–12:

  • Test navigation algorithm in the real world

  • Develop features for human recognition for waypoint system


Week 13:

  • Testing

  • Begin final report


Week 14:

  • Finish final report



References

[1]. Wolfram Burgard, Armin B. Cremers, Dieter Fox, Dirk Hähnel, Gerhard Lakemeyer, Dirk Schulz, Walter Steiner, and Sebastian Thrun. 1998. The Interactive Museum Tour-Guide Robot. In Proceedings of the AAAI Conference on Artificial Intelligence (AAAI), 11–18.

[2]. Sebastian Thrun, Michael Beetz, Maren Bennewitz, Wolfram Burgard, Armin B. Cremers, Frank Dellaert, Dieter Fox, Dirk Hähnel, Chuck Rosenberg, Nicholas Roy, Jamieson Schulte, and Dirk Schulz. 2000. Probabilistic Algorithms and the Interactive Museum Tour-Guide Robot MINERVA. The International Journal of Robotics Research 19, 11 (2000), 972–999.

[3]. SPENCER. 2016. http://www.spencer.eu. Accessed 2021-02-18.

[4]. CROWDBOT. 2021. http://www.crowdbot.eu. Accessed 2021-02-18.

[5]. P. Naughton and K. Hauser. 2022. Structured Action Prediction for Teleoperation in Open Worlds. IEEE Robotics and Automation Letters 7, 2 (April 2022), 3099–3105. doi: 10.1109/LRA.2022.3145953.

[6]. Simon Ging, Mohammadreza Zolfaghari, Hamed Pirsiavash, and Thomas Brox. 2020. COOT: Cooperative Hierarchical Transformer for Video-Text Representation Learning. In Advances in Neural Information Processing Systems 33 (NeurIPS 2020), 22605–22618.

[7]. Chien-Yao Wang, Alexey Bochkovskiy, and Hong-Yuan Mark Liao. 2022. YOLOv7: Trainable Bag-of-Freebies Sets New State-of-the-Art for Real-Time Object Detectors. arXiv preprint arXiv:2207.02696.

[8]. Ashesh Jain, Amir R. Zamir, Silvio Savarese, and Ashutosh Saxena. 2016. Structural-RNN: Deep Learning on Spatio-Temporal Graphs. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR).