Projects

Indoor and outdoor integrated navigation technology (2022)

Visual-inertial odometry for a quadruped robot

We showcase ensemble visual-inertial odometry (EnVIO) running on a quadruped robot. This platform posed significant challenges due to sudden changes in motion, as indicated by the accelerometer measurements shown at the bottom right. We manually controlled the robot in a campus environment using off-the-shelf sensors.

The ground-truth position was recorded with a laser tracker (Leica TS-16). We ran EnVIO in post-processing on an Intel NUC i7, with ~200% CPU usage and ~40 ms of processing time per frame. Over the 96.5 m trajectory, EnVIO achieved a position RMSE of 0.70 m.
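
For reference, the position RMSE here is the usual root-mean-square of the per-sample distance between the estimated and ground-truth positions. A minimal sketch is shown below; it assumes the two trajectories are already time-synchronized and spatially aligned, which is a precondition rather than part of EnVIO itself.

```python
import numpy as np

def position_rmse(estimated, ground_truth):
    """Root-mean-square error between two time-synchronized, spatially aligned
    trajectories of shape (N, 3). Synchronization/alignment against the
    reference track (e.g., the laser-tracker trajectory) is assumed to be
    done beforehand."""
    errors = np.linalg.norm(estimated - ground_truth, axis=1)
    return float(np.sqrt(np.mean(errors ** 2)))
```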

Indoor personal navigation (2020)

Pedestrian dead reckoning and visual-inertial odometry fusion

Fast motion and lens occlusion are challenging for vision-based navigation systems. We tackle this by fusing pedestrian dead reckoning (PDR) with visual-inertial odometry (VIO). PDR estimates the sensor position through step detection and step length estimation, using only measurements from an IMU.
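
The sketch below illustrates this general PDR idea only, not the fused PDR/VIO algorithm: steps are detected as peaks in the accelerometer norm, the step length follows a Weinberg-style model, and the position is advanced along a heading dead-reckoned from the gyroscope. The thresholds, coefficients, and function names are illustrative assumptions, not tuned values from this work.

```python
import numpy as np

# Minimal PDR sketch (illustrative only): peak picking on the accelerometer
# norm for step detection, Weinberg-style step length, and heading from the
# integrated yaw rate.

GRAVITY = 9.81           # m/s^2
PEAK_THRESHOLD = 1.5     # m/s^2 above gravity for a peak to count as a step (assumed)
MIN_STEP_INTERVAL = 0.3  # s, reject peaks that come too fast (assumed)
K_WEINBERG = 0.45        # Weinberg step-length coefficient (assumed)

def pdr_track(acc, gyro_z, t):
    """acc: (N,3) accelerometer [m/s^2], gyro_z: (N,) yaw rate [rad/s], t: (N,) time [s]."""
    pos = np.zeros(2)
    heading = 0.0
    trajectory = [pos.copy()]
    last_step_time = -np.inf
    acc_norm = np.linalg.norm(acc, axis=1)

    window = []  # accelerometer norms seen since the last detected step
    for k in range(1, len(t)):
        heading += gyro_z[k] * (t[k] - t[k - 1])   # dead-reckon heading from the gyro
        window.append(acc_norm[k])

        is_peak = (acc_norm[k - 1] > GRAVITY + PEAK_THRESHOLD
                   and acc_norm[k - 1] >= acc_norm[k]
                   and acc_norm[k - 1] >= acc_norm[max(k - 2, 0)])
        if is_peak and t[k] - last_step_time > MIN_STEP_INTERVAL:
            # Weinberg model: step length ~ K * (a_max - a_min)^(1/4)
            step_len = K_WEINBERG * (max(window) - min(window)) ** 0.25
            pos = pos + step_len * np.array([np.cos(heading), np.sin(heading)])
            trajectory.append(pos.copy())
            last_step_time = t[k]
            window = []
    return np.array(trajectory)
```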

I developed this algorithm together with Dr. Soyoung Park; the work was sponsored by Samsung Electronics through the Samsung Smart Campus Research Center.

Integrated navigation system for Lunar rover (2018)

Visual-inertial navigation system for a Lunar rover

For a future lunar surface exploration mission by a Korean rover, our team, the Navigation and Electronic System LAB at Seoul National University, developed a visual-inertial navigation system based on the seminal multi-state constraint Kalman filter (MSCKF) work of Mourikis and Roumeliotis. This work was sponsored by the Ministry of Science and ICT, Republic of Korea.
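
As a rough illustration of the MSCKF structure in general (a sketch of the published technique, not our exact implementation): whenever an image arrives, the current IMU pose is cloned into a sliding window of camera states, and the error-state covariance is augmented before feature measurements constrain the window. The state layout, dimensions, and names below are assumptions made for the sketch; the real filter also accounts for the IMU-camera extrinsics.

```python
import numpy as np

# Sketch of MSCKF-style state augmentation (illustrative only).
# Assumed error-state layout: [IMU error state (15) | cloned pose errors (6 each)].

IMU_DIM = 15    # attitude(3), velocity(3), position(3), gyro bias(3), accel bias(3)
POSE_DIM = 6    # cloned attitude(3) + cloned position(3)

def augment_state(P, num_clones):
    """Clone the current IMU pose into the state and grow the covariance.

    P: current error-state covariance of size IMU_DIM + 6*num_clones (square).
    Returns the augmented covariance with one more pose clone appended.
    """
    n = IMU_DIM + POSE_DIM * num_clones
    assert P.shape == (n, n)

    # Jacobian of the cloned pose w.r.t. the current state: in this simplified
    # sketch the clone copies the IMU attitude and position blocks directly.
    J = np.zeros((POSE_DIM, n))
    J[0:3, 0:3] = np.eye(3)     # cloned attitude error <- IMU attitude error
    J[3:6, 6:9] = np.eye(3)     # cloned position error <- IMU position error

    # Augmented covariance: P_aug = [[P, P J^T], [J P, J P J^T]]
    top = np.hstack([P, P @ J.T])
    bottom = np.hstack([J @ P, J @ P @ J.T])
    return np.vstack([top, bottom])

# Example: start with only the IMU state, then clone a pose at the first image.
P0 = np.eye(IMU_DIM) * 1e-3
P1 = augment_state(P0, num_clones=0)   # covariance is now (21, 21)
```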

We tested our system in simulated environments and on datasets collected with a MAV and a car. For field testing, we also built a rover platform equipped with a stereo camera, an IMU, and an RTK GPS module for ground truth.

CANSAT competition Korea (2016)

2-DOF camera gimbal stabilizer for CANSAT

As an undergraduate project, my teammates and I took part in the CANSAT competition Korea in 2016. Our primary mission was to track a target on the ground with a gimbaled camera while descending from a 500 m altitude and to compute the CANSAT's position relative to the target, imitating a planetary landing mission. I was responsible for the gimbal stabilizer firmware on the MCU, the vision processing for relative position computation, and the ground station GUI.
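
Conceptually, the stabilizer was a feedback loop on the two gimbal axes; the sketch below shows that general idea as a PID loop driving pitch and roll servos from IMU attitude feedback. It is an illustrative reconstruction rather than the actual MCU firmware: the gains, limits, and helper functions (read_attitude, write_servo) are assumptions.

```python
import time

# Illustrative 2-axis gimbal stabilizer loop (not the actual MCU firmware).
# Each axis runs a PID controller that commands the servo to cancel the body's
# pitch/roll so the camera keeps pointing at the ground target.

class PID:
    def __init__(self, kp, ki, kd, out_limit):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.out_limit = out_limit
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, error, dt):
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt if dt > 0 else 0.0
        self.prev_error = error
        out = self.kp * error + self.ki * self.integral + self.kd * derivative
        return max(-self.out_limit, min(self.out_limit, out))

def stabilizer_loop(read_attitude, write_servo, rate_hz=100.0):
    """read_attitude() -> (pitch, roll) in degrees from the IMU (assumed helper);
    write_servo(axis, angle) commands a servo angle in degrees (assumed helper)."""
    pid_pitch = PID(kp=2.0, ki=0.1, kd=0.05, out_limit=45.0)   # placeholder gains
    pid_roll = PID(kp=2.0, ki=0.1, kd=0.05, out_limit=45.0)
    dt = 1.0 / rate_hz
    while True:
        pitch, roll = read_attitude()
        # Setpoint is 0 deg: command the gimbal to counteract the body's tilt.
        write_servo("pitch", pid_pitch.update(0.0 - pitch, dt))
        write_servo("roll", pid_roll.update(0.0 - roll, dt))
        time.sleep(dt)
```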

We won second prize in the competition!