DPPE: Dense Pose Estimation in a Plenoxels Environment using Gradient Approximation

Christopher Kolios*, Yeganeh Bahoo*, and Sajad Saeedi*

* Toronto Metropolitan University


Coming Soon

Animation: pose estimation progress at Epoch 0, Epoch 100, and Epoch 999.

Abstract

We present DPPE, a dense pose estimation algorithm that operates over a Plenoxels environment. Recent advances in neural radiance field techniques have shown that they are a powerful tool for environment representation, and more recent neural rendering algorithms have significantly improved both training duration and rendering speed. Plenoxels introduced a fully differentiable radiance field technique that renders from plenoptic volume elements stored in voxels, offering reduced training times and improved rendering accuracy while eliminating the neural network component entirely. In this work, we introduce a 6-DoF, monocular, RGB-only pose estimation procedure for Plenoxels, which seeks to recover the ground-truth camera pose after a perturbation. We employ a variation of classical template matching, using stochastic gradient descent to optimize the pose by minimizing the re-rendering error. In particular, we examine an approach that exploits the rapid rendering speed of Plenoxels to numerically approximate part of the pose gradient via central differencing. We show that such methods are effective for pose estimation. Finally, we perform ablations over key components of the problem space, with a particular focus on image subsampling and Plenoxel grid resolution.
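The central-differencing idea from the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function and variable names are our own, and the photometric re-rendering loss is replaced by a simple quadratic stand-in so the approximated gradient can be checked analytically.

```python
import numpy as np

def central_difference_grad(loss_fn, pose, eps=1e-4):
    """Numerically approximate d(loss)/d(pose), one parameter at a time.

    For each pose parameter p_i (e.g. 3 translation + 3 rotation for 6-DoF),
    g_i ≈ (L(p + eps·e_i) − L(p − eps·e_i)) / (2·eps),
    so each optimization step costs two re-renders per parameter.
    """
    grad = np.zeros_like(pose)
    for i in range(pose.size):
        p_plus, p_minus = pose.copy(), pose.copy()
        p_plus[i] += eps
        p_minus[i] -= eps
        grad[i] = (loss_fn(p_plus) - loss_fn(p_minus)) / (2.0 * eps)
    return grad

# Stand-in for the re-rendering error: a quadratic in the 6-DoF pose
# whose analytic gradient is 2·(pose − target_pose).
target_pose = np.array([0.1, -0.2, 0.3, 0.01, -0.02, 0.03])
loss = lambda p: float(np.sum((p - target_pose) ** 2))

pose = np.zeros(6)
g = central_difference_grad(loss, pose)
```

Because each of the twelve probes is a full render, this approach only becomes practical with a fast renderer such as Plenoxels, which is the point the abstract makes.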

Video

Method

A visualization of the analysis pipeline; a star indicates a significant contribution of this work to the corresponding module. First, a Plenoxels grid is trained for a scene. A ground-truth pose is then perturbed, with the extent of the perturbation depending on the test being run. The trained Plenoxels grid, the ground-truth pose, and the perturbed pose are passed into the pose estimation process, which outputs the final pose after optimization. The scene can be rendered from any pose to visualize camera position and orientation.
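The perturb-then-optimize loop above can be sketched as below. This is a toy, assumption-laden version: the real method subsamples image pixels (making the descent stochastic) and evaluates a photometric loss against renders from the trained Plenoxels grid, whereas here the loss is a quadratic stand-in and the descent is plain gradient descent with central-difference gradients.

```python
import numpy as np

def estimate_pose(loss_fn, perturbed_pose, lr=0.1, steps=200, eps=1e-4):
    """Recover a camera pose by descending the re-rendering loss,
    starting from the perturbed pose. Gradients are obtained by
    central differencing over each pose parameter."""
    pose = np.asarray(perturbed_pose, dtype=float).copy()
    for _ in range(steps):
        grad = np.zeros_like(pose)
        for i in range(pose.size):
            p_plus, p_minus = pose.copy(), pose.copy()
            p_plus[i] += eps
            p_minus[i] -= eps
            grad[i] = (loss_fn(p_plus) - loss_fn(p_minus)) / (2.0 * eps)
        pose -= lr * grad
    return pose

# Toy stand-in for "render from the trained grid and compare images":
# the loss is minimized exactly at the ground-truth pose.
ground_truth = np.array([0.5, -0.1, 0.2, 0.02, 0.0, -0.01])
loss = lambda p: float(np.sum((p - ground_truth) ** 2))

perturbed = ground_truth + 0.05  # simulated pose perturbation
recovered = estimate_pose(loss, perturbed)
```

In the actual pipeline the loss landscape is non-convex, so the size of the initial perturbation and the learning-rate schedule matter far more than in this convex toy example.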

Contact

If you have any questions, feel free to reach out to us via the following emails: {ckolios, bahoo, s.saeedi}@torontomu.ca

BibTex

TBD