Brandon J. Woodard

About Me

Brandon Woodard is a Ph.D. fellow at Brown University and a graduate researcher in the Brown Visual Computing group, advised by Dr. Jeff Huang.

Prior to attending Brown, he received his B.S. in Computer Science from CSUMB and interned at Nvidia, MIT Lincoln Laboratory, NASA's Jet Propulsion Laboratory, and Intel. He is interested in inferring user intent and the design of user-centered systems. His current research focuses on inferring user intent in augmented reality.

Twitter: @channelbrandon_

LinkedIn: https://www.linkedin.com/in/brandon-woodard-a71270a9/

Contact: brandon_woodard@brown.edu

Selected Works

Portal-ble

Portal-ble is portable augmented reality in which you look through your phone as a portal into a virtual world. The virtual and real are merged into a single view, so you can interact with both virtual and real objects with your hands: sketching, manipulating, and throwing them.

We're not talking about gestures in the air, but rather your hands in the space where the virtual object would be.

Which coffee cup in the following image is real? Actually, both cups are virtual, so grab them both. The chair with the red seat is virtual, but the gray ones are real. Now pick up and move any of the chairs.


Increasing the Use of LiDAR for Forestry Applications

We are developing a model to identify more of the underlying structures captured by the Global Ecosystem Dynamics Investigation (GEDI) LiDAR instrument aboard the International Space Station.

Visualizing NASA Space Mission Data in Virtual Reality

Under the leadership of Dr. Robert Anderson, I developed a virtual reality platform that displays space exploration data from NASA's Jet Propulsion Laboratory.

The goal of this project is to streamline JPL's mission data onto a single platform that researchers and educators can access remotely, without needing to visit the Regional Planetary Image Facility (RPIF) located at JPL. Mission data available on the platform include digitized maps, spacecraft manuals, rare photos, and 360° views of the target planet, so users can learn about each mission from a first-person perspective.

The information is organized by a timeline of missions and their respective space bodies (planets, asteroids, interstellar objects, etc.).

Target platforms are Google Daydream/Cardboard and Oculus.

Watch the video here to see how it works: https://youtu.be/7y52UulWeIg

Low-Cost 3D-Printed Hand with Flex Sensors

Working under the supervision of Dr. Krzysztof Pietroszek and Dr. Christian Eckhardt, we developed a low-cost robotic hand with flex-sensor hardware as an alternative to the myoelectric sensors typically used for prosthetics.
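As a rough illustration of the approach (not the project's actual firmware), a flex sensor wired as a voltage divider can drive a servo-actuated finger with a few lines of Arduino-style C++; the pin assignments and calibration values below are placeholder assumptions.

```cpp
// Hypothetical sketch: one flex sensor controls one servo-driven finger.
// Pin numbers and calibration readings are illustrative assumptions.
#include <Servo.h>

const int FLEX_PIN = A0;   // flex sensor + fixed resistor as a voltage divider
Servo finger;              // servo pulling the finger's tendon

void setup() {
  finger.attach(9);        // servo signal wire on digital pin 9
}

void loop() {
  int raw = analogRead(FLEX_PIN);   // 0-1023 ADC reading from the divider
  // Map the straight-to-bent sensor range (calibrated per sensor) to 0-180 degrees.
  int angle = constrain(map(raw, 500, 850, 0, 180), 0, 180);
  finger.write(angle);
  delay(20);                        // ~50 Hz update rate
}
```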

This project was featured in CSUMB's University Magazine under 'A New Reality'. Read it here!

Wearable Technology in VR

Under the supervision of Dr. Krzysztof Pietroszek, my team and I developed a new way to interact in virtual reality. It uses an off-the-shelf smartwatch and is intended for mobile-based head-mounted displays (e.g., Samsung Gear VR, Google Daydream/Cardboard).

The work was later accepted to the ACM VRST conference, and I presented it at the Technical University of Munich in Munich, Germany.

The publication can be found here in the ACM Digital Library.