Greetings! I am in the post-bachelor teacher education program at EMU. I have a bachelor of science degree from MSU in turfgrass science and worked in the golf industry prior to beginning work on my teacher certification. My certification will be in Integrated Science. I look forward to learning from other science educators involved in the research project and strengthening my teaching skills. During the summer I tend to play a lot of tennis, and I enjoy waterskiing and wakeboarding when I get the chance.
I grew up on a farm in Standish, Michigan. I have three amazing children named Reagan, Bearick, and Emerson. The love of my life is Melissa, and we live in Bay City raising our family. I went to SVSU for my Bachelor of Science in Education and have a Master of Natural Science from SVSU as well. I teach at John Glenn High School in Bay City and have been a teacher for 9 years. During college I worked for Dow Automotive in Midland for 3 years, doing automotive R&D; my group designed and tested various polymers for automotive applications. I enjoy spending time with my family, fishing, coaching, and playing basketball.
Many robots are teleoperated (i.e., operated from a distance) using video cameras that provide visual feedback to the human user, as in bomb disposal, aerial reconnaissance, search and rescue, nuclear environments, underwater and space operations, and minimally invasive surgery. In many of these applications, hand-eye misalignment between the joystick and the video feedback makes control unintuitive: joystick input does not always align with what the user sees on screen. This is especially challenging when the user repeatedly switches between camera views.

For a given camera location near or on the robot, there is a corresponding "perfectly aligned" angle at which the video image should be displayed to the human. For example, the user should have to look down to view an image taken looking down on the robot, or look right to see the robot from its left side. However, such alignment is not feasible when the camera moves (the video display would have to move with it) or when the user interface is meant to be portable (such as a 2D laptop screen).

The goal of this project is to simplify teleoperation control by simulating proper alignment between joystick input and video feedback, by graphically rotating the video images. Suppose the user interface is a large screen showing two video images. When the images are positioned in a flat, side-by-side arrangement, they are misaligned with the cameras' viewpoints; but by graphically distorting the images, we can simulate screen angle (think of iTunes Cover Flow, for example). Simply distorting the image to appear angled may (or may not) make the control more intuitive.
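The Cover Flow-style distortion described above amounts to a perspective (projective) warp of the image. As a minimal sketch of the idea, the snippet below computes the 3x3 homography for a flat image plane rotated by an angle about the vertical axis and viewed by a pinhole camera; the focal length `f`, viewing distance `d`, and the function names are illustrative assumptions, not part of the original project.

```python
import numpy as np

def screen_angle_homography(theta, f=1.0, d=3.0):
    """Homography mapping a point (x, y) on a flat screen to its image
    after the screen is rotated by `theta` about the vertical (y) axis
    and viewed by a pinhole camera at distance `d` with focal length `f`.
    (f and d are illustrative values, not from the original project.)
    Derivation: (x, y, 0) rotates to (x*cos(t), y, x*sin(t)); projecting
    gives u = f*x*cos(t)/(x*sin(t)+d), v = f*y/(x*sin(t)+d)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([
        [f * c, 0.0, 0.0],
        [0.0,   f,   0.0],
        [s,     0.0, d  ],
    ])

def warp_point(H, x, y):
    """Apply homography H to a 2D point in homogeneous coordinates."""
    p = H @ np.array([x, y, 1.0])
    return p[0] / p[2], p[1] / p[2]

# Corners of a unit square image, "rotated" 30 degrees about its
# vertical axis.  The edge swung toward the viewer projects taller
# than the edge swung away, producing the familiar trapezoid look.
H = screen_angle_homography(np.radians(30))
corners = [(-1, -1), (1, -1), (1, 1), (-1, 1)]
warped = [warp_point(H, x, y) for x, y in corners]
```

In a real interface, the same homography would be handed to a GPU texture transform or an image-warping routine (e.g. OpenCV's `cv2.warpPerspective`) rather than applied point by point; whether this simulated angle actually improves control is exactly what the project proposes to evaluate.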