Focus: Stereo Computer Vision with Neck.
Code Optimization: Tighter code, with no duplicated functionality.
Parent Transforms: Use Unity's transform parenting to create a rigCam for each eye.
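A minimal sketch of that parenting setup. The names (neck, rigCamL, rigCamR) and the eye spacing are assumptions, not the project's actual hierarchy:

```csharp
using UnityEngine;

// Sketch: each eye camera sits under its own rigCam transform, and both
// rigCams are children of the neck, so turning the neck moves both eyes
// together. Names and spacing are placeholders.
public class EyeRigSetup : MonoBehaviour
{
    public Transform neck;           // the neck pivot the head rotates around
    public Camera leftEyeCam;        // existing eye cameras in the scene
    public Camera rightEyeCam;
    public float eyeSpacing = 0.06f; // assumed interocular distance in meters

    void Awake()
    {
        // Create a rigCam parent for each eye and attach it to the neck.
        Transform rigCamL = new GameObject("rigCamL").transform;
        Transform rigCamR = new GameObject("rigCamR").transform;
        rigCamL.SetParent(neck, false);
        rigCamR.SetParent(neck, false);

        // Offset the rigs left/right of the neck's local origin.
        rigCamL.localPosition = new Vector3(-eyeSpacing / 2f, 0f, 0f);
        rigCamR.localPosition = new Vector3(+eyeSpacing / 2f, 0f, 0f);

        // Re-parent each camera under its rig so eye rotation stays local
        // to the rig while the rig inherits the neck's rotation.
        leftEyeCam.transform.SetParent(rigCamL, false);
        rightEyeCam.transform.SetParent(rigCamR, false);
    }
}
```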
Turn Off Feedback: Use Vector3 values instead of GameObjects, for example for the ul, lr, and ll screen-corner points.
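A hedged sketch of that change: the corners are computed as plain Vector3 values from the picture plane's transform instead of being marker GameObjects in the scene. The plane's half-extents and the field names are assumptions:

```csharp
using UnityEngine;

// Sketch: derive the upper-left (ul), lower-right (lr), and lower-left (ll)
// corner points as Vector3 values from the picture plane's transform, so no
// extra GameObjects are needed. Sizes are assumed.
public class PicturePlaneCorners : MonoBehaviour
{
    public Transform picturePlane;  // the plane each eye reads from
    public float halfWidth = 0.5f;  // assumed half-extents of the plane
    public float halfHeight = 0.5f;

    public Vector3 ul { get; private set; }
    public Vector3 lr { get; private set; }
    public Vector3 ll { get; private set; }

    void Update()
    {
        // TransformPoint converts plane-local corner offsets to world space,
        // so the corners follow the plane automatically.
        ul = picturePlane.TransformPoint(new Vector3(-halfWidth, +halfHeight, 0f));
        lr = picturePlane.TransformPoint(new Vector3(+halfWidth, -halfHeight, 0f));
        ll = picturePlane.TransformPoint(new Vector3(-halfWidth, -halfHeight, 0f));
    }
}
```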
Feature: When an eye loses tracking, it stays where it is (holds its last orientation).
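One way that hold-in-place behavior could look, assuming a targetFound flag and a targetDirection supplied by the per-eye vision step (both hypothetical names):

```csharp
using UnityEngine;

// Sketch: an eye only writes a new rotation while the target is found; when
// tracking is lost, nothing is written, so the eye simply stays where it is.
public class EyeAim : MonoBehaviour
{
    public bool targetFound;        // set by the vision step (assumed)
    public Vector3 targetDirection; // world-space direction to the ball (assumed)

    void LateUpdate()
    {
        if (!targetFound)
            return; // no update: the eye keeps its last orientation

        transform.rotation = Quaternion.LookRotation(targetDirection, Vector3.up);
    }
}
```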
Physics: Turn head to follow ball.
Game: The bot sees the green ball and follows it around, always facing it.
Navigation: Pitch and yaw are computed from the best pixel location relative to the screen center.
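A sketch of that pixel-to-angle mapping. The image dimensions, the degrees-per-pixel gain, and the source of the best pixel are assumptions; the real code's conventions may differ:

```csharp
using UnityEngine;

// Sketch: convert the best-match pixel location into yaw and pitch errors
// relative to the image center, then nudge the neck by those errors.
public class NeckTracker : MonoBehaviour
{
    public Transform neck;
    public int imageWidth = 256;          // assumed capture size
    public int imageHeight = 256;
    public float degreesPerPixel = 0.1f;  // assumed gain from pixel error to angle

    // Called with the best-scoring pixel found by the color search (assumed).
    public void Steer(Vector2 bestPixel)
    {
        // Signed offsets from the image center, in pixels.
        float dx = bestPixel.x - imageWidth * 0.5f;   // + means ball is to the right
        float dy = bestPixel.y - imageHeight * 0.5f;  // + means ball is above center

        // Yaw turns the neck left/right, pitch tilts it up/down; roll stays 0.
        float yaw   =  dx * degreesPerPixel;
        float pitch = -dy * degreesPerPixel;

        neck.localRotation = neck.localRotation * Quaternion.Euler(pitch, yaw, 0f);
    }
}
```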
Distance: Distance estimation is not included in this POC.
Finding Target: The head color is grey when the ball is not found.
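A tiny sketch of that color feedback, assuming the head has a Renderer and the specific colors shown here:

```csharp
using UnityEngine;

// Sketch: tint the head dark grey while searching and restore its normal
// color once the ball is found. The Renderer access and colors are assumed.
public class HeadStatusColor : MonoBehaviour
{
    public Renderer headRenderer;
    public Color searchingColor = new Color(0.3f, 0.3f, 0.3f); // dark grey
    public Color foundColor = Color.white;

    public void SetFound(bool found)
    {
        headRenderer.material.color = found ? foundColor : searchingColor;
    }
}
```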
ynManual: A yes/no toggle for hiding the ball under the head.
Smoothing: The head responds slowly and smoothly.
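One common way to get that slow, smooth response is to Slerp toward the desired rotation a little each frame rather than snapping to it. This sketch assumes the desired neck rotation is computed elsewhere and that the smoothing speed shown is only a placeholder:

```csharp
using UnityEngine;

// Sketch: the neck eases toward the desired rotation each frame, giving the
// slow, smooth response. desiredRotation is set by the tracker (assumed).
public class NeckSmoothing : MonoBehaviour
{
    public Transform neck;
    public Quaternion desiredRotation = Quaternion.identity;
    public float smoothSpeed = 2f; // higher = snappier, lower = lazier

    void Update()
    {
        neck.localRotation = Quaternion.Slerp(
            neck.localRotation, desiredRotation, smoothSpeed * Time.deltaTime);
    }
}
```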
Unity Layers: A layer named "picturePlane" needs to be added to the project. This can be done programmatically, but it is simpler to add it by hand in the editor's Tags and Layers settings.
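Since the layer must already exist in the project, a safe pattern is to look it up by name at startup and warn if it is missing. This sketch (not the project's actual code) also shows one typical use, restricting an eye camera to that layer:

```csharp
using UnityEngine;

// Sketch: look up the "picturePlane" layer by name and restrict an eye
// camera to it. If the layer was never added, NameToLayer returns -1,
// so we warn instead of silently misrendering.
public class PicturePlaneLayer : MonoBehaviour
{
    public Camera eyeCam;

    void Start()
    {
        int layer = LayerMask.NameToLayer("picturePlane");
        if (layer < 0)
        {
            Debug.LogWarning("Add a layer named 'picturePlane' in Project Settings > Tags and Layers.");
            return;
        }

        gameObject.layer = layer;        // put the plane itself on the layer
        eyeCam.cullingMask = 1 << layer; // the eye camera sees only that layer
    }
}
```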
Closer look.
Searching (dark grey), then finding the green ball.
Testing eye movement.
Blue ball is the estimated location. This estimation feature is for a future POC.
Manual test of neck turning pitch and yaw, with roll zeroed out.
Full tracking (when the target is in view).
Use the GitHub link above for the most recent code; the code snapshot above is close to current but may lag slightly behind.