Players and Football Tracking Using YOLOv8
Platform: Google Colab
Machine vision plays a pivotal role in automating tasks and enhancing accuracy across industries. Leveraging YOLOv8, I developed a robust system for tracking players and the football in real time. This state-of-the-art object detection model delivers precise, efficient tracking, well suited to sports analytics and game-strategy insights. The project demonstrates how machine vision can transform the way we analyze and interact with dynamic environments.
Figure 1: YOLOv8 applied to video
Figure 2: Side-by-side accuracy comparison
Figure 3: Sample frames from the training phase
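At the heart of any tracking-by-detection system like this one is associating the current frame's detections with existing tracks. The sketch below shows a minimal greedy IoU association step in pure NumPy; the box coordinates, track IDs, and threshold are illustrative, not taken from the project:

```python
import numpy as np

def iou(a, b):
    # Boxes as [x1, y1, x2, y2]; returns intersection-over-union in [0, 1]
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter + 1e-9)

def associate(tracks, detections, thresh=0.3):
    """Greedily match each track to its best-overlapping unused detection."""
    matches, used = {}, set()
    for tid, tbox in tracks.items():
        best, best_iou = None, thresh
        for j, dbox in enumerate(detections):
            if j in used:
                continue
            score = iou(tbox, dbox)
            if score > best_iou:
                best, best_iou = j, score
        if best is not None:
            matches[tid] = best
            used.add(best)
    return matches

# Illustrative boxes: track 1 (a player) overlaps detection 0;
# track 2 (say, the ball) has no detection this frame.
tracks = {1: [100, 100, 150, 200], 2: [400, 50, 420, 70]}
dets = [[105, 102, 152, 205], [600, 300, 640, 360]]
print(associate(tracks, dets))  # {1: 0}
```

In practice YOLOv8's built-in trackers handle this association internally; the sketch only makes the underlying step explicit.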
Real-Time Face Mesh
Platform: Google Colab
Overview: A real-time face-tracking demo that overlays a dense 468-point face mesh and key facial contours on live webcam video.
Stack: Python · OpenCV · MediaPipe Face Mesh
What it does:
Detects a face and predicts 468 3D landmarks per frame; draws the triangulated mesh and feature outlines (eyes, nose, lips).
Provides stable tracking suitable for head-pose cues, AR overlays, and expression features.
Pipeline: webcam → RGB frame → Face Mesh model → landmarks → smoothing & drawing → display.
Notes: Runs in real time on CPU; works in typical indoor lighting.
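The smoothing stage in the pipeline above can be sketched as an exponential moving average over the landmark array. This is a minimal illustration; the `alpha` value and the tiny 3-point array are placeholders, not the project's actual parameters (a real Face Mesh frame has 468 points):

```python
import numpy as np

class LandmarkSmoother:
    """Exponential moving average over an (N, 3) landmark array.

    Damps per-frame jitter before the mesh is drawn. alpha near 1
    follows the raw landmarks closely; smaller alpha smooths harder.
    """
    def __init__(self, alpha=0.5):
        self.alpha = alpha
        self.state = None

    def update(self, landmarks):
        pts = np.asarray(landmarks, dtype=float)
        if self.state is None:
            self.state = pts  # first frame: no history to blend with
        else:
            self.state = self.alpha * pts + (1 - self.alpha) * self.state
        return self.state

# Illustrative 3-landmark example
sm = LandmarkSmoother(alpha=0.5)
sm.update([[0, 0, 0], [1, 1, 0], [2, 0, 1]])
out = sm.update([[2, 0, 0], [1, 1, 0], [2, 0, 1]])
print(out[0])  # first landmark blended halfway: [1. 0. 0.]
```

Feeding each frame's landmarks through `update` before drawing keeps the overlay stable when the raw predictions jitter slightly between frames.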
Hand Gesture Recognition (Right-Left-(0-5))
Platform: Google Colab
Overview: A webcam-based hand-tracking system that recognizes digit gestures (0–5) and hand side (left/right) using 21 hand landmarks per hand.
Stack: Python · OpenCV · MediaPipe Hands · NumPy
What it does:
Tracks one or two hands; labels each as left or right and draws the landmarks and hand skeleton.
Rule-based classifier (finger-state from joint angles) maps poses to numbers and overlays the result on the frame.
Useful for touch-free UI, quick input for HRI/robot control, or prototyping sign-like interactions.
Pipeline: webcam → RGB frame → Hands model → 21 landmarks/hand → finger state logic → class label → display.
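The finger-state logic can be sketched as follows. This simplified version compares each fingertip to the joint below it in image coordinates (the project's classifier uses joint angles, which is more robust to rotated hands); the landmark indices are the standard MediaPipe Hands ones, but the sample coordinates are synthetic:

```python
import numpy as np

# MediaPipe Hands landmark indices: (fingertip, joint below it)
FINGERS = [(8, 6), (12, 10), (16, 14), (20, 18)]  # index..pinky: tip vs PIP
THUMB = (4, 3)                                    # thumb: tip vs IP

def count_fingers(landmarks, handedness="Right"):
    """Count raised fingers from 21 (x, y) landmarks in image coordinates.

    A finger is 'up' when its tip is above its PIP joint (smaller y in
    image coordinates); the thumb is compared on x, flipped by handedness.
    """
    pts = np.asarray(landmarks, dtype=float)
    count = sum(pts[tip][1] < pts[pip][1] for tip, pip in FINGERS)
    tip, ip = THUMB
    if handedness == "Right":
        count += pts[tip][0] < pts[ip][0]
    else:
        count += pts[tip][0] > pts[ip][0]
    return int(count)

# Synthetic example: all 21 landmarks at one point, then raise the index
pts = [[0.5, 0.5] for _ in range(21)]
pts[8] = [0.5, 0.3]  # index fingertip above its PIP joint -> "up"
print(count_fingers(pts))  # 1
```

The returned count is what gets overlaid on the frame next to the left/right label.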