Introduction
The project aims to create an immersive and interactive gaming experience by using Python, OpenCV, MediaPipe, and PyAutoGUI to translate camera-captured body motion into control inputs for the game Subway Surfers. OpenCV handles webcam capture and image processing, while MediaPipe performs real-time pose estimation and hand-landmark detection to track the player's movements. PyAutoGUI then converts the detected movements into simulated keystrokes, so the player controls the in-game character with natural, intuitive gestures instead of a keyboard. The project demonstrates how integrating these technologies can create a more engaging and interactive gaming experience, making it more fun and motivating for people to stay active. A minimal sketch of this pipeline appears below.
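The sketch below illustrates one way the camera-to-keyboard pipeline could be wired together. It assumes the game is running in a focused window that responds to the arrow keys (for example, a browser or emulator version of Subway Surfers); the landmark choices, movement thresholds, and key bindings are illustrative assumptions, not the project's exact implementation.

```python
# Minimal sketch: webcam pose tracking mapped to arrow-key presses.
# Thresholds, landmarks, and key bindings below are assumptions for illustration.
import cv2
import mediapipe as mp
import pyautogui

mp_pose = mp.solutions.pose
pose = mp_pose.Pose(min_detection_confidence=0.5, min_tracking_confidence=0.5)

cap = cv2.VideoCapture(0)          # default webcam
last_zone = "center"               # last horizontal zone, to avoid repeated presses

while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    frame = cv2.flip(frame, 1)     # mirror the image so movement feels natural
    results = pose.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))

    if results.pose_landmarks:
        lm = results.pose_landmarks.landmark
        # The shoulder midpoint gives a stable estimate of where the body is.
        x = (lm[mp_pose.PoseLandmark.LEFT_SHOULDER].x +
             lm[mp_pose.PoseLandmark.RIGHT_SHOULDER].x) / 2
        y = lm[mp_pose.PoseLandmark.NOSE].y

        # Horizontal body position -> lane change (assumed 0.4 / 0.6 thresholds).
        zone = "left" if x < 0.4 else "right" if x > 0.6 else "center"
        if zone != last_zone and zone != "center":
            pyautogui.press(zone)  # sends the 'left' or 'right' arrow key
        last_zone = zone

        # Vertical head position -> jump or roll (assumed thresholds).
        if y < 0.3:
            pyautogui.press("up")    # jump
        elif y > 0.7:
            pyautogui.press("down")  # roll

    cv2.imshow("Motion controller", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):  # press 'q' to quit
        break

cap.release()
cv2.destroyAllWindows()
```

In practice, the thresholds would need tuning to the player's distance from the camera, and the key presses may need debouncing so a single movement does not trigger multiple inputs.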
Driving Question
How can we use Python and OpenCV to make computer games involve more physical movement?