In this hand-tracking project, we use computer vision techniques with Python, OpenCV, and MediaPipe to detect and track hand gestures in real time. MediaPipe's pretrained hand-landmark model detects 21 landmarks per hand, from which we determine the hand's position, orientation, and movement.
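As a concrete sketch, the 21 landmarks follow MediaPipe's fixed indexing (0 = wrist, 4 = thumb tip, 8 = index-finger tip, and so on), so simple gesture heuristics can be written directly against the landmark coordinates. The helper names below (`is_finger_extended`, `count_extended_fingers`) are illustrative choices, not part of the project's code; the heuristic assumes normalized image coordinates where y grows downward:

```python
# MediaPipe's documented hand-landmark indices (a subset).
WRIST = 0
THUMB_TIP = 4
INDEX_TIP, INDEX_PIP = 8, 6
MIDDLE_TIP, MIDDLE_PIP = 12, 10
RING_TIP, RING_PIP = 16, 14
PINKY_TIP, PINKY_PIP = 20, 18

def is_finger_extended(landmarks, tip, pip):
    """landmarks: list of 21 (x, y) pairs in normalized image coordinates.
    A finger is treated as extended when its tip sits above its PIP joint
    (smaller y means higher in the image). Hypothetical helper for illustration."""
    return landmarks[tip][1] < landmarks[pip][1]

def count_extended_fingers(landmarks):
    """Counts extended fingers (thumb excluded, since it folds sideways
    and needs an x-axis test instead)."""
    pairs = [(INDEX_TIP, INDEX_PIP), (MIDDLE_TIP, MIDDLE_PIP),
             (RING_TIP, RING_PIP), (PINKY_TIP, PINKY_PIP)]
    return sum(is_finger_extended(landmarks, t, p) for t, p in pairs)
```

With this scheme, zero extended fingers suggests a fist and four suggests an open palm, which is one simple route to the gestures mentioned below.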
Using OpenCV and MediaPipe, we capture a real-time video feed from a camera and apply the model to identify and track hand gestures such as a fist, a thumbs-up, and an open palm. The project also includes a user interface that overlays the detected hand landmarks on the video, giving real-time feedback on the user's hand movements. Additionally, the application features a frames-per-second (FPS) counter, letting us monitor performance and tune the pipeline to stay real time.
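One way to implement such an FPS counter (a minimal sketch; the class name and smoothing factor are our own choices, not taken from the project) is to smooth the instantaneous inter-frame rate exponentially so the displayed number does not jitter:

```python
import time

class FPSCounter:
    """Tracks frames per second with exponential smoothing."""

    def __init__(self, smoothing=0.9):
        self.smoothing = smoothing
        self._last = None   # timestamp of the previous frame
        self._fps = 0.0     # smoothed estimate

    def tick(self, now=None):
        """Call once per frame; returns the current smoothed FPS.
        `now` may be supplied for testing; defaults to the monotonic clock."""
        now = time.perf_counter() if now is None else now
        if self._last is not None:
            dt = now - self._last
            if dt > 0:
                inst = 1.0 / dt
                # Seed with the first measurement, then blend.
                self._fps = (self.smoothing * self._fps +
                             (1 - self.smoothing) * inst) if self._fps else inst
        self._last = now
        return self._fps
```

Inside the capture loop, `fps = counter.tick()` runs once per frame and the value can be drawn onto the frame with `cv2.putText` before display.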
Thanks to its modular design, the hand-tracking pipeline can be customized and extended to support additional gestures and applications with minimal code changes. It can also be integrated into larger projects, including sign language recognition, virtual reality, and more.