Engagement Analysis with Head Pose Estimation is a computer vision project that uses the MediaPipe library for facial landmark detection, OpenCV for computer vision tasks, and NumPy/Pandas for data manipulation. It estimates head pose and gaze direction to determine whether the user is engaged.
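As a rough illustration of the core pipeline, the sketch below combines MediaPipe Face Mesh landmarks with OpenCV's `solvePnP` to recover head pose angles. The landmark indices, 3D model points, and camera intrinsics here are illustrative assumptions, not necessarily what `eye_nose_detector.py` does.

```python
# Minimal head-pose sketch: Face Mesh gives 2D landmarks, solvePnP fits them to a
# generic 3D face model, and the recovered rotation says where the head points.
# Landmark indices and model points are illustrative, not taken from the repo.
import cv2
import numpy as np
import mediapipe as mp

face_mesh = mp.solutions.face_mesh.FaceMesh(refine_landmarks=True)

# Generic 3D reference points (nose tip, chin, eye corners, mouth corners), in mm.
MODEL_POINTS = np.array([
    [0.0, 0.0, 0.0],        # nose tip
    [0.0, -63.6, -12.5],    # chin
    [-43.3, 32.7, -26.0],   # left eye outer corner
    [43.3, 32.7, -26.0],    # right eye outer corner
    [-28.9, -28.9, -24.1],  # left mouth corner
    [28.9, -28.9, -24.1],   # right mouth corner
], dtype=np.float64)
LANDMARK_IDS = [1, 152, 263, 33, 287, 57]  # matching Face Mesh indices

def head_pose_angles(frame_bgr):
    """Return (pitch, yaw, roll) in degrees, or None if no face is found."""
    h, w = frame_bgr.shape[:2]
    results = face_mesh.process(cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2RGB))
    if not results.multi_face_landmarks:
        return None
    lm = results.multi_face_landmarks[0].landmark
    image_points = np.array([[lm[i].x * w, lm[i].y * h] for i in LANDMARK_IDS],
                            dtype=np.float64)
    # Approximate camera intrinsics from the frame size (no lens calibration).
    camera_matrix = np.array([[w, 0, w / 2],
                              [0, w, h / 2],
                              [0, 0, 1]], dtype=np.float64)
    ok, rvec, _ = cv2.solvePnP(MODEL_POINTS, image_points, camera_matrix,
                               np.zeros((4, 1)))
    if not ok:
        return None
    rmat, _ = cv2.Rodrigues(rvec)
    angles, *_ = cv2.RQDecomp3x3(rmat)
    return angles  # (pitch, yaw, roll) in degrees
```

Calling `head_pose_angles(frame)` on each webcam frame yields angles that engagement logic (sketched after the feature list below) can consume.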
- Head pose estimation.
- Gaze direction analysis.
- Engagement analysis based on head pose and gaze direction (see the sketch after this list).
- Real-time visualization of facial landmarks and analysis results.
- Tracking of continuous eye contact with the camera to measure attention span.
- Analysis of variations in gaze direction to understand shifts in focus.
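A hypothetical sketch of how the pose angles from the snippet above could drive both the engagement flag and the eye-contact timer; the `YAW_LIMIT`/`PITCH_LIMIT` thresholds and the `EngagementTracker` name are illustrative, not taken from the script.

```python
# Illustrative engagement logic: the head counts as "engaged" while it stays within
# a small angular cone around the camera, and the longest continuous stretch of
# eye contact is recorded as a simple attention-span measure.
import time

YAW_LIMIT = 20.0    # degrees of left/right turn still counted as engaged (assumed)
PITCH_LIMIT = 15.0  # degrees of up/down tilt still counted as engaged (assumed)

class EngagementTracker:
    def __init__(self):
        self.contact_start = None   # when the current stretch of eye contact began
        self.longest_contact = 0.0  # longest continuous eye contact, in seconds

    def update(self, pitch, yaw):
        engaged = abs(yaw) <= YAW_LIMIT and abs(pitch) <= PITCH_LIMIT
        now = time.monotonic()
        if engaged:
            if self.contact_start is None:
                self.contact_start = now
            self.longest_contact = max(self.longest_contact, now - self.contact_start)
        else:
            self.contact_start = None  # eye contact broken, reset the stretch
        return engaged
```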
Adjustable parameters are available in the script for customization (a sketch of how they might appear follows this list):

- `draw_gaze`: Set to `True` to display gaze vectors.
- `draw_full_axis`: Set to `True` to display the full rotation axis.
- `draw_headpose`: Set to `True` to display head pose vectors.
- `x_score_multiplier` and `y_score_multiplier`: Multipliers to adjust the impact of gaze on head pose estimation.
- `threshold`: Threshold for averaging gaze scores between frames.
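The parameter names come from the list above; the values below are placeholders showing how they might be declared near the top of the script, not the script's actual defaults.

```python
# Tunable parameters (placeholder values; adjust to taste).
draw_gaze = True           # overlay gaze vectors on the video feed
draw_full_axis = True      # draw all three rotation axes instead of just one
draw_headpose = True       # overlay head pose vectors
x_score_multiplier = 4     # weight of horizontal gaze in the pose estimate
y_score_multiplier = 4     # weight of vertical gaze in the pose estimate
threshold = 0.3            # limit for averaging gaze scores between frames
```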
- Clone the repository:

  ```bash
  git clone https://github.com/Manoj-2702/EyeTracking.git
  ```
- Install all the required dependencies:

  - OpenCV
  - MediaPipe
  - NumPy
  - Pandas

  ```bash
  pip install -r requirements.txt
  ```
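  For reference, a minimal `requirements.txt` covering these dependencies could look like the following; the repository's actual file may pin specific versions.

  ```text
  opencv-python
  mediapipe
  numpy
  pandas
  ```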
- Usage:

  ```bash
  python eye_nose_detector.py
  ```
The script provides real-time visual feedback on engagement and gaze direction, and it prints a summary of engagement statistics and percentages after execution.
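Those percentages could be produced along the lines of the following Pandas sketch, assuming a per-frame log of engagement flags; the column name and sample values are illustrative.

```python
# Summarise per-frame engagement flags into an overall percentage.
import pandas as pd

# One row per processed frame; in the real script this log is built during the run.
log = pd.DataFrame({"engaged": [True, True, False, True, False, True]})

engaged_pct = 100.0 * log["engaged"].mean()
print(f"Engaged for {engaged_pct:.1f}% of frames "
      f"({int(log['engaged'].sum())} of {len(log)})")
```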
Demo recording: `Engagement.Analysis.2023-11-12.20-22-11.mp4`
This project is licensed under the MIT License - see the LICENSE file for details.