This web-based application evaluates scanned subjective answer sheets by comparing student responses with model answers using Optical Character Recognition (OCR) and Natural Language Processing (NLP) techniques. It supports PDF file uploads for both student and model answer sheets.
## Features

- Upload PDF Answer Sheets: Upload scanned PDF files for both the student answers and the model answers.
- OCR Text Extraction: Extracts text from the page images in each PDF using Tesseract OCR (see the extraction sketch below).
- Text Comparison: Compares the extracted student text with the model answer text using NLP-based similarity.
- Accuracy Score: Calculates an accuracy score for each answer based on that comparison (see the scoring sketch below).
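
The OCR step works roughly as follows. The sketch below is illustrative rather than the exact code in `app.py`: it assumes the `pdf2image` and `pytesseract` packages with Tesseract (and Poppler) installed locally, and the helper name `extract_text_from_pdf` is ours, not the repository's.

```python
# Illustrative sketch: OCR a scanned answer-sheet PDF with Tesseract.
# Assumes pdf2image and pytesseract are installed and that the Tesseract
# binary and Poppler are available on the system PATH.
from pdf2image import convert_from_path
import pytesseract


def extract_text_from_pdf(pdf_path: str) -> str:
    """Render each PDF page to an image, OCR it, and return the combined text."""
    pages = convert_from_path(pdf_path)  # one PIL image per page
    return "\n".join(pytesseract.image_to_string(page) for page in pages)
```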
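The comparison and scoring step can be approximated with a standard text-similarity measure. The sketch below uses TF-IDF cosine similarity from scikit-learn purely as an illustration; the metric actually used by the app may differ.

```python
# Illustrative sketch: score a student answer against the model answer with
# cosine similarity over TF-IDF vectors. The real app may use a different
# NLP similarity measure.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity


def accuracy_percent(student_answer: str, model_answer: str) -> float:
    """Return a 0-100 similarity score between the two answer texts."""
    vectors = TfidfVectorizer().fit_transform([student_answer, model_answer])
    similarity = cosine_similarity(vectors[0], vectors[1])[0][0]
    return round(float(similarity) * 100, 2)
```

With this measure, identical answers score 100 and unrelated answers score close to 0.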
## Getting Started

To run this project locally, follow the steps below.

### Prerequisites

- Python 3.x
- Tesseract OCR
- Required Python libraries (listed in `requirements.txt`)
### Installation

- Clone the repository:

  ```bash
  git clone https://github.com/Praveen-koujalagi/Subjective-Answer-Evaluation-System.git
  cd Subjective-Answer-Evaluation-System
  ```
- Create a virtual environment:

  ```bash
  python -m venv env
  ```
- Activate the virtual environment:
  - On Windows:

    ```bash
    .\env\Scripts\activate
    ```

  - On macOS/Linux:

    ```bash
    source env/bin/activate
    ```
- Install the dependencies from `requirements.txt`:

  ```bash
  pip install -r requirements.txt
  ```
## Usage

To run the application, use:

```bash
streamlit run app.py
```
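
Streamlit serves the app locally (by default at http://localhost:8501); open that URL in a browser and upload the student and model answer PDFs to get a score. A minimal sketch of how such an app could be wired together is shown below; it reuses the same illustrative helpers and libraries as the earlier sketches and is not the actual contents of `app.py`.

```python
# Illustrative sketch of a minimal Streamlit front end; not the repository's app.py.
import streamlit as st
import pytesseract
from pdf2image import convert_from_bytes
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity


def ocr_uploaded_pdf(uploaded_file) -> str:
    """OCR every page of an uploaded PDF and return the combined text."""
    pages = convert_from_bytes(uploaded_file.read())
    return "\n".join(pytesseract.image_to_string(page) for page in pages)


st.title("Subjective Answer Evaluation System")
student_pdf = st.file_uploader("Student answer sheet (PDF)", type=["pdf"])
model_pdf = st.file_uploader("Model answer sheet (PDF)", type=["pdf"])

if student_pdf and model_pdf:
    student_text = ocr_uploaded_pdf(student_pdf)
    model_text = ocr_uploaded_pdf(model_pdf)
    vectors = TfidfVectorizer().fit_transform([student_text, model_text])
    score = float(cosine_similarity(vectors[0], vectors[1])[0][0]) * 100
    st.metric("Accuracy score", f"{score:.2f}%")
```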
## Contributors

- Praveen Koujalagi
- S Sarvesh Balaji
- Sujit G