A Socratic learning QuizBot using Streamlit and OpenAI API for PDF-based educational dialogues.
Created by Sean Harrington
Director of Technology Innovation
University of Oklahoma College of Law
https://law.ou.edu/faculty-and-staff/sean-harrington
- PDF processing with support for complex formatting
- Socratic dialogue generation using OpenAI GPT-4
- Student analytics and engagement tracking (1-3 grade scale based on interaction level)
- Instructor dashboard for content management
- Multi-user support with role-based access
- Conversation history and transcript generation
- Optional Ollama integration for local LLM support
- Python 3.8+
- PostgreSQL database
- OpenAI API key (or Ollama for local LLM support)
- Create a `.env` file in the root directory using `.env.example` as a template:

  ```bash
  cp .env.example .env
  ```
- Configure your environment variables in the `.env` file:

  ```bash
  # OpenAI Configuration (if not using Ollama)
  OPENAI_API_KEY=your_openai_api_key

  # PostgreSQL Database Configuration
  PGDATABASE=your_database_name
  PGUSER=your_database_user
  PGPASSWORD=your_database_password
  PGHOST=localhost
  PGPORT=5432
  ```
  ```bash
  # Set these only if using Ollama instead of OpenAI
  USE_OLLAMA=true
  OLLAMA_HOST=http://localhost:11434
  OLLAMA_MODEL=[insert name of model]
  ```
The current chat-completion calls are tailored to OpenAI. If you are using a local model (e.g., Llama 7B, Mistral 7B), you will need to adjust `services/openai_service.py` to conform to your model's requirements.
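The PostgreSQL variables above follow the standard libpq naming, so most Postgres clients pick them up automatically. As an illustration, a connection URL can be assembled from them like this (a minimal sketch; the helper name is ours, not part of QuizBot):

```python
import os

def postgres_url_from_env(env=os.environ):
    """Build a postgresql:// URL from the PG* variables set in .env.

    Falls back to the libpq defaults for host and port when unset.
    """
    return (
        f"postgresql://{env['PGUSER']}:{env['PGPASSWORD']}"
        f"@{env.get('PGHOST', 'localhost')}:{env.get('PGPORT', '5432')}"
        f"/{env['PGDATABASE']}"
    )

# Example with values matching the template above:
cfg = {
    "PGDATABASE": "quizbot",
    "PGUSER": "quizbot_user",
    "PGPASSWORD": "secret",
    "PGHOST": "localhost",
    "PGPORT": "5432",
}
print(postgres_url_from_env(cfg))
# postgresql://quizbot_user:secret@localhost:5432/quizbot
```

Note that a password containing special characters (`@`, `:`, `/`) would need URL-encoding before being embedded this way.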
- Clone the repository
- Install dependencies:

  ```bash
  pip install -r requirements.txt
  ```

- Set up the PostgreSQL database:

  ```sql
  CREATE DATABASE quizbot;
  CREATE USER quizbot_user WITH PASSWORD 'your_password';
  GRANT ALL PRIVILEGES ON DATABASE quizbot TO quizbot_user;
  ```

- Configure environment variables as described above
- Run the application:

  ```bash
  streamlit run main.py
  ```
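Before the first run it can help to confirm that PostgreSQL is actually listening. A quick stdlib-only check (our own helper, not part of the project):

```python
import socket

def port_open(host, port, timeout=2.0):
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# e.g. run port_open("localhost", 5432) before `streamlit run main.py`
```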
- Place your PDF materials in the `Readings` folder
- Start a new quiz to engage in Socratic dialogue
- View analytics and download conversation transcripts
Note: The application will process any PDF files placed in the Readings folder automatically.
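Conceptually, the auto-processing step amounts to scanning that folder for PDFs; a sketch of the idea (hypothetical helper, the real logic lives in the app's PDF service):

```python
from pathlib import Path

def find_readings(folder="Readings"):
    """Return sorted PDF filenames found in the given folder."""
    return sorted(p.name for p in Path(folder).glob("*.pdf"))
```

The folder name comes from the README; sorting just gives a deterministic processing order.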
QuizBot can be configured to use Ollama as an alternative to OpenAI for local LLM support. This is useful for:
- Running without internet connectivity
- Privacy-sensitive environments
- Cost-free operation
- Testing and development
- Install Ollama:

  ```bash
  # Linux
  curl -fsSL https://ollama.com/install.sh | sh
  ```

  For macOS and Windows, download the installer from https://ollama.com

- Pull the required model:

  ```bash
  ollama pull mistral:7b
  ```

- Install Python dependencies:

  ```bash
  pip install ollama langchain
  ```

- Configure QuizBot for Ollama:
  - Set `USE_OLLAMA=true` in your environment variables
  - No OpenAI API key is required when using Ollama
- Start the Ollama service:

  ```bash
  ollama serve
  ```

- Run QuizBot normally:

  ```bash
  streamlit run main.py
  ```
The application will automatically use the local Ollama model for:
- Generating questions
- Processing responses
- Creating summaries
- Managing dialogue flow
Note: When using Ollama, response times may vary depending on your hardware. GPU support is recommended for optimal performance.
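The switch between backends is driven entirely by the environment. A sketch of that selection logic (the variable names mirror the `.env` settings above, but the function itself is illustrative, not QuizBot's actual code):

```python
import os

def select_backend(env=os.environ):
    """Pick the chat backend based on USE_OLLAMA, with README defaults."""
    if env.get("USE_OLLAMA", "").lower() == "true":
        return {
            "backend": "ollama",
            "host": env.get("OLLAMA_HOST", "http://localhost:11434"),
            "model": env.get("OLLAMA_MODEL", "mistral:7b"),
        }
    # Default path: OpenAI, which requires an API key
    return {"backend": "openai", "api_key": env["OPENAI_API_KEY"]}
```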
- Database Connection:
  - Verify PostgreSQL is running
  - Check the database credentials in `.env`
  - Ensure the database exists and the user has the proper permissions
- OpenAI API:
  - Verify the API key is valid
  - Check for API rate limits
  - Ensure internet connectivity
- Ollama Integration:
  - Verify the Ollama service is running
  - Check that the model is downloaded
  - Confirm the `USE_OLLAMA` setting
- "Database connection failed": Check PostgreSQL configuration
- "OpenAI API error": Verify API key and rate limits
- "Ollama service not found": Ensure Ollama is running
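Many of these errors trace back to missing environment variables, so a first diagnostic can be as simple as the following (an illustrative helper built from the settings listed above, not a QuizBot command):

```python
import os

# Variable names taken from the configuration section of this README
REQUIRED_PG = ("PGDATABASE", "PGUSER", "PGPASSWORD", "PGHOST", "PGPORT")

def missing_settings(env=os.environ):
    """List the settings the current configuration still needs."""
    missing = [v for v in REQUIRED_PG if not env.get(v)]
    if env.get("USE_OLLAMA", "").lower() == "true":
        missing += [v for v in ("OLLAMA_HOST", "OLLAMA_MODEL") if not env.get(v)]
    elif not env.get("OPENAI_API_KEY"):
        missing.append("OPENAI_API_KEY")
    return missing
```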
For additional support, please open an issue on GitHub.