Welcome to the Lachesis Chatbot Development Project, part of the larger Redback Senior Project (Lachesis) under Redback Operations, the capstone project company within the School of Information Technology at Deakin University. This project is dedicated to harnessing the potential of wearable technology to significantly enhance the quality of life for the elderly. By combining advanced data analytics, innovative web platforms, and sophisticated mobile app development tools, the Redback Senior Project aims to create a comprehensive support system for seniors.
This repository specifically focuses on the development of an AI-powered chatbot that complements the wearable technology, providing personalised interactions across medical and personal health domains.
The Lachesis Chatbot is designed to assist elderly users by providing them with reliable information and support through natural language interactions. The chatbot leverages advanced machine learning models and a retrieval-augmented generation (RAG) architecture to deliver accurate and contextually relevant responses.
- LLM Integration: Integrates large language models (LLMs) for dynamic, personalised, and context-aware conversations.
- Vector Embedding: Utilises `SentenceTransformer` for generating meaningful semantic embeddings, ensuring accurate text representation.
- RAG Architecture: Combines document retrieval with generation models for contextually relevant, real-time responses.
- Data Processing Pipeline: Includes scripts for converting PDFs to Markdown, splitting text, generating embeddings, and uploading data to Qdrant.
- User-Friendly Interface: Provides an easy-to-use interface for users through a Streamlit-based frontend.
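Conceptually, the RAG flow above works in three steps: embed the user's query, score it against the stored document embeddings to retrieve the closest matches, and pass those matches to the LLM as context. The sketch below illustrates that retrieval-then-prompt flow with a toy bag-of-words embedding standing in for SentenceTransformer and Qdrant; all names here are illustrative, not the repository's actual API.

```python
import math
from collections import Counter

def embed(text: str) -> Counter:
    # Toy bag-of-words "embedding"; the real pipeline uses dense
    # SentenceTransformer vectors stored in Qdrant.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    # Cosine similarity between two sparse term-count vectors.
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, docs: list[str], k: int = 2) -> list[str]:
    # Rank stored documents by similarity to the query; keep the top k.
    q = embed(query)
    ranked = sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]

def build_prompt(query: str, docs: list[str]) -> str:
    # Assemble the retrieved context and the question into one LLM prompt.
    context = "\n".join(f"- {d}" for d in retrieve(query, docs))
    return f"Answer using this context:\n{context}\n\nQuestion: {query}"
```

In the real system the retrieved passages come from Qdrant and the prompt is sent to the Groq-hosted LLM, but the ranking-then-prompting shape is the same.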
```
project/
│
├── data_processing/
│   ├── __init__.py
│   ├── pdf_to_markdown.py
│   ├── text_splitter.py
│   └── utils.py
│
├── embedding/
│   ├── __init__.py
│   └── sentence_encoder.py
│
├── vector_db/
│   ├── __init__.py
│   └── qdrant_client.py
│
├── chat_model/
│   ├── __init__.py
│   └── chat_groq.py
│
├── run_pipeline.py      # Script to test the processing pipeline
├── chatbot_ui.py        # Streamlit application for the chatbot UI
├── chatbot_api.py       # FastAPI server for the chatbot
└── requirements.txt     # Project dependencies
```
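To illustrate the text-splitting stage of the pipeline: a common approach is to cut documents into fixed-size windows with some overlap, so a sentence that straddles a boundary still appears whole in at least one chunk. The function below is a minimal sketch of that idea; its name and parameters are assumptions, not the actual `text_splitter.py` API.

```python
def split_text(text: str, chunk_size: int = 500, overlap: int = 50) -> list[str]:
    # Slice the text into fixed-size character windows. Each window starts
    # (chunk_size - overlap) characters after the previous one, so adjacent
    # chunks share `overlap` characters of context.
    if overlap >= chunk_size:
        raise ValueError("overlap must be smaller than chunk_size")
    chunks = []
    start = 0
    while start < len(text):
        chunks.append(text[start:start + chunk_size])
        start += chunk_size - overlap
    return chunks
```

Each chunk is then embedded individually, which keeps embeddings focused on a small span of text and improves retrieval precision.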
- Docker
- Visual Studio Code (VS Code) with the Dev Containers extension
- Install Docker Desktop
- Enable WSL 2
- Start Docker Desktop
- Verify Docker Installation
- Build the Docker Image
- Run the Docker Container locally
- Install Visual Studio Code
- Install the Dev Containers Extension
- Clone the Repository
- Reopen in Container
- VS Code will build the Docker container and open the project inside the container.
Set the necessary environment variable for the Groq API key:

```bash
export GROQ_API_KEY=your_groq_api_key
```
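Application code can then fail fast with a clear message if the key is missing, instead of surfacing a confusing authentication error deep inside a request. A small hypothetical helper (not part of the repository) showing the pattern:

```python
import os

def require_groq_key() -> str:
    # Read the Groq API key from the environment and fail early
    # with an actionable message if it has not been exported.
    key = os.environ.get("GROQ_API_KEY")
    if not key:
        raise RuntimeError(
            "GROQ_API_KEY is not set; run `export GROQ_API_KEY=...` first."
        )
    return key
```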
- Running the Pipeline:

  - Convert PDFs to Markdown, split text, generate embeddings, and upload data to Qdrant:

    ```bash
    python run_pipeline.py
    ```

- Running the FastAPI Server:

  - Start the FastAPI server to handle API requests:

    ```bash
    python chatbot_api.py
    ```
- Running the Streamlit Application:

  - In a different terminal, start the Streamlit application to provide a user interface for the chatbot:

    ```bash
    streamlit run chatbot_ui.py --server.address 127.0.0.1
    ```
- Interact with the Chatbot:

  - Open your web browser and go to http://localhost:8501
  - Enter your query in the input box and click "Get Response" to receive a response from the chatbot.
This chatbot is part of the overall Redback Senior initiative, aimed at creating intelligent support systems for the elderly, ensuring their well-being, and providing them with reliable access to important information through wearable technology.
Feel free to explore, contribute, and be part of this important initiative!