Ask4ROCm_Chatbot

Learn ROCm with a chatbot powered by the AMD ROCm solution.

Demo

See resources/Ask4ROCm_Chatbot_Demo.gif for a recorded demo.

Directory Tree

Ask4ROCm_Chatbot/
├── ask4rocm.py  //chat mode App
├── qa4rocm.py  //query mode App
├── data  // source for indexing
│   └── rocm-docs-amd-com-radeon-en-latest.pdf
├── LICENSE
├── README.md
├── requirements.txt
├── resources
│   └── Ask4ROCm_Chatbot_Demo.gif
└── tools
    └── query_gpu.py

Run it

(See the following sections on hardware and software to make sure the environment is ready before running it.)

streamlit run ask4rocm.py

App Features

  • Select EngineMode (Chat/QnA)
  • Select LLM (llama3, Qwen2)
  • Set the LLM temperature
  • Upload files, then rebuild and save the index
  • Clear chat history
  • Help links to the ROCm documentation and this repo's source code
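
A minimal Streamlit sketch of the settings controls above might look like the following; the widget labels and option lists are illustrative, not the app's actual code:

import streamlit as st

# Sidebar controls corresponding to the features above (illustrative only).
with st.sidebar:
    engine_mode = st.radio("Engine mode", ["Chat", "QnA"])
    llm_name = st.selectbox("LLM", ["llama3", "qwen2"])
    temperature = st.slider("Temperature", 0.0, 1.0, 0.1)
    uploaded = st.file_uploader("Upload files to rebuild the index",
                                accept_multiple_files=True)
    if st.button("Clear chat history"):
        st.session_state.pop("messages", None)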

Supported Hardware

Any AMD GPU supported by ROCm should work with this chatbot. You may find the supported GPU list here.

  • AMD CDNA GPUs: MI300 / MI200 / MI100, etc.
  • AMD RDNA GPUs: Radeon 7000 series / Radeon 6000 series / iGPU 780M, etc.
  • AMD CPUs (without ROCm)
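
To confirm that ROCm actually detects a GPU, you can run rocm-smi; the snippet below is an illustrative stand-in in the spirit of tools/query_gpu.py, not that script itself:

import shutil
import subprocess

# Print basic GPU info via rocm-smi, assuming ROCm is installed and on PATH.
if shutil.which("rocm-smi"):
    print(subprocess.run(["rocm-smi"], capture_output=True, text=True).stdout)
else:
    print("rocm-smi not found; ROCm may not be installed (CPU-only mode).")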


Software Installation

This chatbot depends on several open-source projects:

  • Ubuntu OS
  • AMD ROCm
  • WebUI: Streamlit
  • RAG pipeline: LlamaIndex
  • VectorDB: ChromaDB
  • LLM inference engine: Ollama
    • LLM: Llama3-8B / tinyllama, or others
    • Embedding Model: nomic-embed-text

ROCm

Refer to https://rocm.docs.amd.com/projects/install-on-linux/en/latest/tutorial/quick-start.html to install the ROCm components.

Then set up a base Python environment to run the chatbot application. You may use conda or a Python venv to manage it.

Ollama

Install Ollama by following the instructions at https://ollama.com/download:

curl -fsSL https://ollama.com/install.sh | sh

Then download the models as required:

ollama pull llama3
ollama pull tinyllama
ollama pull nomic-embed-text
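
To sanity-check that the Ollama server and models respond, you can use the optional ollama Python client (pip install ollama, not required by this app); this is just a verification sketch:

import ollama  # optional client: pip install ollama

# Ask llama3 a one-off question to confirm the server and model work.
reply = ollama.chat(model="llama3",
                    messages=[{"role": "user", "content": "What is ROCm?"}])
print(reply["message"]["content"])

# Confirm the embedding model returns a vector.
emb = ollama.embeddings(model="nomic-embed-text", prompt="ROCm")
print(len(emb["embedding"]), "dimensions")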

LlamaIndex

pip install llama-index
pip install llama-index-core llama-index-readers-file 
pip install llama-index-llms-ollama llama-index-embeddings-ollama
pip install llama-index-vector-stores-chroma
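
With these packages, the core RAG pipeline behind the app can be sketched roughly as follows. The model names and the data/ path mirror this repo, but the ChromaDB path and collection name are illustrative; treat this as a sketch, not the app's actual code:

import chromadb
from llama_index.core import (Settings, SimpleDirectoryReader,
                              StorageContext, VectorStoreIndex)
from llama_index.embeddings.ollama import OllamaEmbedding
from llama_index.llms.ollama import Ollama
from llama_index.vector_stores.chroma import ChromaVectorStore

# Route generation and embedding through the local Ollama server.
Settings.llm = Ollama(model="llama3", request_timeout=120.0)
Settings.embed_model = OllamaEmbedding(model_name="nomic-embed-text")

# Persist the vectors in a local ChromaDB collection.
db = chromadb.PersistentClient(path="./chroma_db")
collection = db.get_or_create_collection("ask4rocm")
store = ChromaVectorStore(chroma_collection=collection)
ctx = StorageContext.from_defaults(vector_store=store)

# Index the ROCm PDF under data/ and answer a question over it.
docs = SimpleDirectoryReader("data").load_data()
index = VectorStoreIndex.from_documents(docs, storage_context=ctx)
print(index.as_query_engine().query("How do I install ROCm for Radeon GPUs?"))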

Streamlit

pip install streamlit
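
For reference, a bare-bones Streamlit chat loop looks roughly like the following. It talks to Ollama directly for brevity and skips the RAG index, so it only illustrates the UI pattern, not this app's code:

import ollama
import streamlit as st

st.title("Ask4ROCm Chatbot")

# Keep the conversation in session state across reruns.
if "messages" not in st.session_state:
    st.session_state.messages = []

# Replay the conversation so far.
for msg in st.session_state.messages:
    with st.chat_message(msg["role"]):
        st.write(msg["content"])

if prompt := st.chat_input("Ask something about ROCm"):
    st.session_state.messages.append({"role": "user", "content": prompt})
    with st.chat_message("user"):
        st.write(prompt)
    # Plain LLM call for brevity; the real app routes this through the RAG index.
    reply = ollama.chat(model="llama3", messages=st.session_state.messages)
    answer = reply["message"]["content"]
    st.session_state.messages.append({"role": "assistant", "content": answer})
    with st.chat_message("assistant"):
        st.write(answer)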

Alternatively, run pip install -r requirements.txt to install all of the Python dependencies in one step.


PyTorch_ROCm

This app does not depend on PyTorch at the moment, but we suggest installing PyTorch-rocm for further work.

Install PyTorch-rocm from https://pytorch.org/, e.g. on Linux:

pip3 install torch torchvision torchaudio --index-url https://download.pytorch.org/whl/rocm6.0
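
Once installed, you can confirm that the ROCm build of PyTorch sees your GPU (on ROCm, the torch.cuda API is backed by HIP):

import torch

# On a ROCm build of PyTorch the CUDA API is backed by HIP, so these calls work.
print("ROCm/HIP version:", torch.version.hip)   # None on CPU-only or CUDA builds
print("GPU available:", torch.cuda.is_available())
if torch.cuda.is_available():
    print("Device:", torch.cuda.get_device_name(0))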

Appendix

If you are a beginner with ROCm, LlamaIndex, and Ollama, here are other repos that may help you learn and get hands-on with them.

  1. Step-by-step labs, from Jupyter notebooks to a WebUI demo, for creating RAG apps with ROCm + LlamaIndex + Ollama:
    https://github.com/alexhegit/RAG_LLM_QnA_Assistant

  2. Miscellaneous ROCm hands-on exercises: https://github.com/alexhegit/Playing-with-ROCm


@misc{AlexTryMachineLearning,
  author =   {He Ye (Alex)},
  title =    {Ask4ROCm_Chatbot: assist to learn ROCm with RAG},
  howpublished = {\url{https://alexhegit.github.io/}},
  year = {2024--}
}
