Self-host a ChatGPT-style web interface for Ollama 🦙

Run Ollama and Open WebUI on the same Fly Machine!


Deploy

Everyone loves a one-liner — let's clone the repo and deploy the app with flyctl:

```sh
fly launch --from https://github.com/fly-apps/ollama-open-webui
```

That's it! When you visit https://[app].fly.dev you should see the Open WebUI interface, where you can sign up to create the initial admin account. You can then optionally disable further signups, keeping the app private, by setting ENABLE_SIGNUP = "false" in the [env] section of your fly.toml.
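For reference, a minimal sketch of that change; [env] is the standard fly.toml section for environment variables, and ENABLE_SIGNUP is the Open WebUI setting mentioned above:

```toml
# fly.toml: disable new signups once your admin account exists
[env]
  ENABLE_SIGNUP = "false"
```

Changes to fly.toml take effect on the next fly deploy.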

Important

By default, the app runs on Fly GPUs, Nvidia L40S to be exact. This can be customized in the fly.toml vm settings. It should also run on a standard (CPU-only) Fly Machine, since Ollama leverages llama.cpp, which supports CPU inference, but performance will be drastically reduced.
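As a sketch, the relevant fly.toml section looks something like the following; the preset names here are assumptions, so check `fly platform vm-sizes` (or Fly's GPU docs) for what's actually available:

```toml
# fly.toml: choose the Machine the app runs on (a sketch; preset
# names are assumptions — see `fly platform vm-sizes` for the real list)
[[vm]]
  size = "l40s"               # the default GPU preset described above
  # size = "performance-8x"  # a CPU-only preset: cheaper, much slower inference
```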

Scaling to Zero

By default, the app scales to zero. This is recommended (especially with GPUs) to save on costs. When the Fly proxy receives a new request for a stopped app, the Machine boots in ~3s, with the Web UI server ready to serve requests in ~15s. Loading models into VRAM can take a bit longer, depending on the size of the model.
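This behavior is driven by standard fly.toml [http_service] settings, sketched below; the internal port is an assumption for this app:

```toml
# fly.toml: the standard Fly settings behind scale-to-zero (a sketch)
[http_service]
  internal_port = 8080          # assumed Open WebUI port for this app
  auto_stop_machines = "stop"   # stop the Machine when traffic goes idle
  auto_start_machines = true    # the Fly proxy boots it on the next request
  min_machines_running = 0      # let the app scale all the way to zero
```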

Having trouble?

Create an issue or ask a question here: https://community.fly.io/
