SD Chat is a helper app for setting up Stable Diffusion WebUI and Llama on your own computer. It is inspired by Diffusion Chat and built from the Tauri + Next.js template.
English | 简体中文
- Check hardware requirements and show stats
- Install requirements and start the WebUI with one click
- PyTorch 2.0
- Save your prompts
- Made with Rust, Next.js, TypeScript, and Tailwind CSS
- Multiple themes via DaisyUI
- llama.cpp main executable bundled as a sidecar
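In Tauri, a bundled sidecar like the llama.cpp executable is declared under `externalBin`. A minimal sketch of the relevant `tauri.conf.json` fragment, assuming a Tauri 1.x layout and a binary placed at `binaries/main` (the path and name are assumptions, not SD Chat's actual configuration):

```json
{
  "tauri": {
    "bundle": {
      "externalBin": ["binaries/main"]
    }
  }
}
```

Tauri resolves the platform target triple suffix (e.g. `main-x86_64-pc-windows-msvc.exe`) automatically at bundle time.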
- Download sdchat.msi from Releases and install it on Windows.
- On the Quick Start page, select the folder where Stable Diffusion WebUI is located.
- Click the Start WebUI button and wait for it to finish starting.
- If there are no errors, the WebUI should be up and running; click the Open WebUI button.
- Alternatively, click the WebUI menu in the right sidebar.
- Click the Chat menu and type to chat with Stable Diffusion.
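Under the hood, chatting with Stable Diffusion typically goes through the WebUI's HTTP API (`/sdapi/v1/txt2img`, available when the WebUI runs with its API enabled). A minimal TypeScript sketch; the helper name, defaults, and port are illustrative assumptions, not SD Chat's actual code:

```typescript
// Shape of a minimal txt2img request body for the stable diffusion WebUI API.
interface Txt2ImgPayload {
  prompt: string;
  negative_prompt: string;
  steps: number;
  width: number;
  height: number;
}

// Build the JSON body for a txt2img request with a few common defaults.
// (Hypothetical helper for illustration.)
function buildTxt2ImgPayload(prompt: string, steps = 20): Txt2ImgPayload {
  return {
    prompt,
    negative_prompt: "",
    steps,
    width: 512,
    height: 512,
  };
}

// Example request — requires the WebUI to be running with the API enabled:
// const res = await fetch("http://127.0.0.1:7860/sdapi/v1/txt2img", {
//   method: "POST",
//   headers: { "Content-Type": "application/json" },
//   body: JSON.stringify(buildTxt2ImgPayload("a watercolor fox")),
// });
// const { images } = await res.json(); // array of base64-encoded images
```

The response's `images` field contains base64-encoded image data that a chat UI can render inline.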
- There is an encoding problem when chatting with Llama. Help wanted!
- Only Windows 10/11 with an Nvidia GPU is supported for now.
- OpenAI API
- Manage your models
- Prompts Gallery
- Configurations
- AMD GPU
- macOS
- Linux
This is a Tauri project template using Next.js, bootstrapped by combining create-next-app and create-tauri-app.
This template uses pnpm as the Node.js dependency manager.
After cloning for the first time, set up git pre-commit hooks:
pnpm prepare
To develop and run the frontend in a Tauri window:
pnpm dev
This will load the Next.js frontend directly in a Tauri webview window, in addition to starting a development server on localhost:3000.
To export the Next.js frontend via SSG and build the Tauri application for release:
pnpm build
Please remember to change the bundle identifier in tauri.conf.json > tauri > bundle > identifier, as the default value will yield an error that prevents you from building the application for release.
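For reference, the identifier lives at this path in a Tauri 1.x tauri.conf.json; the reverse-domain value shown is a placeholder — substitute your own:

```json
{
  "tauri": {
    "bundle": {
      "identifier": "com.example.sdchat"
    }
  }
}
```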
Next.js frontend source files are located in src/ and Tauri Rust application source files are located in src-tauri/. Please consult the Next.js and Tauri documentation respectively for questions pertaining to either technology.
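As a rough orientation, a fresh checkout of this template typically looks like the following; file names are the template defaults and may differ in this repository:

```
src/                    # Next.js frontend (pages, components, styles)
src-tauri/
├── src/main.rs         # Tauri application entry point
├── tauri.conf.json     # Tauri configuration (bundle identifier, etc.)
└── Cargo.toml          # Rust dependencies
```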