# Using HunyuanDiT Inference with under 6GB GPU VRAM

## Instructions

Running HunyuanDiT with under 6GB of GPU VRAM is now supported, based on diffusers. Here we provide instructions and a demo for a quick start.

The 6GB lite version supports NVIDIA Ampere-architecture and newer GPUs, such as the RTX 3070/3080/4080/4090 and the A100.

The only thing you need to do is install the following libraries:

```bash
pip install -U bitsandbytes
pip install git+https://github.com/huggingface/diffusers
pip install torch==2.0.0
```
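
If you want to confirm the environment is ready before running the demo, a quick sanity check like the one below (optional, not part of the repository) can help:

```python
# Optional sanity check: confirm torch sees the GPU and bitsandbytes imports.
import torch
import bitsandbytes  # import fails here if bitsandbytes is missing

print(torch.__version__)              # expect 2.0.0 for the lite demo
print(torch.cuda.is_available())      # must be True to run on the GPU
print(torch.cuda.get_device_name(0))  # e.g. an Ampere-class or newer card
```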

Then you can enjoy your HunyuanDiT text-to-image journey with under 6GB of GPU VRAM!

Here is a demo for you.

```bash
cd HunyuanDiT

# Quick start
model_id=Tencent-Hunyuan/HunyuanDiT-v1.1-Diffusers-Distilled
prompt="一个宇航员在骑马"  # "An astronaut riding a horse"
infer_steps=50
guidance_scale=6
python3 lite/inference.py ${model_id} "${prompt}" ${infer_steps} ${guidance_scale}
```
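
If you prefer to call diffusers directly instead of the helper script, the rough idea behind the lite setup is to load the large mT5 text encoder in 8-bit via bitsandbytes and keep only the currently active component on the GPU with model CPU offloading. The snippet below is a minimal sketch of that approach, not the exact contents of lite/inference.py; the `text_encoder_2` subfolder name and the 8-bit loading flag are assumptions about the Diffusers checkpoint layout.

```python
# Minimal sketch of low-VRAM HunyuanDiT inference with diffusers.
# Assumption: the checkpoint stores its mT5 encoder under "text_encoder_2"
# and it can be loaded in 8-bit via bitsandbytes; this is not the exact code
# shipped in lite/inference.py.
import torch
from diffusers import HunyuanDiTPipeline
from transformers import T5EncoderModel

model_id = "Tencent-Hunyuan/HunyuanDiT-v1.1-Diffusers-Distilled"

# Load the large text encoder in 8-bit to cut its VRAM footprint.
text_encoder_2 = T5EncoderModel.from_pretrained(
    model_id,
    subfolder="text_encoder_2",
    load_in_8bit=True,
    device_map="auto",
)

pipe = HunyuanDiTPipeline.from_pretrained(
    model_id,
    text_encoder_2=text_encoder_2,
    torch_dtype=torch.float16,
)
# Move components to the GPU only while they are in use.
pipe.enable_model_cpu_offload()

image = pipe(
    prompt="一个宇航员在骑马",  # "An astronaut riding a horse"
    num_inference_steps=50,
    guidance_scale=6,
).images[0]
image.save("astronaut.png")
```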

Note: Other features in hydit require torch 1.13.1, so you may need to downgrade your torch version to use them:

```bash
pip install torch==1.13.1
```