
How to load custom onnx fp16 models (for example real drct gan)? That would be great improvement! #107

Open
zelenooki87 opened this issue Nov 10, 2024 · 9 comments

Comments

@zelenooki87 commented Nov 10, 2024

https://mega.nz/file/0gJwyIBA#fTdbXWb6zbWrQApg2VgNRbY_fh3wdy5f-mP4Oz1jVbU

Please add support for this model for the super-resolution task, since it is SOTA.

@Djdefrag (Owner)

Hi my friend,

the onnx file alone is not enough to implement the model.

I would need the GitHub project to better understand how to implement it.

@zelenooki87 (Author)

With this script it is working fine.
https://pastebin.com/DUAKpuF1
However, if I rename the model to your naming convention (for example, BSRGAN fp16) and change the parameters to float16 mode, your output tiling logic does not work properly with this model. The output is slightly blurred.
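For context on why tiling can soften results: if tiles are processed without enough overlapping context (or the overlap is not cropped consistently), the tile borders come out differently than a whole-image pass would produce. A minimal numpy sketch of the usual overlap-and-crop approach (a generic illustration only, not QualityScaler's actual code):

```python
import numpy as np

def upscale(tile, scale=2):
    # stand-in "model": nearest-neighbour upscale instead of a real network
    return tile.repeat(scale, axis=0).repeat(scale, axis=1)

def tiled_upscale(img, tile=32, overlap=4, scale=2):
    h, w = img.shape[:2]
    out = np.zeros((h * scale, w * scale) + img.shape[2:], img.dtype)
    for y in range(0, h, tile):
        for x in range(0, w, tile):
            # pad each tile with `overlap` pixels of context on every side
            y0, x0 = max(y - overlap, 0), max(x - overlap, 0)
            y1, x1 = min(y + tile + overlap, h), min(x + tile + overlap, w)
            up = upscale(img[y0:y1, x0:x1], scale)
            # crop the upscaled result back to the un-padded tile region
            ye, xe = min(y + tile, h), min(x + tile, w)
            cy, cx = (y - y0) * scale, (x - x0) * scale
            out[y * scale:ye * scale, x * scale:xe * scale] = \
                up[cy:cy + (ye - y) * scale, cx:cx + (xe - x) * scale]
    return out
```

With the nearest-neighbour stand-in, the tiled result matches a whole-image pass exactly; with a real network, the overlap supplies the context each tile border would otherwise lack, which is exactly what gets lost when the tiling parameters do not match the model.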

@Djdefrag (Owner)

The output is blurred?

@zelenooki87 (Author)

Not blurry, but the output is not as detailed as it should be:
1. Qualityscaler-DRCT
2. DRCT-myscript

@zelenooki87 (Author)

Could you add support for Real DRCT?

@Djdefrag (Owner)

Hi my friend,

I was trying to replicate the project to convert it to onnx.

Where did you find the onnx file you posted? I can't find it in the project's GitHub.

@zelenooki87 (Author)

The author removed the finetuned model from Google Drive.
I opened an issue here:
ming053l/DRCT#28
I had converted it to fp16 onnx because it performed much faster than the .pth in chaiNNer. The author later removed the finetuned model, as I said. It would be really nice to add support for it.

@zelenooki87 (Author)

Hi, @Djdefrag
Any news about DRCT support?
Thanks

@Djdefrag (Owner) commented Nov 22, 2024

Hi my friend,

I tried to replicate the DRCT torch model and convert it to onnx, but without success.

In any case, if you already have the onnx model, you can make it compatible with QualityScaler.

Essentially, if you have the onnx model in fp32 mode, you are already well on your way.
But an additional step is needed, because QualityScaler is designed to use onnx fp16 models that keep fp32 input.

To do this you can use the following code:

import onnx
from onnxconverter_common import float16

model_fp32_path = f"{selected_AI_model}_fp32.onnx"
model_fp16_path = f"{selected_AI_model}_fp16.onnx"

loaded_model_fp32 = onnx.load(model_fp32_path)
model_fp16 = float16.convert_float_to_float16(model=loaded_model_fp32, keep_io_types=True, max_finite_val=1e5)
onnx.save(model_fp16, model_fp16_path)

where selected_AI_model = "-DRCT-something"
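One caveat worth knowing with these conversion parameters: fp16 tops out at 65504, so larger fp32 values overflow to infinity unless they are clamped first, which is (as I understand it) what options like max_finite_val are there to control. A quick numpy check:

```python
import numpy as np

fp16_max = np.finfo(np.float16).max          # largest finite fp16 value: 65504.0
overflowed = np.float16(1e5)                 # 1e5 > 65504, so this overflows to inf
clamped = np.float16(np.clip(1e5, -fp16_max, fp16_max))  # clamping keeps it finite

print(fp16_max, overflowed, clamped)
```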
