Share finding for compression experiment #132
Open · KIMGEONUNG opened this issue May 11, 2024 · 3 comments
KIMGEONUNG commented May 11, 2024

I am reaching out to share some intriguing findings from a recent compression experiment. Our results demonstrate a significant reduction in model size without compromising quality. Briefly, we discovered that restoration models derived from large T2I models seldom utilize the coarse layers of the UNet. Simply removing the network blocks beyond a predetermined depth in the skip-connection setup has minimal impact on the results. Specifically, for StableSR, keeping only the blocks up to depth level 9, which use 60% of the parameters, is sufficient to achieve high-quality restoration.
Here are the quantitative results from the DIV2K test set:
[image: quantitative results table]

You can find more details about our research at the following link:
https://arxiv.org/abs/2401.17547
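
To make the idea concrete, here is a minimal PyTorch sketch of depth-skip pruning on a toy UNet. This is not the StableSR or paper code; the class name, channel counts, and the `prune_depth` argument are placeholders, and the real UNet has attention blocks, timestep embeddings, and more careful channel bookkeeping. The point is only that levels deeper than the chosen depth (and the bottleneck) are dropped, and decoding resumes from the feature at the cut.

```python
import torch
import torch.nn as nn


class ToyUNet(nn.Module):
    """Toy UNet with optional depth-skip pruning.

    If prune_depth is set, encoder/decoder levels deeper than that index
    (and the bottleneck) are dropped; decoding resumes directly from the
    feature at the cut depth.
    """

    def __init__(self, channels=(32, 64, 128, 256), prune_depth=None):
        super().__init__()
        self.full_depth = prune_depth is None
        n = len(channels) - 1 if self.full_depth else prune_depth
        self.stem = nn.Conv2d(3, channels[0], 3, padding=1)
        # Encoder: each kept block halves the resolution, channels[i] -> channels[i+1].
        self.downs = nn.ModuleList([
            nn.Conv2d(channels[i], channels[i + 1], 3, stride=2, padding=1)
            for i in range(n)
        ])
        # Bottleneck exists only in the full-depth model.
        self.mid = nn.Conv2d(channels[-1], channels[-1], 3, padding=1) if self.full_depth else None
        # Decoder: upsample back up, then fuse with the same-resolution skip feature.
        self.ups = nn.ModuleList([
            nn.ConvTranspose2d(channels[i + 1], channels[i], 4, stride=2, padding=1)
            for i in range(n)
        ])
        self.fuses = nn.ModuleList([
            nn.Conv2d(2 * channels[i], channels[i], 3, padding=1)
            for i in range(n)
        ])
        self.head = nn.Conv2d(channels[0], 3, 3, padding=1)

    def forward(self, x):
        h = self.stem(x)
        skips = []
        for down in self.downs:        # descend only as deep as the model was built
            skips.append(h)
            h = down(h)
        if self.mid is not None:
            h = self.mid(h)
        for i in reversed(range(len(self.ups))):   # climb back through matching levels
            h = self.fuses[i](torch.cat([self.ups[i](h), skips.pop()], dim=1))
        return self.head(h)


full = ToyUNet()                   # full-depth model
pruned = ToyUNet(prune_depth=2)    # blocks beyond depth 2 are removed
x = torch.randn(1, 3, 64, 64)
print(full(x).shape, pruned(x).shape)   # both: torch.Size([1, 3, 64, 64])
n_params = lambda m: sum(p.numel() for p in m.parameters())
print(n_params(pruned) / n_params(full))  # < 1.0: the pruned model is smaller
```

In this sketch the deepest levels are simply never constructed, which is where the size reduction comes from; for an actual pruned model the surviving blocks would presumably be initialized from the pretrained checkpoint rather than retrained from scratch.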

cyy2427 commented May 15, 2024

It looks cool. Code available?

KIMGEONUNG (Author)

I will provide the code, maybe next week.

KIMGEONUNG (Author) commented Jun 18, 2024

@cyy2427, @IceClear

A minimal implementation of depth-skip pruning for StableSR is now available at the link below:
https://github.com/KIMGEONUNG/StableSR_Depth-skip?tab=readme-ov-file
