Test minimum and maximum supported dependencies #918
Conversation
It would also be nice to remove pretrainedmodels and efficientnet-pytorch, as both projects were abandoned a long time ago. I think @isaaccorley's fork basically only had deps on timm and torch, which made it really easy to maintain.
How would you feel about getting rid of all non-timm encoders? I think timm has everything pretrainedmodels and efficientnet-pytorch had and more, and it's actually maintained.
Not sure how to debug the segfault issue. Maybe it's time to try to reduce the number of dependencies first.
Looking through GitHub, I see that most repos are using non-timm encoders. While I recommend using timm encoders, I think it's better to keep these dependencies and encoders for backward compatibility.
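(For context, a minimal sketch of what using a timm-backed encoder looks like, assuming the "tu-" timm-universal prefix that SMP provides; the specific backbone name below is just an illustration, not a recommendation from this thread.)

```python
import torch
import segmentation_models_pytorch as smp

# SMP exposes timm backbones through the "tu-" (timm-universal) encoder prefix,
# so a model can be built without pretrainedmodels or efficientnet-pytorch.
model = smp.Unet(
    encoder_name="tu-resnet34",   # any backbone name known to timm
    encoder_weights="imagenet",
    in_channels=3,
    classes=1,
)

x = torch.rand(1, 3, 256, 256)
with torch.no_grad():
    mask = model(x)
print(mask.shape)  # torch.Size([1, 1, 256, 256])
```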
Can we at least remove pretrainedmodels? It hasn't been updated in 6 years. The field of deep learning has progressed a lot in 6 years...
I'm not seeing any problem staying with …
@adamjstewart thanks a lot for working on this! I appreciate your help and efforts to make the library easier to maintain 🙂
(force-pushed from ee72928 to 01f41ac)
I think this is finally ready for review. Note that these minimum versions only cover the 68% of our code that is exercised by our unit tests. The other 32% of our code, as well as the documentation builds, is not currently tested. We really need to increase test coverage to ensure confidence in these dependency bounds.
]
dynamic = ['version']

[project.optional-dependencies]
docs = [
    'autodocsumm',
    'huggingface-hub',
    'six==1.15.0',
    'sphinx<7',
    'sphinx-book-theme==1.1.2',
We should probably test that newer versions of sphinx and sphinx-book-theme actually work, but this requires #920.
This reverts commit 17389a6.
(force-pushed from 4530792 to 57e55aa)
run: |
  python -m pip install --upgrade pip uv
  python -m uv pip install ruff==0.5.2
run: python -m pip install -r requirements/test.txt
# Update output format to enable automatic inline annotations.
- name: Run Ruff Linter
Instead of hard-coding this run here, what do you think about setting up https://pre-commit.com/? Then, once you set up https://pre-commit.ci in the repo, it enables the auto-fix feature on PRs...
I have mixed feelings about this. I like the ability to auto-fix issues, just not auto-commits. I'm on the fence. We haven't enabled this in TorchGeo either.
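(For reference, a minimal sketch of what such a setup could look like: a hypothetical .pre-commit-config.yaml with the Ruff hooks, plus the pre-commit.ci option that disables auto-commit of fixes on PRs, which addresses the concern above. The file contents are an assumption, not part of this PR.)

```yaml
# .pre-commit-config.yaml (hypothetical; not part of this PR)
repos:
  - repo: https://github.com/astral-sh/ruff-pre-commit
    rev: v0.5.2            # keep in sync with the version pinned in CI
    hooks:
      - id: ruff
        args: [--fix]      # auto-fix when run locally
      - id: ruff-format

# pre-commit.ci settings: run the checks on PRs, but do not push auto-fix commits
ci:
  autofix_prs: false
```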
@@ -0,0 +1,10 @@
efficientnet-pytorch==0.7.1
Why pin the required versions? Shouldn't we leave them unpinned and install the latest versions automatically?
This file is only used for CI and is controlled by dependabot. The dependencies used during installation are found in pyproject.toml.
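(For readers unfamiliar with that workflow, a rough sketch of the kind of Dependabot configuration that keeps such a pinned requirements file up to date; the directory and schedule below are illustrative assumptions, not the repo's actual config.)

```yaml
# .github/dependabot.yml (illustrative values only)
version: 2
updates:
  - package-ecosystem: "pip"
    directory: "/requirements"   # folder holding the pinned *.txt files
    schedule:
      interval: "weekly"
```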
@qubvel is this ready to merge, or are we waiting on additional test coverage?
Waiting for …
Looks good to me, let's merge it!
Hi @qubvel and @adamjstewart, thanks to @adamjstewart for contributing to making the library easier to maintain. Here are some of my thoughts:
I have attempted to reimplement the DPN model, maybe we can open a new PR to discuss this?
I guess my stance is that just because these libraries are "stable" doesn't mean they are "useful". If the models they implement have now been added to timm, we could use the well-tested timm versions instead, allowing us to greatly simplify SMP. With that said, I don't personally have the time to rip them out and replace them. @isaaccorley did this in TorchSeg, so I'm hoping he can do it here too.
RE: efficientnet-pytorch: I just spent an hour helping a student install SMP on Windows because pip failed when installing efficientnet-pytorch. If efficientnet-pytorch had wheels or used pyproject.toml, this issue would be easy to solve, but it hasn't had a new release in many years and is likely no longer supported. Timm also has EfficientNet, EfficientNetV2, and many other EfficientNet variants, so why do we need this dependency? The same goes for pretrainedmodels and torchvision: all of these backbones are already available in timm. We could streamline installation of SMP down to only a couple of dependencies without losing any features.
The comment I provided earlier is just my personal opinion. If needed, I can create a PR to help improve it. My comment for keeping … Of course, torchseg is a great repository, very clean and straightforward. By the way, I also think mmit is excellent. Ultimately, the final decision rests with @qubvel; I'm just sharing my thoughts. And of course, I also greatly appreciate @adamjstewart's previous contributions.
With this PR, we test both the minimum and latest versions of all dependencies on every commit. This ensures that the dependencies listed in pyproject.toml actually reflect the true range of supported versions (a rough sketch of such a CI matrix follows the issue references below).
Closes #916
Closes #894
Closes #849
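(As a rough sketch of the idea described above, assuming a GitHub Actions matrix; the requirements file names and Python version are illustrative assumptions, not necessarily what this PR uses. requirements/test.txt is the only file name quoted elsewhere in this thread.)

```yaml
# Illustrative workflow excerpt for testing minimum and latest dependency pins.
jobs:
  tests:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        requirements: [minimum, latest]
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.12"
      - name: Install minimum supported dependencies
        if: matrix.requirements == 'minimum'
        run: python -m pip install -r requirements/minimum.txt -r requirements/test.txt
      - name: Install latest dependencies
        if: matrix.requirements == 'latest'
        run: python -m pip install -r requirements/required.txt -r requirements/test.txt
      - name: Run tests
        run: python -m pytest
```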