
Test minimum and maximum supported dependencies #918

Merged (37 commits) on Sep 27, 2024

Conversation

@adamjstewart (Collaborator) commented Sep 4, 2024:

With this PR, we test both the minimum and the latest versions of all dependencies on every commit. This ensures that the dependency bounds listed in pyproject.toml actually reflect the true range of supported versions.

Closes #916
Closes #894
Closes #849
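For anyone reproducing this setup: a common way to exercise the lower bounds declared in pyproject.toml is to derive a pinned requirements file from them and install it in a dedicated CI job, alongside the usual job that installs the latest versions. A minimal sketch of that idea in Python (the helper and the output file name are illustrative, not necessarily how this PR implements it):

```python
# Hypothetical helper: turn the ">=" lower bounds declared in pyproject.toml
# into "==" pins so a dedicated CI job can install and test the oldest
# supported versions. Requires Python 3.11+ (tomllib) and the 'packaging' package.
import tomllib

from packaging.requirements import Requirement

with open("pyproject.toml", "rb") as f:
    project = tomllib.load(f)["project"]

pins = []
for dep in project.get("dependencies", []):
    req = Requirement(dep)
    for spec in req.specifier:
        if spec.operator == ">=":
            pins.append(f"{req.name}=={spec.version}")

# Illustrative output path; the "minimum versions" CI job would install this file.
with open("requirements-min.txt", "w") as f:
    f.write("\n".join(pins) + "\n")
```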

@adamjstewart (Collaborator Author) commented:
It would also be nice to remove pretrainedmodels and efficientnet-pytorch, as both projects were abandoned a long time ago. I think @isaaccorley's fork basically only had deps on timm and torch, which made it really easy to maintain.

@adamjstewart (Collaborator Author) commented:
How would you feel about getting rid of all non-timm encoders? I think timm has everything pretrainedmodels and efficientnet-pytorch had and more, and it's actually maintained.
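For context, SMP can already consume timm backbones through the tu- prefix, so most architectures from the legacy encoder libraries have a timm-backed counterpart. A rough sketch, with illustrative encoder names:

```python
# Sketch: the same U-Net built with a classic built-in encoder vs. a
# timm-backed encoder (the "tu-" prefix). Encoder names are illustrative.
import segmentation_models_pytorch as smp

# Classic built-in encoder.
model_classic = smp.Unet(encoder_name="resnet34", encoder_weights="imagenet", classes=2)

# timm-backed encoder; timm exposes far more architectures and is actively maintained.
model_timm = smp.Unet(encoder_name="tu-resnet34", encoder_weights="imagenet", classes=2)
```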

@adamjstewart (Collaborator Author) commented:
Not sure how to debug the segfault issue. Maybe it's time to try to reduce the number of dependencies first.

@qubvel (Collaborator) commented Sep 5, 2024:

Looking through GitHub, I see that most downstream repos are using non-timm encoders. While I recommend using timm encoders, I think it's better to keep these dependencies and encoders for backward compatibility.
We can deprecate them and then make the dependencies optional, but I'm not sure removing them entirely would be a good idea.

@adamjstewart (Collaborator Author) commented:
Can we at least remove pretrainedmodels? It hasn't been updated in 6 years. The field of deep learning has progressed a lot in 6 years...

@qubvel (Collaborator) commented Sep 5, 2024:

I don't see a problem with keeping pretrainedmodels: its API can't break precisely because it's no longer updated, and it has soft requirements. But of course we can remove it. A few models depend only on its "pretrained_settings", which can be copied; other encoders (Inception-like, DPN) could be hidden behind an optional dependency.
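For context, the "pretrained_settings" mentioned here are just small dictionaries of checkpoint URLs and preprocessing constants, so copying them into this repo is cheap. A hedged sketch of the shape such an entry takes (the URL is a placeholder, not a real checkpoint location):

```python
# Sketch of a vendored "pretrained_settings" entry in the style used by
# pretrainedmodels. The URL below is a placeholder, not a real checkpoint.
pretrained_settings = {
    "se_resnet50": {
        "imagenet": {
            "url": "https://example.com/se_resnet50-imagenet.pth",  # placeholder
            "input_space": "RGB",
            "input_size": [3, 224, 224],
            "input_range": [0, 1],
            "mean": [0.485, 0.456, 0.406],
            "std": [0.229, 0.224, 0.225],
            "num_classes": 1000,
        }
    }
}
```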

@qubvel (Collaborator) commented Sep 5, 2024:

@adamjstewart thanks a lot for working on this! I appreciate your help and efforts to make the library easier to maintain 🙂

@adamjstewart force-pushed the tests/minimum branch 3 times, most recently from ee72928 to 01f41ac on September 12, 2024 08:56
@adamjstewart (Collaborator Author) commented:
I think this is finally ready for review.

Note that these minimum versions are only verified for the 68% of our code that is covered by our unit tests. The other 32% of our code, as well as the documentation builds, is not currently tested. We really need to increase test coverage to have full confidence in these dependency bounds.

@adamjstewart marked this pull request as ready for review on September 12, 2024 11:41
]
dynamic = ['version']

[project.optional-dependencies]
docs = [
'autodocsumm',
'huggingface-hub',
'six==1.15.0',
'sphinx<7',
'sphinx-book-theme==1.1.2',
@adamjstewart (Collaborator Author) commented:

We should probably test that newer versions of sphinx and sphinx-book-theme actually work, but this requires #920.

run: |
python -m pip install --upgrade pip uv
python -m uv pip install ruff==0.5.2
run: python -m pip install -r requirements/test.txt
# Update output format to enable automatic inline annotations.
- name: Run Ruff Linter
A reviewer commented:

Instead of running it directly here, what do you think about setting up https://pre-commit.com/ for this, and then adding https://pre-commit.ci to the repo so that it enables the auto-fix feature on PRs?

@adamjstewart (Collaborator Author) replied:

I have mixed feelings about this. I like the ability to auto-fix issues, just not auto-commits. I'm on the fence. We haven't enabled this in TorchGeo either.

@@ -0,0 +1,10 @@
efficientnet-pytorch==0.7.1
A reviewer commented:

Why pin the requirement? Shouldn't we leave it unpinned and install the latest version automatically?

@adamjstewart (Collaborator Author) replied:

This file is only used for CI and is controlled by dependabot. The dependencies used during installation are found in pyproject.toml.

@adamjstewart (Collaborator Author) commented:
@qubvel is this ready to merge, or are we waiting on additional test coverage?

@yohann84L commented:
Waiting for timm>=0.9.2; it would be great to merge this PR, @adamjstewart @qubvel 🥳

@qubvel (Collaborator) left a review:

Looks good to me, let's merge it!

@adamjstewart merged commit 9e48177 into qubvel-org:main on Sep 27, 2024
3 checks passed
@adamjstewart deleted the tests/minimum branch on September 27, 2024 10:13
@brianhou0208 (Contributor) commented:
Hi @qubvel and @adamjstewart,

Thanks to @adamjstewart for contributing to making the library easier to maintain.
I agree that pretrainedmodels and efficientnet-pytorch are quite outdated.
I also agree with @qubvel that keeping these non-timm models doesn't seem to have any negative impact, and they are relatively stable compared to timm precisely because pretrainedmodels hasn't been updated in six years.

Here are some of my thoughts:

  • efficientnet-pytorch: I think we can keep this one, as it only depends on torch and is quite clean.
  • pretrainedmodels: This repo integrates many classic models, but some of them actually rely on torchvision.
    • For VGG, ResNet, DenseNet, etc., we could copy the "pretrained_settings" and add new pretrained weights from torchvision (link1, link2); see the sketch after this list.
    • DPN, SENet, Xception, etc. could be reimplemented in this repo.
  • torchvision: Given the dependency between torch and torchvision, maybe we could add more models like ConvNeXt, EfficientNetV2, etc.
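On pulling newer weights from torchvision: its multi-weight enums already expose a checkpoint URL plus the preprocessing metadata that a pretrained_settings-style entry needs. A rough sketch, assuming torchvision >= 0.13 and an illustrative field mapping:

```python
# Sketch: building a pretrained_settings-style entry from a torchvision weight enum.
# Assumes torchvision >= 0.13 (multi-weight API); the field mapping is illustrative.
from torchvision.models import ResNet50_Weights

weights = ResNet50_Weights.IMAGENET1K_V2
preprocess = weights.transforms()  # classification preset carrying mean/std

entry = {
    "url": weights.url,
    "num_classes": len(weights.meta["categories"]),
    "mean": list(preprocess.mean),
    "std": list(preprocess.std),
}
print(entry)
```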

I have attempted to reimplement the DPN model; maybe we can open a new PR to discuss this?

@adamjstewart (Collaborator Author) commented Oct 19, 2024:

I guess my stance is that just because these libraries are "stable" doesn't mean they are "useful". If the models they implement have now been added to timm, we could use the well-tested timm versions instead, allowing us to greatly simplify SMP. With that said, I don't personally have the time to rip them out and replace them. @isaaccorley did this in TorchSeg, so I'm hoping he can do it here too.

@adamjstewart (Collaborator Author) commented:
RE: efficientnet-pytorch

I just spent an hour helping a student install SMP on Windows because pip failed when installing efficientnet-pytorch. If efficientnet-pytorch had wheels or used pyproject.toml, this issue would be easy to solve, but it hasn't had a new release in many years and is likely no longer supported. timm also has EfficientNet, EfficientNetV2, and many other EfficientNet variants, so why do we need this dependency?

The same goes for pretrainedmodels and torchvision: all of these backbones are already available in timm. We could streamline installation of SMP down to only a couple of dependencies without losing any features.
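For example, timm can already serve as the EfficientNet encoder via its feature-extraction mode, which is essentially what a segmentation encoder needs. A small sketch with illustrative model and stage choices:

```python
# Sketch: replacing efficientnet-pytorch with timm's feature-extraction mode.
# Model name and out_indices are illustrative.
import timm
import torch

encoder = timm.create_model(
    "efficientnet_b0",
    pretrained=False,       # set True to download ImageNet weights
    features_only=True,     # return multi-scale feature maps
    out_indices=(0, 1, 2, 3, 4),
)

x = torch.randn(1, 3, 224, 224)
features = encoder(x)
print([f.shape for f in features])  # one tensor per selected stage
```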

@brianhou0208 (Contributor) commented:
The comment I provided earlier is just my personal opinion. If needed, I can create a PR to help improve it.

My argument for keeping efficientnet-pytorch is simply that it only depends on PyTorch, without needing any additional packages. As long as it can be installed, efficientnet-pytorch should run without any issues.

Of course, torchseg is a great repository: very clean and straightforward. By the way, I also think mmit is excellent.

Ultimately, the final decision rests with @qubvel; I'm just sharing my thoughts. And of course, I greatly appreciate @adamjstewart's previous contributions.
