API incompatibility with importlib.metadata (or at least the API is not type-safe?) #486
Comments
It's important to remember that […]. My initial recommendation would be that third-party providers should depend on […]. I wonder if […]. If providers were to follow that recommendation, it would not only address the reported issue, but it would also obviate the need for #487. Alternatively, I wonder, too, about the long-term strategy for this project. Should it exist indefinitely? Should it be deprecated/retired, and when? Are there other options to consider?
That sentiment does not seem to be generally shared (for example, see the thread starting in pypa/pyproject-hooks#195 (comment)). One of the main concerns is non-deterministic behaviour at runtime.
I suppose the main advantage of keeping a separate project for […]. That is more interesting while the API has not reached stability. Once the API is considered stable, would it make sense to keep the project separate for agility in terms of bugfixes, or would the gains at that point be considered minimal?
My objective with this recommendation was to reduce the number of variables, making things more deterministic. As has been pointed out, this backport struggles because it provides compatibility both forward and backward, and also because it supports two API surfaces (providers and consumers). If we reduce the variability of providers, asking them to have a strategy for how to behave in a given environment, that reduces the problem space substantially and puts the control in the hands of the consumer and/or environment builder. That is, if there were guidance for providers and they followed that guidance, they would no longer be responsible for managing the complexity.
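A minimal sketch of what such provider guidance could amount to (the finder name and structure below are invented for illustration, not actual project guidance): the provider commits to exactly one metadata API at import time, preferring the backport when the environment supplies it, so every object it yields comes from a single class hierarchy.

```python
# Hypothetical provider-side strategy (an assumption, not documented guidance):
# depend on exactly one metadata API, chosen once at import time.
try:
    import importlib_metadata as metadata_api  # the backport, if installed
except ImportError:
    import importlib.metadata as metadata_api  # stdlib fallback


class MyFinder:
    """Sketch of a provider exposing the find_distributions() hook."""

    def find_spec(self, *args, **kwargs):
        return None  # not an importer; only a metadata provider

    def find_distributions(self, context=None):
        # A real provider would yield metadata_api.Distribution subclasses
        # describing the packages it knows about; empty here for brevity.
        return iter([])
```

Because the choice is made once, consumers in a given environment see a consistent `Distribution` type from this provider rather than a mix.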
The problem with stability is that you have to choose between stability and adaptability. The API has been largely stable since Python 3.10, with essentially minor tweaks and shoring up of the API's limitations. I've encountered at least half a dozen cases, however, where users have wanted newer behavior on older Pythons, such as the improved import-time performance or support for symlinks.

In addition to the main benefit of providing forward compatibility and the faster release cadence (a preview of changes), this package provides test functionality that's not available in the stdlib (performance benchmarks, xfailed tests, integration tests, coverage checks, linting and formatting, type checks). The developer ergonomics here are so much better than in CPython (to the extent that I'm dreaming of a world where all (or many) stdlib modules are developed independently and then later incorporated into the stdlib).
The idea of monkey-patching providers sounds fraught with complexity and risks, so it's probably a non-starter. I can imagine instead a scenario where, at run time, […]. That way, providers don't need to do anything - they provide […].
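One way that run-time conversion could look on the consumer side (a sketch: the `CompatDistribution` and `normalize` names are invented, and it adapts toward the stdlib class purely to stay self-contained, whereas the real library would adapt foreign objects toward `importlib_metadata.Distribution`):

```python
import importlib.metadata as stdlib_metadata


class CompatDistribution(stdlib_metadata.Distribution):
    """Thin adapter: delegate the two abstract methods to the wrapped object."""

    def __init__(self, wrapped):
        self._wrapped = wrapped

    def read_text(self, filename):
        return self._wrapped.read_text(filename)

    def locate_file(self, path):
        return self._wrapped.locate_file(path)


def normalize(dists):
    # Pass native instances through; wrap everything else, so the consumer
    # only ever handles one Distribution interface.
    for dist in dists:
        if isinstance(dist, CompatDistribution):
            yield dist
        else:
            yield CompatDistribution(dist)
```

The wrapper works because both `Distribution` classes funnel everything through `read_text()` and `locate_file()`; derived properties like `metadata` and `version` then come from the adapter's own class.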
After working on documentation for the proposed approach, I'm now reconsidering the strategy (of replacing instances). I'm wondering if, instead, both libraries should expose (compatible) Protocols for […]. That's similar to the approach abravalheri originally proposed in #487, but using more public-facing interfaces that will also be published downstream in CPython. Another option could be to proceed with #505, addressing the needs of most consumers, and then deal with support for custom providers separately. Or! We could extend #505 to include new protocols, thereby making it easier for custom providers to implement the requisite interfaces alongside the newer warnings (while still getting the benefits of converting incompatible classes to compatible ones). Now I'm leaning in that direction.
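A minimal sketch of what such a shared Protocol could look like (the name `DistributionLike` and the chosen members are assumptions for illustration, not the interface actually proposed in #505):

```python
from typing import Optional, Protocol, runtime_checkable


# Hypothetical shared surface: both libraries' Distribution classes already
# implement these two abstract methods, so a structural Protocol can accept
# either one without requiring any inheritance relationship between them.
@runtime_checkable
class DistributionLike(Protocol):
    def read_text(self, filename: str) -> Optional[str]:
        """Return the text of a metadata file, or None if absent."""

    def locate_file(self, path):
        """Resolve a path relative to the distribution's location."""
```

Because the check is structural, `isinstance(dist, DistributionLike)` succeeds for a stdlib `importlib.metadata.Distribution` as well as the backport's class, which is exactly the property a consumer-facing annotation would need.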
When 3rd-party `MetaPathFinder`s are implemented using `importlib.metadata`, the return types in `importlib_metadata` are not respected, which causes the API to behave in an unexpected way (with unexpected errors). This is an example similar to the one identified in pypa/pyproject-hooks#195 (comment) and pypa/setuptools#4338:
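(The original snippet was not captured in this excerpt.) A hedged reconstruction of the same failure mode, with all names invented and using only the stdlib so it stays self-contained: in the actual report the provider uses `importlib.metadata` while the consumer uses `importlib_metadata`, but the mechanism is the same - `distributions()` hands back exactly the instances the provider created, without converting their class.

```python
import sys
import importlib.metadata as stdlib_metadata


class ProvidedDistribution(stdlib_metadata.Distribution):
    """A Distribution created by the provider, not by the consumer's library."""

    def read_text(self, filename):
        if filename in ("METADATA", "PKG-INFO"):
            return "Metadata-Version: 2.1\nName: provided-pkg\nVersion: 0.1\n"
        return None

    def locate_file(self, path):
        return path


class Provider:
    """A sys.meta_path entry exposing the find_distributions() provider hook."""

    def find_spec(self, *args, **kwargs):
        return None  # not an importer; only a metadata provider

    def find_distributions(self, context=None):
        yield ProvidedDistribution()


sys.meta_path.append(Provider())

# Whichever library the consumer calls, the objects it receives are the very
# instances the provider created - their class is whatever the provider chose.
provided = [d for d in stdlib_metadata.distributions()
            if isinstance(d, ProvidedDistribution)]
```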
The expected behaviour, as per the API documentation, would be that every yielded object is an `importlib_metadata.Distribution`.
It seems that the origin of this problem is a little "lie" in the API definition. Instead of:

```python
importlib_metadata.Distribution.discover(...) -> Iterable[importlib_metadata.Distribution]
```

what actually happens is:

```python
importlib_metadata.Distribution.discover(...) -> Iterable[importlib_metadata.Distribution | importlib.metadata.Distribution]
```

and that propagates throughout the whole API.
I haven't tested, but there is potential for other internal errors too, if `importlib_metadata` internally relies on the objects having type `importlib_metadata.Distribution` in order to call newer APIs. It is probably worthwhile to change the return type of `importlib_metadata.Distribution.discover(...)` to `Iterable[importlib_metadata.Distribution | importlib.metadata.Distribution]` and then run the type checkers on the lowest supported Python (I suppose Python 3.8) to see if everything is OK.

It also means that consumers of `importlib_metadata` cannot rely on the newer APIs (unless they are sure that 3rd-party packages installed in their environment are not using `importlib.metadata`).
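Until the annotations or the providers change, one defensive pattern for consumers is to probe for backport-only attributes rather than assume them. A sketch (the helper name is invented, and `origin` here merely stands in for any attribute that may exist only on the backport's class):

```python
import importlib.metadata as stdlib_metadata


def safe_origin(dist):
    # Hypothetical defensive helper: objects coming from a provider built on
    # importlib.metadata may simply not have backport-only attributes, so
    # fall back to None instead of raising AttributeError.
    return getattr(dist, "origin", None)
```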