GHA: include reverse dependency testing #639
Conversation
So far, this goes well 😂

======================= reverse dependency tests summary =======================
PASSED:
FAILED: esda, geosnap, giddy, inequality, mapclassify, mgwr, momepy, pointpats, region, segregation, spaghetti, spglm, spint, splot, spopt, spreg, spvcm, tobler
It seems that built-in datasets in …
Amazing work, @martinfleis!

As a follow-up, would it be prudent to migrate martinfleis/reverse-dependency-testing into the pysal org?
That is just a fork so I can work on it. It lives at https://github.com/scientific-python/reverse-dependency-testing, so there will be no moving, just a merge of my branch to main once I fix the parallelisation settings, which don't seem to work at the moment.
Should we merge this prior to merging #638?
No, let me merge the changes to the action before that, so we link to the scientific-python repo. But we should merge this before the next release to ensure that none of the downstream test failures are caused by us. We should also go through them and report to downstream where fixes are needed.

There are also a number of tests failing because the test files are not available in the installed versions of the packages. We can either fetch those over the wire or decide on a federation-wide pytest.mark to be able to skip them here.
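For illustration, a minimal sketch of what such a shared mark could look like in a `conftest.py`. The mark name `needs_test_data` and the `--skip-missing-data` flag are hypothetical placeholders, not an agreed federation convention:

```python
# conftest.py — sketch of a federation-wide pytest mark for tests that
# need data files which are not shipped in installed distributions.
# "needs_test_data" and "--skip-missing-data" are illustrative names only.
import pytest


def pytest_addoption(parser):
    parser.addoption(
        "--skip-missing-data",
        action="store_true",
        default=False,
        help="skip tests that require data files absent from installed packages",
    )


def pytest_configure(config):
    # Register the mark so pytest does not warn about an unknown marker.
    config.addinivalue_line(
        "markers",
        "needs_test_data: test requires files not included in the installed package",
    )


def pytest_collection_modifyitems(config, items):
    # In a reverse dependency run (testing the installed package rather than
    # a source checkout), skip every test carrying the shared mark.
    if not config.getoption("--skip-missing-data"):
        return
    skip = pytest.mark.skip(reason="test data not shipped with the installed package")
    for item in items:
        if "needs_test_data" in item.keywords:
            item.add_marker(skip)
```

Downstream tests would then carry `@pytest.mark.needs_test_data`, and the reverse dependency job would pass `--skip-missing-data` (keying the skip off an environment variable would work just as well).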
Questions here: do we want the verbose output or not? Maybe yes, until we resolve the downstream issues?
I agree. At least verbose for now, but possibly something else in the future? Perhaps logging that provides a detailed report if failures occur? No idea if that's possible or not.
Do you mean when it should be triggered/scheduled? That is a good question, since it takes quite a while to run... It would be nice to have it on each PR, but that may not be feasible. I'd say at least twice a month for a schedule, though?
No idea how I would do that. pytest can produce some JSON reports, but how to create an artefact from those is beyond my current knowledge. I'd skip that for now.
What about every push to main, and then together with the scheduled CI? I would not run it on every push to a branch/PR, but main might be good.
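For reference, a hedged sketch of what that trigger combination could look like in the workflow's `on:` block; the cron cadence below is illustrative, not something agreed in this thread:

```yaml
# Sketch only — the actual workflow file in this PR may differ.
on:
  push:
    branches: [main]        # every push to main
  schedule:
    - cron: "0 6 1,15 * *"  # illustrative: 06:00 UTC on the 1st and 15th
  workflow_dispatch:        # allow manual runs, e.g. before cutting a release
```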
Same and agreed.
This sounds good to me. |
Okay. Gonna wait for scientific-python/reverse-dependency-testing-action#1 and then update the action and merge.
This is very much WIP because, to make this work, we need to ensure that the state of CI in the downstream PySAL packages is as it should be.

Once this works, it ensures that a libpysal release does not break anything in the rest of the federation (xref #631). I'll see if there's some more work needed on the side of the action (I created the beta version during the Scientific Python summit, so I have admin rights over it), but it will likely require changes downstream (e.g. you can see that tobler does not properly skip H3 tests, etc.).

The trigger is temporary, to make sure we can properly test it here. We can then discuss when this should run, but it should certainly run before any release, and in the optimal situation it should be required to pass for a release.