
User specified number of significant figures for numerical outputs #283

Merged: drbenvincent merged 4 commits into main from round_to_summary on Feb 5, 2024.

Conversation


drbenvincent added the "enhancement" (New feature or request) label on Dec 22, 2023.
drbenvincent (Collaborator, Author) commented on Dec 22, 2023

Pfft. These doctests are brittle. Despite my initially thinking they were a great idea, at the moment they're causing more hassle than they're catching real problems. Not sure what you think, @jpreszler?

jpreszler (Contributor) commented

There are definitely some that are particularly problematic, and they will continue to be so without some significant changes. I think the value comes over the long term, but it will be hard to get there without some improvements. The options I see:

  1. Turn them off, at least temporarily.
  2. Split the example and output into separate docstrings (could be difficult for some longer examples). This would flag the biggest problems (an exception from input changes) but would make the actual example output static. This could be a good temporary fix in some places.
  3. Replace the more problematic examples. I generally grabbed things from the other docs, which are nice, realistic examples, but they have coefficients near 0 or HDIs near or including 0. This is perhaps the biggest issue, since the output may change sign in a way that makes even the pytest numeric accuracy checks too strict. It's likely possible to have examples that show the right output but sacrifice some realism for basic numeric stability, making the doctests far more reliable (see the sketch below, after this list).

My vote would be to temporarily turn off the tests, replace the examples for at least PrePostFit (flaky), RegressionDiscontinuity (flaky), and Instrumental Variables (slow), and then turn doctests back on. I should be able to work on this soon, perhaps even getting better examples over the next week.
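A minimal sketch in the spirit of options 2 and 3 (loosening what the doctest checks, and choosing numerically stable outputs). The functions and numbers here are hypothetical stand-ins, not anything from the CausalPy codebase:

```python
"""Two ways to make numeric doctests less brittle (hypothetical example).

Loosen what the doctest compares, so the code still runs (and an exception
still fails the test) but unstable digits are not checked:

>>> fit_model()  # doctest: +ELLIPSIS
Coefficient: ..., 94% HDI [...]

Or choose inputs whose output is numerically stable, then round:

>>> round(stable_coefficient(), 2)
1.25
"""

import random


def fit_model():
    """Stand-in for an MCMC fit whose printed numbers vary run to run."""
    print(f"Coefficient: {random.gauss(0.01, 0.02):.4f}, 94% HDI [-0.03, 0.05]")


def stable_coefficient():
    """Stand-in for an estimate far from zero, where rounding is safe."""
    # Anything in [1.2499, 1.2509] rounds to 1.25 at two decimal places.
    return 1.2499 + random.uniform(0, 0.001)
```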

jpreszler (Contributor) commented

Created #284. For now, I think you can just comment out the doctest call in the CI until that issue is resolved.

drbenvincent (Collaborator, Author) commented

Note to self: I'll finish and merge this PR after #286 is merged.

codecov bot commented on Jan 5, 2024

Codecov Report

Attention: 7 lines in your changes are missing coverage. Please review.

Comparison is base (6283c76) 76.50% compared to head (7e41931) 76.50%.

❗ Current head 7e41931 differs from pull request most recent head 22f93ff. Consider uploading reports for the commit 22f93ff to get more accurate results.

Files                          Patch %   Lines
causalpy/pymc_experiments.py   41.66%    7 Missing ⚠️
Additional details and impacted files
@@           Coverage Diff           @@
##             main     #283   +/-   ##
=======================================
  Coverage   76.50%   76.50%           
=======================================
  Files          20       20           
  Lines        1345     1345           
=======================================
  Hits         1029     1029           
  Misses        316      316           

☔ View full report in Codecov by Sentry.

drbenvincent (Collaborator, Author) commented on Jan 5, 2024

Well, I guess I have to increase test coverage to make codecov pass.
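One way to do that is to unit-test the rounding logic directly. A minimal sketch of a significant-figures helper plus a pytest for it; the name `round_num` and its exact behaviour are assumptions for illustration, not necessarily what this PR implements:

```python
import math


def round_num(value: float, round_to: int) -> float:
    """Round `value` to `round_to` significant figures."""
    if value == 0:
        return 0.0
    # Decimal places needed so the result keeps `round_to` significant figures.
    digits = round_to - int(math.floor(math.log10(abs(value)))) - 1
    return round(value, digits)


def test_round_num():
    assert round_num(3.14159, 3) == 3.14
    assert round_num(0.0012345, 2) == 0.0012
    assert round_num(12345.0, 2) == 12000.0
```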

juanitorduz (Collaborator) left a comment


Minor change request ;) Otherwise LGTM.

Two review threads on causalpy/pymc_experiments.py (outdated, resolved).
drbenvincent merged commit e79271b into main on Feb 5, 2024 (9 checks passed).
drbenvincent deleted the round_to_summary branch on Feb 5, 2024 at 13:23.
Labels: enhancement (New feature or request)
Projects: none yet
Development: successfully merging this pull request may close "User specified number of significant figures for numerical result outputs"
3 participants