MontePy Submission #205

Open · 19 of 32 tasks
MicahGale opened this issue Jul 1, 2024 · 37 comments

@MicahGale commented Jul 1, 2024

Submitting Author: @MicahGale
All current maintainers: @MicahGale, @tjlaboss
Package Name: MontePy
One-Line Description of Package: MontePy is a python library for reading, editing, and writing MCNP input files.
Repository Link: https://github.com/idaholab/MontePy
Version submitted: 0.5.2
EiC: @cmarmo
Editor: @kellyrowland
Reviewer 1: @munkm
Reviewer 2: @jpmorgan98
Archive: TBD
JOSS DOI: TBD
Version accepted: TBD
Date accepted (month/day/year): TBD


Code of Conduct & Commitment to Maintain Package

Description

  • Include a brief paragraph describing what your package does:

MontePy is a Python library for reading, editing, and writing MCNP input files. MCNP is the Monte Carlo N-Particle radiation transport code; it supports 37 particle types and is widely used in nuclear engineering and medical physics. MontePy provides an object-oriented interface to MCNP input files, which makes it easy to automate many different tasks when working with them. MontePy does not support MCNP output files.
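
To make the object-oriented workflow concrete, here is a minimal sketch of the read-edit-write cycle, assuming the read_input, materials, number, and write_to_file names behave as in the current MontePy documentation; the file names and the renumbering offset are purely illustrative:

    import montepy

    # Read an existing MCNP input file into an object model.
    problem = montepy.read_input("model.imcnp")

    # One of the "tedious update" use cases described under Scope below:
    # shift every material number by 1000 so this model can be merged with
    # another model without numbering clashes.
    for material in problem.materials:
        material.number += 1000

    # Write the edited model back out, preserving the original formatting.
    problem.write_to_file("model_renumbered.imcnp")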

Scope

  • Please indicate which category or categories.
    Check out our package scope page to learn more about our
    scope. (If you are unsure of which category you fit, we suggest you make a pre-submission inquiry):

    • Data retrieval
    • Data extraction
    • Data processing/munging
    • Data deposition
    • Data validation and testing
    • Data visualization¹
    • Workflow automation
    • Citation management and bibliometrics
    • Scientific software wrappers
    • Database interoperability

Domain Specific

  • Geospatial
  • Education

Community Partnerships

If your package is associated with an
existing community please check below:

  • For all submissions, explain how and why the package falls under the categories you indicated above. In your explanation, please address the following points (briefly, 1-2 sentences for each):

    • Who is the target audience and what are scientific applications of this package?

      • Scientists and engineers who use MCNP and know Python are the primary audience, mostly nuclear engineers and medical physicists. Use cases are:

      • Automating tedious updates of simulation models (e.g., renumbering all materials to merge two models)

      • Automating the generation of many permutations of a model for optimization, sensitivity analysis, etc.

      • Extracting information from an existing model in a more legible way.

    • Are there other Python packages that accomplish the same thing? If so, how does yours differ?

      • There is also MCNPy, which purports to provide similar features; I have been unable to verify this because I have been unable to install it. MontePy differs by being written purely in Python rather than Java, and by having a publicly accessible GitHub repository where anyone can open an issue.
    • If you made a pre-submission enquiry, please paste the link to the corresponding issue, forum post, or other discussion, or @tag the editor you contacted:

Technical checks

For details about the pyOpenSci packaging requirements, see our packaging guide. Confirm each of the following by checking the box. This package:

  • does not violate the Terms of Service of any service it interacts with.
  • uses an OSI approved license.
  • contains a README with instructions for installing the development version.
  • includes documentation with examples for all functions.
  • contains a tutorial with examples of its essential functions and uses.
  • has a test suite.
  • has continuous integration set up, such as GitHub Actions, CircleCI, and/or others.

Publication Options

JOSS Checks
  • The package has an obvious research application according to JOSS's definition in their submission requirements. Be aware that completing the pyOpenSci review process does not guarantee acceptance to JOSS. Be sure to read their submission requirements (linked above) if you are interested in submitting to JOSS.
  • The package is not a "minor utility" as defined by JOSS's submission requirements: "Minor ‘utility’ packages, including ‘thin’ API clients, are not acceptable." pyOpenSci welcomes these packages under "Data Retrieval", but JOSS has slightly different criteria.
  • The package contains a paper.md matching JOSS's requirements with a high-level description in the package root or in inst/.
  • The package is deposited in a long-term repository with the DOI:

Note: JOSS accepts our review as theirs. You will NOT need to go through another full review. JOSS will only review your paper.md file. Be sure to link to this pyOpenSci issue when a JOSS issue is opened for your package. Also be sure to tell the JOSS editor that this is a pyOpenSci reviewed package once you reach this step.

Are you OK with Reviewers Submitting Issues and/or pull requests to your Repo Directly?

This option will allow reviewers to open smaller issues that can then be linked to PRs, rather than submitting a denser text-based review. It will also allow you to demonstrate addressing the issues via PR links.

  • Yes I am OK with reviewers submitting requested changes as issues to my repo. Reviewers will then link to the issues in their submitted review.

Confirm each of the following by checking the box.

  • I have read the author guide.
  • I expect to maintain this package for at least 2 years and can help find a replacement for the maintainer (team) if needed.

Please fill out our survey

P.S. Have feedback/comments about our review process? Leave a comment here

Editor and Review Templates

The editor template can be found here.

The review template can be found here.

Footnotes

  1. Please fill out a pre-submission inquiry before submitting a data visualization package.

@cmarmo (Member) commented Jul 9, 2024

Editor in Chief checks

Hi @MicahGale ! Thank you for submitting your package for pyOpenSci review.
Below are the basic checks that your package needs to pass to begin our review.
If some of these are missing, we will ask you to work on them before the review process begins.

Please check our Python packaging guide for more information on the elements
below.

  • Installation The package can be installed from a community repository such as PyPI (preferred), and/or a community channel on conda (e.g. conda-forge, bioconda).
    • The package imports properly into a standard Python environment (import package).
  • Fit The package meets criteria for fit and overlap.
  • Documentation The package has sufficient online documentation to allow us to evaluate package function and scope without installing the package. This includes:
    • User-facing documentation that overviews how to install and start using the package.
    • Short tutorials that help a user understand how to use the package and what it can do for them.
    • API documentation (documentation for your code's functions, classes, methods and attributes): this includes clearly written docstrings with variables defined using a standard docstring format.
  • Core GitHub repository Files
    • README The package has a README.md file with clear explanation of what the package does, instructions on how to install it, and a link to development instructions.
    • Contributing File The package has a CONTRIBUTING.md file that details how to install and contribute to the package.
    • Code of Conduct The package has a CODE_OF_CONDUCT.md file.
    • License The package has an OSI approved license.
      NOTE: We prefer that you have development instructions in your documentation too.
  • Issue Submission Documentation All of the information is filled out in the YAML header of the issue (located at the top of the issue template).
  • Automated tests Package has a testing suite and is tested via a Continuous Integration service.
  • Repository The repository link resolves correctly.
  • Package overlap The package doesn't entirely overlap with the functionality of other packages that have already been submitted to pyOpenSci.
  • Archive (JOSS only, may be post-review): The repository DOI resolves correctly.
  • Version (JOSS only, may be post-review): Does the release version given match the GitHub release (v1.0.0)?

  • Initial onboarding survey was filled out
    We appreciate each maintainer of the package filling out this survey individually. 🙌
    Thank you authors in advance for setting aside five to ten minutes to do this. It truly helps our organization. 🙌


Editor comments

MontePy is in very good shape, congratulations to all the maintainers for your hard work!

Some comments about the checklist above.

  • In the documentation about installing a specific version: I strongly recommend removing any manual process that creates symbolic links and installs requirements into a folder. Virtual environments should always be preferred, and you can install specific versions/tags/branches or commits in them with
pip install git+https://github.com/idaholab/MontePy.git@<your-version/tag/branch/commit>
  • The README.md redirects to the documentation for details about the installation: may I suggest explicitly adding the pip install command? This will clarify that the package is available from PyPI but not from any conda channel.
  • The code of conduct is linked in contributing.md (by the way, thanks for making the contact e-mail explicit), but the file is missing and the link gives a "404 not found".

Minor comments not needed to start the review:

  • the documentation generally moves from more readable content to more technical and detailed content: I would put the API reference at the end of the documentation.
  • wearing my data processing engineer hat, I would be grateful for a safe default option that backs up original files when overwriting... I was checking the basic usage documentation and the behaviour of your writing function really scared me... 😅

Final note: I enjoyed this quote from your documentation 🪄

Creating a new universe is very straight forward. You just need to initialize it with a new number

@MicahGale (Author) commented Jul 10, 2024

@cmarmo thank you for the feedback!

I have opened this PR: idaholab/MontePy#440 to address this feedback.

In the documentation about installing a specific version: I strongly recommend removing any manual process that creates symbolic links and installs requirements into a folder. Virtual environments should always be preferred, and you can install specific versions/tags/branches or commits in them with

These directions are old and date from when this was an internal tool. I just removed them and instead pointed to using pip install montepy==<version>. Most versions are on PyPI, and those that aren't can't be, because they don't have an OSS license yet. So if someone really wants a version that old, I think they will figure out on their own how to manage.

The README.md redirects to the documentation for details about the installation: may I suggest explicitly adding the pip install command? This will clarify that the package is available from PyPI but not from any conda channel.

Added the command.

The code of conduct is linked in contributing.md (by the way, thanks for making the contact e-mail explicit), but the file is missing and the link gives a "404 not found".

Corrected this and actually added a boilerplate code of conduct.

the documentation generally moves from more readable content to more technical and detailed content: I would put the API reference at the end of the documentation.

Good point. I updated the index to list the API documentation last.

wearing my data processing engineer hat, I would be grateful for a safe default option that backs up original files when overwriting... I was checking the basic usage documentation and the behaviour of your writing function really scared me... 😅

This warning was written when MontePy used to discard user formatting and comments, which is no longer the case. write_to_file has no default option that would overwrite the original file. I think now the only real risk with overwriting the original file is having a buggy script that accidentally changes the model. I changed the warning to discourage this sort of workflow, which should make script development easier.
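
For illustration, the pattern the revised warning encourages is writing edits to a new path rather than overwriting the source model; a minimal sketch, with hypothetical file names and the same read_input/write_to_file calls discussed above:

    import montepy

    problem = montepy.read_input("original.imcnp")
    # ... edit the model here ...

    # Writing to a separate path keeps the original input intact; overwriting
    # "original.imcnp" is possible but discouraged while a script is still
    # being developed and debugged.
    problem.write_to_file("edited.imcnp")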

After more consideration (mostly from others) I think this behavior should be changed, and an issue has been opened: idaholab/MontePy#442.

Final note: I enjoyed this quote from your documentation 🪄

Creating a new universe is very straight forward. You just need to initialize it with a new number

I sometimes forget how silly the concept of universes in these models sounds, especially when working with them.

@cmarmo (Member) commented Jul 17, 2024

Thank you @MicahGale for your prompt response to my comments.
Let me know when your PR is merged so I can start looking for an editor.

@MicahGale (Author)

Ok this PR has been merged.

@cmarmo (Member) commented Jul 19, 2024

Thank you @MicahGale !
Your package looks ready for review to me.

I noticed that you submitted to JOSS independently (see openjournals/joss-reviews#6977): may I suggest merging the two submissions, as pyOpenSci has a partnership with JOSS.
This will also lower the pressure on our two communities, as both pyOpenSci and JOSS are based on volunteer engagement.
If you are OK with that, I can comment on the related JOSS issue and we can follow up here.

cmarmo self-assigned this Jul 19, 2024
lwasser moved this from pre-review-checks to seeking-editor in peer-review-status Jul 19, 2024
@MicahGale (Author)

Yes let's merge them if that makes sense. I just did things in a bit of a different order.

@cmarmo (Member) commented Aug 13, 2024

Hello @MicahGale, I'm glad to announce that @kellyrowland has agreed to be the editor for the MontePy review.
Thank you so much, Kelly!

I'll let her introduce herself here, and I wish you all a happy review process! 🚀

cmarmo removed their assignment Aug 13, 2024
lwasser moved this from seeking-editor to under-review in peer-review-status Aug 13, 2024
@kellyrowland (Collaborator)

Hi -

This is my first engagement with pyOpenSci, so thanks in advance for your patience. 😅 I've been an editor for JOSS for a few years, and that's how we've arrived here.

@MicahGale before I get started on finding reviewers, I see there are a set of JOSS-related boxes to tick off - can you take a look at those and check them off/open PRs/etc. and let me know about the status of those items?

Thanks for tagging some possible reviewers over in openjournals/joss-reviews#6977 - I'll ping folks in this issue and make a post with the editor template once we've got two reviewers on board.

-Kelly

@MicahGale (Author) commented Aug 16, 2024

Thank you for being willing to take on this new role for this package. :)

Ok I updated the JOSS section accordingly.

The one concern I had was about getting a DOI for archiving the software. Under the JOSS guidelines it seems like that's a final step?

Upon successful completion of the review, authors will make a tagged release of the software, and deposit a copy of the repository with a data-archiving service such as Zenodo or figshare, get a DOI for the archive, and update the review issue thread with the version number and DOI.

Are you alright with following the JOSS order for this?

@kellyrowland (Collaborator)

Good point, thanks. I think archiving the release and getting a DOI is a logical last step since it's often the case that changes are made to the software during the review process.

@kellyrowland (Collaborator)

@cmarmo it looks like the remaining "Core GitHub repository Files" item is set - could you please take a look and check that off at your earliest convenience? I think I should be set to ping potential reviewers at that point.

@cmarmo (Member) commented Aug 20, 2024

Done! Thank you Kelly!

@kellyrowland (Collaborator)

hi @paulromano @munkm 👋 would you be interested in and available to review this pyOpenSci submission?

the reviewer template that you would use can be seen at https://www.pyopensci.org/software-peer-review/appendices/templates.html#peer-review-template .

if you're not available for the review, could you suggest other potential reviewers for the package?

@munkm commented Aug 22, 2024

I would love to! But I won't be able to review until after September 15th. Will that be an issue? If it is, I'll suggest an alternate.

@kellyrowland (Collaborator)

@MicahGale does the above timeline work for you?

@MicahGale (Author)

Yes, @kellyrowland, @munkm, that timeline works for me.

@kellyrowland (Collaborator)

hello @jpmorgan98 @gwenchee 👋 would you be interested in and available to review this pyOpenSci submission?

the reviewer template that you would use can be seen at https://www.pyopensci.org/software-peer-review/appendices/templates.html#peer-review-template .

@jpmorgan98

@kellyrowland I am willing! However, I will not be available until Sept. 31st. Let me know if that's an issue.

@MicahGale (Author)

@jpmorgan98, I am fine with that. I understand end of the federal fiscal year crunch time.

@kellyrowland (Collaborator)

thanks all! I think we can assign the reviewers and then I'll check back in...

@cmarmo is that something you could do, edit the anchor post to add the reviewers? I don't think I'm able to as a drop-in editor here.

@kellyrowland (Collaborator)

@MicahGale @munkm @jpmorgan98 @cmarmo just FYI I will be out of office starting September 30 and will return on October 21.

@MicahGale (Author)

@munkm, @jpmorgan98 I just wanted to check in with you both and see if you might have some time for this review now that we are in a new fiscal year.

@jpmorgan98

@MicahGale I had this penciled in for Monday of next week. If you'd like, I can move some stuff around and start tomorrow evening. Sorry about the delays, and thanks for your understanding!

@MicahGale (Author)

No that's great. I just wanted to check in.

@jpmorgan98 commented Oct 16, 2024

Package Review

Please check off boxes as applicable, and elaborate in comments below. Your review is not limited to these topics, as described in the reviewer guide

  • As the reviewer I confirm that there are no conflicts of interest for me to review this work (If you are unsure whether you are in conflict, please speak to your editor before starting your review).

Documentation

The package includes all the following forms of documentation:

  • A statement of need clearly stating problems the software is designed to solve and its target audience in README.
  • Installation instructions: for the development version of the package and any non-standard dependencies in README.
  • Vignette(s) demonstrating major functionality that runs successfully locally.
  • Function Documentation: for all user-facing functions.
  • Examples for all user-facing functions.
  • Community guidelines including contribution guidelines in the README or CONTRIBUTING.
  • Metadata including author(s), author e-mail(s), a url, and any other relevant metadata e.g., in a pyproject.toml file or elsewhere.

Readme file requirements
The package meets the readme requirements below:

  • Package has a README.md file in the root directory.

The README should include, from top to bottom:

  • The package name
  • Badges for:
    • Continuous integration and test coverage,
    • Docs building (if you have a documentation website),
    • A repostatus.org badge,
    • Python versions supported,
    • Current package version (on PyPI / Conda).

NOTE: If the README has many more badges, you might want to consider using a table for badges: see this example. Such a table should be wider than it is tall. (Note that a badge for pyOpenSci peer review will be provided upon acceptance.)

  • Short description of package goals.
  • Package installation instructions
  • Any additional setup required to use the package (authentication tokens, etc.)
  • Descriptive links to all vignettes. If the package is small, there may only be a need for one vignette which could be placed in the README.md file.
    • Brief demonstration of package usage (as it makes sense - links to vignettes could also suffice here if package description is clear)
  • Link to your documentation website.
  • If applicable, how the package compares to other similar packages and/or how it relates to other packages in the scientific ecosystem.
  • Citation information

Usability

Reviewers are encouraged to submit suggestions (or pull requests) that will improve the usability of the package as a whole.
Package structure should follow general community best-practices. In general please consider whether:

  • Package documentation is clear and easy to find and use.
  • The need for the package is clear
  • All functions have documentation and associated examples for use
  • The package is easy to install

Functionality

  • Installation: Installation succeeds as documented.
  • Functionality: Any functional claims of the software have been confirmed.
  • Performance: Any performance claims of the software have been confirmed.
  • Automated tests:
    • All tests pass on the reviewer's local machine for the package version submitted by the author. Ideally this should be a tagged version making it easy for reviewers to install.
    • Tests cover essential functions of the package and a reasonable range of inputs and conditions.
  • Continuous Integration: Has continuous integration setup (We suggest using Github actions but any CI platform is acceptable for review)
  • Packaging guidelines: The package conforms to the pyOpenSci packaging guidelines.
    A few notable highlights to look at:
    • Package supports modern versions of Python and not End of life versions.
    • Code format is standard throughout package and follows PEP 8 guidelines (CI tests for linting pass)

For packages also submitting to JOSS

Note: Be sure to check this carefully, as JOSS's submission requirements and scope differ from pyOpenSci's in terms of what types of packages are accepted.

The package contains a paper.md matching JOSS's requirements with:

  • A short summary describing the high-level functionality of the software
  • Authors: A list of authors with their affiliations
  • A statement of need clearly stating problems the software is designed to solve and its target audience.
  • References: With DOIs for all those that have one (e.g. papers, datasets, software).

Final approval (post-review)

  • The author has responded to my review and made changes to my satisfaction. I recommend approving this package.

Estimated hours spent reviewing:
4

Review Comments

A very well-developed package with well-written, clear documentation. It always makes me happy to see open source work supporting radiation transport! I am not the most avid MCNP user, but I have had my fair share of evenings lost to unknown syntax errors in my input decks, so it's exciting to see a project address that in a well-developed and thought-out way.

Docstrings are very well written and integrate nicely into a well-thought-out documentation site. They seem to be associated with every function at all depths.

I have some comments expanding on unchecked boxes above.

Itemized issues
  • Version test not passing on local machine (GitHub issue 572)
  • Demo notebook failing on local machine (GitHub issue 571)
  • Specific requested badges (specifically: Docs building, repostatus.org badge, Python versions supported)
  • Vignette from README not working locally. I have no foo.imcnp to compare to; maybe putting one in the demo directory and linking to it would be nice.
  • Citation information needed at the bottom of the README. I am guessing this will be the JOSS paper once approved, so just an acknowledgment that you'll do this once MontePy is accepted should suffice to check this off.
  • Noting similar packages. You mention ARMI, WATTS, and MCNPy in the paper, but they should also be mentioned in your README with a quick note about how MontePy is different.

NOTE: I do not have access to an installation of MCNP, so for the purposes of this review I cannot validate that the output decks MontePy produces actually work.

You mention in the paper some of the other packages that do this but are less general-purpose. I think PyNE also has some limited functionality to do this, though I could be mistaken. Another thing of note is that the other LANL Monte Carlo radiation transport application, MCATK, has a built-in Python interface, which only further demonstrates the need for something general-purpose like this for MCNP.

I also noticed that you just released version 0.5.0. First off, congrats; second, should we change which version we are reviewing?

Great job again on such a well developed package!

@MicahGale (Author)

Thank you @jpmorgan98 for your review and feedback!

re:version

I have been updating the submitted version with every release since this was started; however, I'm not sure if I should keep doing this. Depending on what branch you were looking at, you were reviewing either 0.4.1 or 0.4.2.devNNN (dev versions default to a patch release, so 0.4.2.devNNN became 0.5.0 on release because it was a minor release). So I'll leave it to @kellyrowland to decide how we should deal with this once she is able to return.

re: Competitors.

Yes, PyNE is definitely worth mentioning in the paper, and not just because Paul Wilson is my advisor. Since writing the first draft of this we have come across a lot more packages and opened a lazy issue (433) to track them. This should be finalized prior to finalizing this review, right?

re: Validating if the MCNP input files work

Can't you just run python -m montepy -c? :P Yes, this is a bit of a challenge sometimes, and it would require tedious manual review of the input files against the manual to see if the file makes sense. We do actually have a cyclical test where a file is read, written, and re-read to make sure nothing breaks. Obviously this isn't perfect, but it does catch a lot of low-hanging fruit.
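
For readers unfamiliar with that kind of test, it amounts to roughly the following round trip (a sketch only, not the actual test code; the path, the assertions, and the use of pytest's tmp_path fixture are illustrative assumptions):

    import montepy

    def test_round_trip(tmp_path):
        # Read the original file, write it back out, then read the rewritten copy.
        original = montepy.read_input("tests/inputs/test.imcnp")
        rewritten_file = tmp_path / "rewritten.imcnp"
        original.write_to_file(str(rewritten_file))
        rewritten = montepy.read_input(str(rewritten_file))
        # If the round trip broke something, these collections would disagree.
        assert len(original.cells) == len(rewritten.cells)
        assert len(original.materials) == len(rewritten.materials)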

@kellyrowland (Collaborator)

Fair question on versioning - my experience with JOSS is that folks usually submit version X.Y, which is reviewed, and then when any edits have gone in, we have the submitter cut a new release X.Z containing the edits; that release then gets an associated archive that goes with the publication. So, the JOSS publication is associated with some specific version of the software.

@MicahGale is there a pinned branch or version that you would preferably have the reviewers look at for being tagged along with the manuscript? For the JOSS review process it's generally not the case that the codebase gets a major overhaul for functionality, interface, etc. (often just manuscript edits), so any recent tag or version would be sufficient IMO. Let me know if you have any questions.

@MicahGale (Author)

I think it should be fine if everyone reviews develop. The latest release was a minor one that added a new feature, and fixed a few bugs. We won't do a major release during this review that would remove any current functionality. So I think just keeping up with the current releases should be fine, and will reduce the amount of duplicate issues that reviewers find.

@kellyrowland (Collaborator)

Sounds good to me, thanks.

@kellyrowland (Collaborator)

Hi @munkm, checking in here to see if you're able to get started on your review. If you need to hand it off, just let me know.

@munkm commented Nov 4, 2024

Thanks for checking in! I'm around, just a little bit behind where I'd like to be. I will try to complete my review this week.

@MicahGale (Author)

@jpmorgan98, I believe we have addressed all of your concerns through #578 and #573. Could you double-check our develop branch and make sure we did properly address them all?

@jpmorgan98

Great job! I just marked my approval above. Great job again!

@kellyrowland: does that finalize my review, or is there anything else I need to do on my end? Thanks!

@kellyrowland (Collaborator)

@jpmorgan98 I think that's good here, but please stay tuned just in case anything else is needed in the wrap-up stages later on.

@munkm just checking in 👋 please let me know if you need to set this down.

@MicahGale (Author)

@kellyrowland, this week is the ANS Winter Meeting, and I saw that @munkm is chairing some sessions, so I don't think she'll be able to review this this week.

@kellyrowland (Collaborator)

Thanks for the note! I'm more active in the HPC space than the nuclear engineering space these days, so not so plugged into the immediate ANS calendar. 🙂

@munkm commented Nov 25, 2024

@MicahGale you were right! Last week I was busy with ANS. I'll work on this this week. If not M-W, then before the Monday following the holiday. I'm sorry for my delay!
