v1.0.0 release #490
Merged
Conversation
* Adding argo search and download script
* Create get_argo.py: download the 'classic' Argo data with physical variables only
* begin implementing argo dataset
* 1st draft implementing argo dataset
* implement search_data for physical argo
* doctests and general cleanup for physical argo query
* beginning of BGC Argo download
* parse BGC profiles into DF
* plan to query BGC profiles
* validate BGC param input function
* order BGC params in the order in which they should be queried
* fix bug in parse_into_df(): init blank df to take in the union of params from all profiles
* identify profiles from initial API request containing all required params
* create df with only profiles that contain all user-specified params (additional params still need to be downloaded)
* modified to populate prof df by querying individual profiles
* finished up BGC argo download!
* assert bounding box type in Argo init, begin framework for unit tests
* need to confirm spatial extent is a bbox
* begin test case for available profiles
* add tests for argo.py
* add typing, add example json, and use it to test parsing
* update argo to submit a successful API request (update keys and values submitted)
* first pass at porting argo over to metadata + per-profile download (WIP)
* basic working argo script
* simplify parameter validation (ordered list no longer needed)
* add option to delete existing data before a new download
* continue cleaning up argo.py
* fix download_by_profile to properly store all downloaded data
* remove old get_argo.py script
* remove _filter_profiles function in favor of submitting data kwarg in the request
* start filling in docstrings
* clean up nearly duplicate functions
* add more docstrings
* get a few minimal argo tests working
* add BGC argo params; begin adding merge for second download runs
* some changes
* WIP test commit to see if we can push to GH
* WIP handling argo merge issue
* update profile-to-df to return the df and move merging to get_dataframe
* merge profiles with existing df
* clean up docstrings and code
* add test_argo.py
* add preliminary test case for adding to the Argo df
* remove sandbox files
* remove BGC argo test file
* update variables notebook from development
* simplify import statements
* quick fix for granules error
* draft subpage on available QUEST datasets
* small reference fix in text
* add reference to top of .rst file
* test argo df merge
* add functionality to the Quest class to pass search criteria to all datasets
* update dataset docstrings; reorder argo.py to match
* implement QUEST search + download for IS2
* move spatial and temporal properties from query to genquery
* add query docstring test for cycles, tracks to test file
* add quest test module
* standardize print outputs for QUEST search and download; IS2 download needs auth updates
* remove extra files from this branch
* comment out argo portions of QUEST for the PR
* remove argo-branch-only init file
* remove argo script from branch
* remove argo test file from branch
* comment out another line of argo stuff
* Update quest.py: added docstrings to functions within quest.py and edited the primary docstring for the QUEST class. Note: docstrings were not added to the implicit __self__ function.
* Update test_quest.py: added comments (not docstrings) to test functions
* Update dataset.py: minor edits to the docstrings
* Update quest.py: edited docstrings
* catch error with downloading datasets in Quest; template test case for multi-dataset query

---------

Co-authored-by: Kelsey Bisson <[email protected]>
Co-authored-by: Romina <[email protected]>
Co-authored-by: zachghiaccio <[email protected]>
Co-authored-by: Zach Fair <[email protected]>
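Two ideas recur through the Argo commits above: asserting that the spatial extent passed to `Argo.__init__` is a valid bounding box, and merging per-profile downloads into one DataFrame that holds the union of parameters across profiles. The sketch below illustrates both under assumed names (`validate_bbox`, `merge_profiles` are hypothetical helpers, not the actual icepyx implementation):

```python
import pandas as pd

def validate_bbox(spatial_extent):
    """Check that a spatial extent is a [lon_min, lat_min, lon_max, lat_max] bbox."""
    assert isinstance(spatial_extent, (list, tuple)), "spatial extent must be a list or tuple"
    assert len(spatial_extent) == 4, "bounding box needs exactly 4 values"
    lon_min, lat_min, lon_max, lat_max = spatial_extent
    assert -180 <= lon_min <= lon_max <= 180, "longitudes out of order or out of range"
    assert -90 <= lat_min <= lat_max <= 90, "latitudes out of order or out of range"
    return spatial_extent

def merge_profiles(existing_df, new_df):
    """Concatenate profile DataFrames, keeping the union of their columns.

    Starting from a blank DataFrame handles the first download; dropping
    exact duplicates keeps a second download run from doubling rows.
    """
    merged = pd.concat([existing_df, new_df], ignore_index=True, sort=False)
    return merged.drop_duplicates().reset_index(drop=True)
```

Columns present in only one profile (e.g. a BGC parameter) are filled with NaN for the other rows, which matches the "union of params from all profiles" behavior described in the changelog.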
* add OA API warning
* comment out tests that use OA API

---------

Co-authored-by: GitHub Action <[email protected]>
* add filelist and product properties to Read object
* deprecate filename_pattern and product class Read inputs
* transition to data_source input as a string (including glob string) or list
* update tutorial with changes and user guidance for using glob

---------

Co-authored-by: Jessica Scheick <[email protected]>
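The `data_source` change above means a Read object can take a single path, a glob pattern, or an explicit list of files. A minimal sketch of that input normalization, assuming a hypothetical helper name (`normalize_data_source` is illustrative, not the icepyx implementation):

```python
import glob

def normalize_data_source(data_source):
    """Accept a path, a glob string, or a list of paths; return a sorted file list."""
    if isinstance(data_source, str):
        # a glob string like "data/ATL06_*.h5" expands to the matching files;
        # a plain path with no matches is passed through as a one-element list
        matches = glob.glob(data_source)
        return sorted(matches) if matches else [data_source]
    if isinstance(data_source, (list, tuple)):
        return sorted(str(p) for p in data_source)
    raise TypeError("data_source must be a string (path or glob) or a list of paths")
```

Sorting keeps file order deterministic regardless of how the filesystem returns glob matches.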
* add kwarg acceptance for data queries and download_all in QUEST
* add QUEST dataset page to RTD

---------

Co-authored-by: zachghiaccio <[email protected]>
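Several commits describe the QUEST pattern: one object fans search criteria and download kwargs out to every registered dataset, and (per "catch error with downloading datasets in Quest") a failure in one dataset should not halt the others. A minimal sketch of that fan-out, with illustrative method names rather than the actual QUEST API:

```python
class Quest:
    """Sketch of passing search/download kwargs to all registered datasets.

    Any object with search_data() and download() methods can register,
    mirroring the dataset class template described in the changelog.
    """

    def __init__(self):
        self.datasets = {}

    def add_dataset(self, name, dataset):
        self.datasets[name] = dataset

    def search_all(self, **kwargs):
        # the same search criteria go to every registered dataset
        return {name: ds.search_data(**kwargs) for name, ds in self.datasets.items()}

    def download_all(self, **kwargs):
        results = {}
        for name, ds in self.datasets.items():
            try:
                results[name] = ds.download(**kwargs)
            except Exception as err:
                # one dataset failing should not stop the remaining downloads
                results[name] = f"download failed: {err}"
        return results
```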
Co-authored-by: allcontributors[bot] <46447321+allcontributors[bot]@users.noreply.github.com> Co-authored-by: Jessica Scheick <[email protected]>
Refactor Variables class to be user-facing functionality
* expand extract_product and extract_version to check for s3 url
* add cloud notes to variables notebook

---------

Co-authored-by: Jessica Scheick <[email protected]>
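The first item above extends product/version extraction to work on s3 URLs as well as local paths. A sketch of what that lookup can look like, assuming the standard ICESat-2 granule naming convention (`ATLxx_..._vvv_rr.h5`); the bucket name is made up and these functions are illustrative, not the icepyx implementation:

```python
import re

def extract_product(path):
    """Pull the ICESat-2 product short name (e.g. ATL06) from a local path or s3 URL."""
    match = re.search(r"(ATL\d{2})", path)
    if not match:
        raise ValueError(f"could not find an ATLxx product name in {path!r}")
    return match.group(1)

def extract_version(path):
    """Pull the 3-digit version from a granule filename, local or s3."""
    # granule names end in _<version>_<revision>.h5, e.g. _006_02.h5
    match = re.search(r"_(\d{3})_\d{2}\.h5$", path)
    if not match:
        raise ValueError(f"could not find a version in {path!r}")
    return match.group(1)
```

Because the regexes run on the whole string, the same code handles `s3://...` URLs and local paths without any scheme-specific branching.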
- add argo.py dataset functionality and implementation through QUEST
- demonstrate QUEST usage via example notebook
- add save to QUEST DataSet class template

Co-authored-by: Kelsey Bisson <[email protected]>
Co-authored-by: Romina <[email protected]>
Co-authored-by: zachghiaccio <[email protected]>
…e some formatting) (#479) Co-authored-by: GitHub Action <[email protected]>
Co-authored-by: Jessica Scheick <[email protected]>
- update pypi action to use OIDC trusted publisher mgmt
- generalize the flake8 action to a general linting action and add black
- put flake8 config parameters into a separate file (.flake8)
- update versions of actions/pre-commit hooks
- specify UML updates only need to run on PRs to development
- do not run UML updates on PRs into main (#449)
- update docs config files to be compliant
- temporarily ignore many flake8 error codes until legacy files are updated
…482) Co-authored-by: Rachel Wegener <[email protected]>