
Add geostrophic convergence test (Williamson Test 2) #120

Merged
xylar merged 6 commits into E3SM-Project:main from add-geostrophic-tests on Oct 16, 2023

Conversation

@xylar (Collaborator) commented Sep 23, 2023

This merge adds the geostrophic convergence test case from Williamson et al. 1992. The test case starts out in geostrophic balance and ideally remains in steady state.

The test case has similarities to cosine_bell so efforts have been made to share code, primarily the forward step, between the two.
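
For orientation (this summary and the formulas are not quoted from the PR; they follow the standard form of the test), with the flow aligned with lines of latitude (α = 0), the steady solution of Williamson et al. (1992) Test 2 is

$$
u = u_0 \cos\theta, \qquad v = 0, \qquad
g h = g h_0 - \left(a\,\Omega\,u_0 + \frac{u_0^2}{2}\right)\sin^2\theta,
$$

where $\theta$ is latitude, $a$ is the sphere radius, $\Omega$ is the rotation rate and $u_0 = 2\pi a/(12\,\mathrm{days})$. Because this state is in exact balance, any departure from it during the simulation measures discretization error, which is what the convergence analysis below quantifies.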

Checklist

  • User's Guide has been updated
  • Developer's Guide has been updated
  • API documentation in the Developer's Guide (api.md) has any new or modified class, method and/or functions listed
  • Documentation has been built locally and changes look as expected
  • Testing comment in the PR documents testing used to verify the changes
  • New tests have been added to a test suite

xylar added the enhancement (New feature or request), in progress (This PR is not ready for review or merging) and ocean (Related to ocean tests or analysis) labels on Sep 23, 2023
xylar self-assigned this on Sep 23, 2023
@xylar (Collaborator Author) commented Sep 23, 2023

This is based off of #119 and will be rebased after that gets merged.

@xylar (Collaborator Author) commented Sep 23, 2023

Thanks to @lconlon for starting out this test case with me in the Polaris Hackathon in May!

xylar force-pushed the add-geostrophic-tests branch 12 times, most recently from aac2bcf to 256f6df on September 24, 2023 12:46
@xylar (Collaborator Author) commented Sep 24, 2023

Testing

I have run all 4 geostrophic test cases on Chrysalis with Intel and OpenMPI. They ran successfully and produced the expected plots.

The order of convergence for water-column thickness on the QU grids is really terrible -- 0.4. So that's discouraging for MPAS-Ocean but not an indication of a problem with the tests. The Icos meshes do much better -- about 1.6.

QU: [convergence plots: convergence_h, convergence_vel]

Icos: [convergence plots: convergence_h, convergence_vel]
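
For context, the convergence orders quoted above are typically estimated as the slope of a log-log fit of error against resolution. A minimal sketch (illustrative numbers, not the actual errors from this run):

```python
import numpy as np

# Illustrative nominal resolutions (km) and L2 errors -- not the values
# measured in this PR.
resolutions = np.array([480.0, 240.0, 120.0, 60.0])
errors = np.array([2.0e-1, 1.2e-1, 7.0e-2, 4.2e-2])

# Fit log10(error) = p * log10(resolution) + c; the slope p is the
# estimated order of convergence.
p, c = np.polyfit(np.log10(resolutions), np.log10(errors), deg=1)
print(f'estimated order of convergence: {p:.2f}')
```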

@xylar (Collaborator Author) commented Sep 24, 2023

Here are some plots of water-column thickness, u, v and normal velocity (for the icos 240km mesh):

[plots: final_h, final_u, final_v, final_norm_vel]

@xylar (Collaborator Author) commented Sep 24, 2023

Here are diff plots of water-column thickness for the QU and Icos meshes with increasing resolution.

QU: [diff_h plots at 7 increasing resolutions]

Icos: [diff_h plots at 4 increasing resolutions]

xylar force-pushed the add-geostrophic-tests branch 4 times, most recently from 2c32289 to 17bd0fc on October 3, 2023 13:30
xylar requested review from cbegeman and sbrus89 on October 3, 2023 13:31
xylar removed the in progress (This PR is not ready for review or merging) label on Oct 3, 2023
xylar marked this pull request as ready for review on October 3, 2023 13:31
@xylar (Collaborator Author) commented Oct 3, 2023

I need to update this to use #126, so it will likely make sense to wait on reviewing this until after that PR has gone in.

@xylar (Collaborator Author) commented Oct 11, 2023

@sbrus89, sorry for the confusion on my part yesterday. I was too tired, I guess, and missed that this was with a test merge. I have rebased and fixed things up. Retesting now, but I think you can give it another try when you have time.

xylar force-pushed the add-geostrophic-tests branch from 5e258d8 to 8088f7f on October 11, 2023 12:45
xylar force-pushed the add-geostrophic-tests branch from 8088f7f to 80b036b on October 11, 2023 12:46
@sbrus89 (Contributor) commented Oct 11, 2023

@xylar, no problem. Yes I was testing with a local merge with main. I'll give it another go today. Thanks for rebasing!

@xylar (Collaborator Author) commented Oct 11, 2023

I successfully tested the convergence suite with the geostrophic tests added:

Task Runtimes:
0:01:01 PASS ocean/planar/inertial_gravity_wave
0:00:55 FAIL ocean/planar/manufactured_solution
0:05:27 PASS ocean/spherical/icos/cosine_bell
0:09:15 PASS ocean/spherical/qu/cosine_bell
0:01:35 PASS ocean/spherical/icos/geostrophic
0:03:02 PASS ocean/spherical/qu/geostrophic
Total runtime: 0:21:16

I was having some performance problems with the geostrophic/with_viz tests that seem random but also persistent. I don't know if it's a Chrysalis problem, an issue with ESMF or something else. But I also hope we can replace this viz soon with something better, so I'm not inclined to investigate too much.

@sbrus89 (Contributor) commented Oct 12, 2023

@xylar, I'm seeing this error:

ocean/spherical/icos/geostrophic
Traceback (most recent call last):
  File "/global/cfs/cdirs/e3sm/sbrus/mambaforge/envs/dev_polaris_0.2.0/bin/polaris", line 33, in <module>
    sys.exit(load_entry_point('polaris', 'console_scripts', 'polaris')())
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/global/cfs/cdirs/e3sm/sbrus/polaris_worktrees/williamson/polaris/__main__.py", line 62, in main
    commands[args.command]()
  File "/global/cfs/cdirs/e3sm/sbrus/polaris_worktrees/williamson/polaris/run/serial.py", line 196, in main
    run_tasks(suite_name='task', quiet=args.quiet, is_task=True,
  File "/global/cfs/cdirs/e3sm/sbrus/polaris_worktrees/williamson/polaris/run/serial.py", line 103, in run_tasks
    result_str, success, task_time = _log_and_run_task(
                                     ^^^^^^^^^^^^^^^^^^
  File "/global/cfs/cdirs/e3sm/sbrus/polaris_worktrees/williamson/polaris/run/serial.py", line 316, in _log_and_run_task
    task.steps_to_run = _update_steps_to_run(
                        ^^^^^^^^^^^^^^^^^^^^^
  File "/global/cfs/cdirs/e3sm/sbrus/polaris_worktrees/williamson/polaris/run/serial.py", line 222, in _update_steps_to_run
    step_str = config.get(task_name, 'steps_to_run').replace(',', ' ')
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/global/cfs/cdirs/e3sm/sbrus/mambaforge/envs/dev_polaris_0.2.0/lib/python3.11/site-packages/mpas_tools/config.py", line 115, in get 
    return self.combined.get(section, option)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/global/cfs/cdirs/e3sm/sbrus/mambaforge/envs/dev_polaris_0.2.0/lib/python3.11/configparser.py", line 797, in get 
    d = self._unify_values(section, vars)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/global/cfs/cdirs/e3sm/sbrus/mambaforge/envs/dev_polaris_0.2.0/lib/python3.11/configparser.py", line 1168, in _unify_values
    raise NoSectionError(section) from None
configparser.NoSectionError: No section: 'icos_geostrophic'

I looked in the geostrophic.cfg file and only see an icos_geostrophic_with_viz section. I have a feeling this is because I set up the cases in this order: ocean/spherical/icos/geostrophic, ocean/spherical/icos/geostrophic/with_viz before running ocean/spherical/icos/geostrophic. Is this expected behavior?

@xylar (Collaborator Author) commented Oct 12, 2023

@sbrus89, that does sound like a bug. Can you send me the commands you used to set up the 2 tasks? I want to make sure I can reproduce it. I have always been setting up both tasks together.

@sbrus89 (Contributor) commented Oct 12, 2023

@xylar - I did the following:

polaris setup -t ocean/spherical/icos/geostrophic -w $WORKROOT/wiliamson
polaris setup -t ocean/spherical/icos/geostrophic/with_viz -w $WORKROOT/wiliamson 
polaris setup -t ocean/spherical/qu/geostrophic -w $WORKROOT/wiliamson
polaris setup -t ocean/spherical/qu/geostrophic/with_viz -w $WORKROOT/wiliamson

and then

cd $WORKROOT/wiliamson/ocean/spherical/icos/geostrophic
sbatch job_script.sh

@xylar (Collaborator Author) commented Oct 12, 2023

@sbrus89, great, thanks! I'm looking into this now.

Just as a tip to make your workflow more efficient, you may want to do:

polaris list

and then:

polaris setup -n 17 18 19 20 -w $WORKROOT/wiliamson

This will set up a custom suite with all 4 tasks. But it doesn't seem to expose the problem you showed so I'm glad you did things the "more tedious" way.

@xylar (Collaborator Author) commented Oct 12, 2023

@sbrus89, I understand now what is going wrong, and I don't have a good idea what to do about it yet. It has nothing to do with this PR; it was caused by #125. When you separately set up 2 tasks that share the same config file, the first task will create a section, option and value with its steps_to_run. Then, the second task will overwrite the config file with only its own section, option and value for steps_to_run. If you were to set them up together, both sections, options and values would be populated as expected:

[icos_geostrophic]

# A list of steps to include when running the icos_geostrophic task
# source: /gpfs/fs1/home/ac.xylar/e3sm_work/polaris/add-geostrophic-tests/polaris/setup.py
steps_to_run = icos_base_mesh_60km icos_init_60km icos_forward_60km icos_base_mesh_120km icos_init_120km icos_forward_120km icos_base_mesh_240km icos_init_240km icos_forward_240km icos_base_mesh_480km icos_init_480km icos_forward_480km analysis


[icos_geostrophic_with_viz]

# A list of steps to include when running the icos_geostrophic_with_viz task
# source: /gpfs/fs1/home/ac.xylar/e3sm_work/polaris/add-geostrophic-tests/polaris/setup.py
steps_to_run = icos_base_mesh_60km icos_init_60km icos_forward_60km icos_map_cell_60km icos_map_edge_60km icos_viz_60km icos_base_mesh_120km icos_init_120km icos_forward_120km icos_map_cell_120km icos_map_edge_120km icos_viz_120km icos_base_mesh_240km icos_init_240km icos_forward_240km icos_map_cell_240km icos_map_edge_240km icos_viz_240km icos_base_mesh_480km icos_init_480km icos_forward_480km icos_map_cell_480km icos_map_edge_480km icos_viz_480km analysis

Presumably, the solution will be to include steps_to_run from all tasks when writing out a shared config file. Since this is unrelated to geostrophic, I'll address it in a separate PR.
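
A minimal sketch of that idea (hypothetical helper, not the actual polaris implementation): when writing a config file shared by several tasks, write a steps_to_run option for every task that shares the file, not just the task currently being set up.

```python
from configparser import ConfigParser


def write_shared_config(filename, tasks):
    """Write one shared config file with a [task] section per task.

    tasks: dict mapping each task name to its list of step names
    """
    config = ConfigParser()
    for task_name, steps in tasks.items():
        config[task_name] = {'steps_to_run': ' '.join(steps)}
    with open(filename, 'w') as handle:
        config.write(handle)


# hypothetical usage with abbreviated step lists
write_shared_config('geostrophic.cfg', {
    'icos_geostrophic':
        ['icos_base_mesh_60km', 'icos_init_60km', 'icos_forward_60km',
         'analysis'],
    'icos_geostrophic_with_viz':
        ['icos_base_mesh_60km', 'icos_init_60km', 'icos_forward_60km',
         'icos_viz_60km', 'analysis'],
})
```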

@xylar (Collaborator Author) commented Oct 12, 2023

@sbrus89, would you like to review #134 and for me to merge that and rebase? Or can you review this without that fix?

@sbrus89 (Contributor) commented Oct 12, 2023

@xylar, I figured it wasn't related to this PR. I'll just review it without the fix. I'll try it by creating a custom suite as you suggested.

Just a thought, it would be nice to be able to create custom suites with the test case names as opposed to numbers since those change as new tests are added.

@xylar (Collaborator Author) commented Oct 13, 2023

@sbrus89, that's a good idea and easy to implement. I'll make a PR shortly.

@sbrus89 (Contributor) commented Oct 16, 2023

@xylar, do you have any idea why I'd be seeing this:

Exception raised while running the steps of the task
Traceback (most recent call last):
  File "/global/cfs/cdirs/e3sm/sbrus/polaris_worktrees/williamson/polaris/run/serial.py", line 326, in _log_and_run_task
    baselines_passed = _run_task(task, available_resources)
                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/global/cfs/cdirs/e3sm/sbrus/polaris_worktrees/williamson/polaris/run/serial.py", line 407, in _run_task
    _run_step(task, step, task.new_step_log_file,
  File "/global/cfs/cdirs/e3sm/sbrus/polaris_worktrees/williamson/polaris/run/serial.py", line 492, in _run_step
    step.runtime_setup()
  File "/global/cfs/cdirs/e3sm/sbrus/polaris_worktrees/williamson/polaris/model_step.py", line 373, in runtime_setup
    self._process_streams(quiet=quiet, remove_unrequested=False)
  File "/global/cfs/cdirs/e3sm/sbrus/polaris_worktrees/williamson/polaris/model_step.py", line 610, in _process_streams
    new_tree = yaml_to_mpas_streams(processed_registry_filename,
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/global/cfs/cdirs/e3sm/sbrus/polaris_worktrees/williamson/polaris/yaml.py", line 224, in yaml_to_mpas_streams
    with open(processed_registry_filename, 'r') as reg_file:
         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
FileNotFoundError: [Errno 2] No such file or directory: '/global/cfs/cdirs/e3sm/sbrus/polaris_worktrees/williamson/global/cfs/cdirs/e3sm/sbrus/polaris_worktrees/williamson/e3sm_submodules/E3SM-Project/components/mpas-ocean/src/Registry_processed.xml'

It seems like something is going on with creating the path to the Registry. Part of the path to the worktree is being prepended to the full correct path: /global/cfs/cdirs/e3sm/sbrus/polaris_worktrees/williamson/global/cfs/cdirs/e3sm/sbrus/polaris_worktrees/williamson/e3sm_submodules/E3SM-Project/components/mpas-ocean/src/Registry_processed.xml

@xylar (Collaborator Author) commented Oct 16, 2023

@sbrus89, hmm, that looks like a bug where a relative path didn't get turned into an absolute path as it should have before it got stored in the pickle file. I have never seen that before, so I would need to know exactly what your polaris setup or polaris suite command was that led to this. Maybe it was as simple as not supplying -p, which I basically always do?
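
If that hypothesis is right, a minimal sketch of the kind of fix involved (hypothetical names, not the actual polaris code) is to normalize any user-supplied path to an absolute path before it is stored for later use:

```python
import os
import pickle


def store_model_path(mpas_model_path, pickle_filename):
    # A relative path (e.g. when -p is omitted) must be made absolute
    # here; otherwise it can later be re-joined with another base
    # directory, producing a doubled path like the one in the
    # traceback above.
    abs_path = os.path.abspath(mpas_model_path)
    with open(pickle_filename, 'wb') as handle:
        pickle.dump({'mpas_model_path': abs_path}, handle)


# hypothetical usage
store_model_path('e3sm_submodules/E3SM-Project/components/mpas-ocean',
                 'task_state.pickle')
```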

@sbrus89 (Contributor) commented Oct 16, 2023

Sounds like it could be from not supplying -p. I don't typically do this unless I want an ocean_model outside the repo.

@xylar (Collaborator Author) commented Oct 16, 2023

I nearly always use -p ../main/e3sm_submodules/E3SM-Project/components/mpas-ocean so I don't have to rebuild MPAS-Ocean in each worktree I use.

@xylar (Collaborator Author) commented Oct 16, 2023

@sbrus89, I'm not able to reproduce this with main even without the -p flag. I doubt I introduced this issue in this PR. Could you give me the exact polaris setup or polaris suite command?

@xylar (Collaborator Author) commented Oct 16, 2023

Oh, I take that back! I see it only in some tests and not others:

0:00:08 PASS ocean/planar/baroclinic_channel/10km/threads
0:00:03 PASS ocean/planar/baroclinic_channel/10km/decomp
0:00:00 FAIL ocean/planar/baroclinic_channel/10km/restart
0:00:01 FAIL ocean/planar/inertial_gravity_wave
0:00:02 PASS ocean/single_column/cvmix
0:00:02 PASS ocean/single_column/ideal_age

Again, not related to this PR but an important bug to fix.

@xylar (Collaborator Author) commented Oct 16, 2023

@sbrus89, I see what changed and why. It's complicated to explain (to do with fancy config parsing and automatically converting to absolute paths) but probably easy to fix. Thanks for bringing this to my attention. Can you proceed with using the -p flag for now?

@sbrus89 (Contributor) commented Oct 16, 2023

@xylar - yes, will do. Sorry for the headache on this!

@sbrus89 (Contributor) left a review comment

This looks great to me. I successfully ran the new geostrophic cases on Perlmutter. Thanks for your work on this and for addressing these mostly unrelated issues @xylar!

@xylar (Collaborator Author) commented Oct 16, 2023

Thanks so much, @sbrus89! And I really do appreciate you finding those issues, whether they are related or not.

xylar merged commit 278aea0 into E3SM-Project:main on Oct 16, 2023
5 checks passed
xylar deleted the add-geostrophic-tests branch on October 16, 2023 19:17
@xylar (Collaborator Author) commented Oct 16, 2023

@cbegeman, thank you very much for getting this test case started with excellent documentation and for thorough review! @lconlon, thank you for helping me get this test case started during the hackathon!
