adcirc coupled with ww3 is not working #1

Open
uturuncoglu opened this issue Feb 12, 2024 · 21 comments
Labels
bug Something isn't working

Comments

@uturuncoglu
Collaborator

Initial mail from @pvelissariou1:

I uploaded the simulation data and configs on orion:
/work/noaa/nosofs/pvelissa/for_ufuk_checking

I compared the ADCIRC outputs in both folders and they look identical. For example,

nccmp -d florence_atm2adc_250m/run/fort.61.nc florence_atm2adc2ww3_250m/run/fort.61.nc

shows no difference at all. Plotting the results also shows no difference, and the same holds for the other ADCIRC output files.

No errors in the PET*_LogFile files. It seems that the fields are connected.
Please take a look at these.

In the meantime I'll see what happens with the SCHISM RTs (w/wo waves)
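
A minimal sketch of the comparison described above, assuming the two run directories on Orion and that nccmp is available; the fort.*.nc pattern is an assumption meant to cover the ADCIRC NetCDF outputs mentioned:

# Compare every shared ADCIRC NetCDF output between the two runs and report differences.
base=/work/noaa/nosofs/pvelissa/for_ufuk_checking
run_a=$base/florence_atm2adc_250m/run
run_b=$base/florence_atm2adc2ww3_250m/run
for f in "$run_a"/fort.*.nc; do
  name=$(basename "$f")
  [ -f "$run_b/$name" ] && { echo "== $name"; nccmp -d "$f" "$run_b/$name"; }
done
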
@uturuncoglu uturuncoglu added the bug Something isn't working label Feb 12, 2024
@uturuncoglu uturuncoglu self-assigned this Feb 12, 2024
@uturuncoglu
Collaborator Author

@pvelissariou1 @janahaddad

  • It seems that florence_atm2adc2ww3_250m is using the ww3_shel.inp configuration file for WW3. Can you convert it to ww3_shel.nml based on the example in coastal_ike_shinnecock_atm2sch2ww3? At this point, I am not sure you are using the coupled fields for wind and current on the WW3 side. In the NML version, you set 'C' for the coupled fields and, I think, 'T' for data fields.

  • If you also check florence_atm2adc2ww3_250m/run/PET000.ESMF_LogFile, you can see the following information:

20240126 045914.106 INFO             PET000 (med_fldList_Realize)(med.F90:RealizeFieldsWithTransferProvided):Fr_ocn Field = So_u is not connected.
20240126 045914.106 INFO             PET000 (med_fldList_Realize)(med.F90:RealizeFieldsWithTransferProvided):Fr_ocn Field = So_v is not connected.

This means ADCIRC does not send currents to WW3, so maybe there is a field dictionary issue here. It seems ADCIRC can get the radiation stress components from WW3, but we still need to check the import and export states.

If you don't mind, could you set dbug = true in the ufs.configure file under the OCN_attributes:: section and run the case again? It would also be great if you could add the following lines to your fd_ufs.yaml file. With this change you might be able to connect the u and v components of the currents to WW3.

     - standard_name: So_u
       alias: surface_eastward_sea_water_velocity
     - standard_name: So_v
       alias: surface_northward_sea_water_velocity

Anyway, let me know when you run the case with these changes. Then I can look at it again.
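
For reference, a quick check after the rerun (a sketch assuming the same run directory and PET log names as above) to see whether the aliases took effect:

# After rerunning with dbug = true and the fd_ufs.yaml additions, check the
# mediator messages for the current components.
cd florence_atm2adc2ww3_250m/run
grep -E "Field = So_[uv]" PET000.ESMF_LogFile
# If the "is not connected" lines for So_u and So_v are gone, the aliases were
# picked up and the currents should be exchanged with WW3.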

@pvelissariou1

@uturuncoglu WW3 gets the atm. fields. Take a look at the florence_atm2adc2ww3_250m/run/out_grd.ww3.nc file for the variables UAX and UAY. It doesn't matter whether we use ww3_shel.inp or ww3_shel.nml; according to the WW3 developers they are processed the same way without any issues. Anyway, I will try your suggestion. Let me check if I can get an output for radiation stresses from ADCIRC.
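
A quick way to confirm that (a sketch; ncdump comes with the standard NetCDF utilities, and the variable names are the ones given above):

# Check that the wind components reported above are present in WW3's gridded output.
ncdump -h florence_atm2adc2ww3_250m/run/out_grd.ww3.nc | grep -Ei "uax|uay"
# A similar grep for the current-related variables would show whether the
# coupled currents also reached the output.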

@uturuncoglu
Collaborator Author

@pvelissariou1 I am not sure how the coupling fields are defined in the inp format. I can see 'T' in the file, but maybe those need to be 'C'. Anyway, let's follow the convention that is already used by the SCHISM RT.

@pvelissariou1

@uturuncoglu
In ufs.configure I use:

# OCN #
OCN_model:                      adcirc
OCN_petlist_bounds:             0 239
OCN_omp_num_threads:            1
OCN_attributes::
  Verbosity = 0
  DumpFields = false
  ProfileMemory = false
  OverwriteSlice = true
  meshloc = element
  CouplingConfig = none
  dbug = true

In fd_ufs.yaml I have:

     #
     - standard_name: So_u
       alias: ocn_current_zonal
       canonical_units: m s-1
       description: ocean export
     - standard_name: So_u
       alias: surface_eastward_sea_water_velocity

     #
     - standard_name: So_v
       alias: ocn_current_merid
       canonical_units: m s-1
       description: ocean export
     - standard_name: So_v
       alias: surface_northward_sea_water_velocity

and in my ww3_shel.nml I have:

&domain_nml
  domain%start = '20180910 000000'
  domain%stop  = '20180918 000000'
/

&input_nml
  input%forcing%winds      = 'C'
  input%forcing%currents   = 'C'
  input%forcing%ice_conc   = 'F'
  input%forcing%ice_param1 = 'F'
  input%forcing%ice_param5 = 'F'
/

&output_type_nml
  type%field%list = 'WND HS FP DP WLV CUR DPT PHS PTP PDIR SXY'
/

&output_date_nml
  date%field%outffile  = '1'
  date%field%start = '20180910 000000'
  date%field%stop  = '20180918 000000'
  date%field%stride    = '3600'
  date%restart2%stride = '43200'
/

... and the simulation crashes. When I switch back to ww3_shel.inp, the simulation proceeds. Most likely the *.nml file is incomplete; I'll report back.

@uturuncoglu
Collaborator Author

@pvelissariou1 Please use this configuration and do not switch back. Just share the run directory that belongs to this run; I'd like to see the point where it crashes. Crashes are always our friend :)
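
For reference, a minimal sketch for locating the failure point once the crashed run directory is shared (PET log names as in this thread; the run path and the WW3 log name are placeholders/assumptions):

cd /path/to/crashed/run                    # placeholder for the shared run directory
grep -li error PET*.ESMF_LogFile | head    # which PET logs contain errors
grep -in -m1 error PET000.ESMF_LogFile     # first error reported on the root PET
tail -n 20 log.ww3                         # assumption: a standalone-style WW3 log, if one exists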

@pvelissariou1

@uturuncoglu Here is the ww3_shel.nml.log; the dates/times after the field dates are all wrong. Maybe this is the issue.


DOMAIN %  IOSTYP =        1
DOMAIN %  START  = 20180910 000000
DOMAIN %  STOP   = 20180918 000000

INPUT GRID % :  FORCING % WATER_LEVELS   = F            
INPUT GRID % :  FORCING % CURRENTS       = C            
INPUT GRID % :  FORCING % WINDS          = C            
INPUT GRID % :  FORCING % ATM_MOMENTUM   = F            
INPUT GRID % :  FORCING % AIR_DENSITY    = F            
INPUT GRID % :  FORCING % ICE_CONC       = F            
INPUT GRID % :  FORCING % ICE_PARAM1     = F            
INPUT GRID % :  FORCING % ICE_PARAM2     = F            
INPUT GRID % :  FORCING % ICE_PARAM3     = F            
INPUT GRID % :  FORCING % ICE_PARAM4     = F            
INPUT GRID % :  FORCING % ICE_PARAM5     = F            
INPUT GRID % :  FORCING % MUD_DENSITY    = F            
INPUT GRID % :  FORCING % MUD_THICKNESS  = F            
INPUT GRID % :  FORCING % MUD_VISCOSITY  = F            
INPUT GRID % :  ASSIM % MEAN             = F            
INPUT GRID % :  ASSIM % SPEC1D           = F            
INPUT GRID % :  ASSIM % SPEC2D           = F            

OUTPUT TYPE %  FIELD % LIST         = WND HS FP DP WLV CUR DPT PHS PTP PDIR SXY
OUTPUT TYPE %  POINT % FILE         = points.list
OUTPUT TYPE %  TRACK % FORMAT       = T
OUTPUT TYPE %  PARTITION % X0       =        0
OUTPUT TYPE %  PARTITION % XN       =        0
OUTPUT TYPE %  PARTITION % NX       =        0
OUTPUT TYPE %  PARTITION % Y0       =        0
OUTPUT TYPE %  PARTITION % YN       =        0
OUTPUT TYPE %  PARTITION % NY       =        0
OUTPUT TYPE %  PARTITION % FORMAT   = T
OUTPUT TYPE %  RESTART % EXTRA      = unset

OUTPUT DATE MODEL GRID %  FIELD % START        = 20180910 000000
OUTPUT DATE MODEL GRID %  FIELD % STRIDE       = 3600
OUTPUT DATE MODEL GRID %  FIELD % STOP         = 20180918 000000
OUTPUT DATE MODEL GRID %  POINT % START        = 19680606 000000
OUTPUT DATE MODEL GRID %  POINT % STRIDE       = 0
OUTPUT DATE MODEL GRID %  POINT % STOP         = 19680607 000000
OUTPUT DATE MODEL GRID %  TRACK % START        = 19680606 000000
OUTPUT DATE MODEL GRID %  TRACK % STRIDE       = 0
OUTPUT DATE MODEL GRID %  TRACK % STOP         = 19680607 000000
OUTPUT DATE MODEL GRID %  RESTART % START      = 19680606 000000
OUTPUT DATE MODEL GRID %  RESTART % STRIDE     = 0
OUTPUT DATE MODEL GRID %  RESTART % STOP       = 19680607 000000
OUTPUT DATE MODEL GRID %  RESTART2 % START      = 19680606 000000
OUTPUT DATE MODEL GRID %  RESTART2 % STRIDE     = 43200
OUTPUT DATE MODEL GRID %  RESTART2 % STOP       = 19680607 000000
OUTPUT DATE MODEL GRID %  BOUNDARY % START     = 19680606 000000
OUTPUT DATE MODEL GRID %  BOUNDARY % STRIDE    = 0
OUTPUT DATE MODEL GRID %  BOUNDARY % STOP      = 19680607 000000
OUTPUT DATE MODEL GRID %  PARTITION % START    = 19680606 000000
OUTPUT DATE MODEL GRID %  PARTITION % STRIDE   = 0
OUTPUT DATE MODEL GRID %  PARTITION % STOP     = 19680607 000000

HOMOG_COUNT %  N_IC1       =        0
HOMOG_COUNT %  N_IC2       =        0
HOMOG_COUNT %  N_IC3       =        0
HOMOG_COUNT %  N_IC4       =        0
HOMOG_COUNT %  N_IC5       =        0
HOMOG_COUNT %  N_MDN       =        0
HOMOG_COUNT %  N_MTH       =        0
HOMOG_COUNT %  N_MVS       =        0
HOMOG_COUNT %  N_LEV       =        0
HOMOG_COUNT %  N_CUR       =        0
HOMOG_COUNT %  N_WND       =        0
HOMOG_COUNT %  N_ICE       =        0
HOMOG_COUNT %  N_TAU       =        0
HOMOG_COUNT %  N_RHO       =        0
HOMOG_COUNT %  N_MOV       =        0

I will continue the current run. Will submit again the one that crashes.
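
For what it is worth, the 19680606/19680607 entries above are most likely WW3's built-in placeholder dates for output streams that are not set in the namelist, rather than corrupted values; a quick way to list the streams that fell back to them (log name as above):

# Streams still carrying the placeholder 1968 dates were not configured in ww3_shel.nml.
grep -E "196806(06|07)" ww3_shel.nml.log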

@uturuncoglu
Collaborator Author

@pvelissariou1 Could be. Please double-check the configuration. Once it crashes, I can take a look. Maybe I could copy your run directory and examine it more carefully.

@pvelissariou1

@uturuncoglu I am working on it, I'll report back my findings.

@pvelissariou1

@uturuncoglu
When I modify the fd_ufs.yaml per your suggestion:

     #
     - standard_name: So_u
       alias: ocn_current_zonal
       canonical_units: m s-1
       description: ocean export
     - standard_name: So_u
       alias: surface_eastward_sea_water_velocity

     #
     - standard_name: So_v
       alias: ocn_current_merid
       canonical_units: m s-1
       description: ocean export
     - standard_name: So_v
       alias: surface_northward_sea_water_velocity

The simulation crashes due to a WW3 crash. I don't understand exactly what the So_u and So_v variables do, since we define "surface_eastward_sea_water_velocity" and "surface_northward_sea_water_velocity", and this is what WW3 accepts.

@pvelissariou1

@uturuncoglu In my ww3_shel.inp file I use:

   C F     Water levels
   C F     Currents
   T F     Winds
   F F     Ice concentrations
   F F     Atmospheric momentum
   F F     Air density
   F       Assimilation data : Mean parameters
   F       Assimilation data : 1-D spectra
   F       Assimilation data : 2-D spectra

but I get the following "not connected" messages in the *.ESMF_LogFile files:

20240213 052442.208 INFO             PET163 adc_cap.F90:878 (adc_cap:ADCIRC_RealizeFields)Adcirc export Field sea_surface_height_above_sea_level                               is not connected.
20240213 052442.208 INFO             PET163 adc_cap.F90:878 (adc_cap:ADCIRC_RealizeFields)Adcirc export Field surface_eastward_sea_water_velocity                              is not connected.
20240213 052442.208 INFO             PET163 adc_cap.F90:878 (adc_cap:ADCIRC_RealizeFields)Adcirc export Field surface_northward_sea_water_velocity                             is not connected.

@uturuncoglu
Collaborator Author

@pvelissariou1 If you could copy the run directory to Orion, I could try to run and fix it on my side. We are adding those aliases to the field dictionary because the field names do not match, so the fields are not connected by the mediator. Anyway, I can try to debug once I have the run directory. Is this DATM+ADCIRC+WW3? If you could send me your compile command along with the run directory, that would be great.

@pvelissariou1

@uturuncoglu I am building all of PAHM, ADCIRC, and WW3 to account for all the different configurations. The command I am using to compile on Hera in the test directory is:
compile.sh hera "-DAPP=CSTLPAW -DADCIRC_CONFIG=PADCIRC -DCOUPLED=ON -DBUILD_ADCPREP=ON -DPDLIB=ON -DBUILD_TOOLS=ON" coastalPAW intel YES NO
I'll put the folder with the simulation that crashes on Orion shortly.
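
For completeness, a sketch of reproducing that build from the tests directory; the repository URL and the recursive checkout are assumptions, so adjust them to the actual ufs-coastal fork and branch in use:

# Hypothetical checkout; the compile command itself is the one quoted above.
git clone --recursive https://github.com/oceanmodeling/ufs-coastal.git
cd ufs-coastal/tests
./compile.sh hera "-DAPP=CSTLPAW -DADCIRC_CONFIG=PADCIRC -DCOUPLED=ON -DBUILD_ADCPREP=ON -DPDLIB=ON -DBUILD_TOOLS=ON" coastalPAW intel YES NO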

@pvelissariou1

@uturuncoglu The run folder is at:
/work/noaa/nosofs/pvelissa/for_ufuk_checking/florence_atm2adc2ww3_250m/run.new.inp_fd_mod

@uturuncoglu
Collaborator Author

@pvelissariou1 Did you make any changes in the ADCIRC build? Mine is not building, since ADCIRC and WW3 have the same module name for constants, so I am getting an error like the following:

ADCIRC-interface/libadcirc.a(constants.F.o): In function `constants._':
constants.F:(.text+0x0): multiple definition of `constants._'
lib/libww3.a(constants.F90.o):/work/noaa/nems/tufuk/COASTAL/ufs-coastal/WW3/model/src/constants.F90:20: first defined here

I could fix it, but let me know if you have any changes on your end. BTW, I could also sync ADCIRC along with this, but I am not sure whether we need to sync it.
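
For reference, a quick way to confirm the duplicate module symbols in both archives (a sketch; the archive paths follow the link error above and may differ in your build tree):

# Both libraries defining the same Fortran "constants" module symbols would
# explain the multiple-definition link error.
nm ADCIRC-interface/libadcirc.a | grep -i constants
nm lib/libww3.a | grep -i constants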

@pvelissariou1

@uturuncoglu If I remember correctly, I had created a PR to ADCIRC and they have merged the changes.

@pvelissariou1

I created a PR (pending review/approval by Zach) for the NUOPC (thirdparty/nuopc) part only. I will submit another PR for the "adcprep" part. After that, and upon successful RT results, the latest ADCIRC is to be merged into UFS-Coastal.

@janahaddad
Collaborator

PR #2 pending review

@pvelissariou1

@ufuk and @Jana, I have already created a PR with these changes to oceanmodeling/adcirc as well. Ufuk, please review and merge these into the adcirc fork. Tested, and it compiles without issues (that is, ADCIRC+WW3).

@pvelissariou1

@jan, this is the PR. On the upstream ADCIRC side, Zach is waiting for Dam's response on some PR details before merging. When the PR is merged into the ADCIRC repo, we can sync the fork, or we can instead point UFS-Coastal to the ADCIRC repo.

@janahaddad
Collaborator

@pvelissariou1 Can you also link the PR for the NUOPC part, as I did for your oceanmodeling/adcirc PR? On the page for this issue there is a section called "Development" where you can add that PR from the adcirc/adcirc repo ... I can't do it since I don't have permissions on the adcirc/adcirc repo.

If you link your PRs to this issue, then when the PRs are approved and merged, this issue will automatically close. If issues remain after that, you can create a new, more specific issue in this repo.

@pvelissariou1

@janahaddad This is one PR altogether; I just did it in two stages.
