From a71f056acfd0b0e3ded09bcbcb0de365b5a9ec92 Mon Sep 17 00:00:00 2001 From: "github-actions[bot]" <41898282+github-actions[bot]@users.noreply.github.com> Date: Fri, 11 Oct 2024 15:52:40 -0600 Subject: [PATCH] Update develop-ref after dtcenter/MET#2988 (#2992) MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit * 2673 Moved dvariable declaration after include * #2673 Move down namespace below include * Feature #2395 wdir (#2820) * Per #2395, add new columns to VL1L2, VAL1L2, and VCNT line types for wind direction statistics. Work still in progress. * Per #2395, write the new VCNT columns to the output and document the additions to the VL1L2, VAL1L2, and VCNT columns. * Per #2395, add the definition of new statistics to Appendix G. * Per #2395, update file version history. * Per #2395, tweak warning message about zero wind vectors and update grid-stat and point-stat to log calls to the do_vl1l2() function. * Per #2395, refine the weights for wind direction stats, ignoring the undefined directions. * Update src/tools/core/stat_analysis/aggr_stat_line.cc * Update src/tools/core/stat_analysis/parse_stat_line.cc * Update src/tools/core/stat_analysis/aggr_stat_line.cc * Recent changes to branch protection rules for the develop branch have broken the logic of the update_truth.yml GHA workflow. Instead of submitting a PR to merge develop into develop-ref directly, use an intermediate update_truth_for_develop branch. * Feature #2280 ens_prob (#2823) * Per #2280, update to support probability threshold strings like ==8, where 8 is the number of ensemble members, to create probability bins centered on the n/8 for n = 0 ... 8. * Per #2280, update docs about probability threshold settings. * Per #2280, use a loose tolerance when checking for consistent bin widths. * Per #2280, add a new unit test for grid_stat to demonstrate processing the output from gen_ens_prod. * Per #2280, when verifying NMEP probability forecasts, smooth the obs data first. * Per #2280, only request STAT output for the PCT line type to match unit_grid_stat.xml and minimize the new output files. * Per #2280, update config option docs. * Per #2280, update config option docs. * #2673 Change 0 to nullptr * #2673 Change 0 to nullptr * #2673 Change 0 to nullptr * #2673 Change 0 to nullptr * #2673 Change 0 to nullptr * #2673 Removed the redundant parentheses with return * #2673 Removed the redundant parentheses with return * #2673 Removed the redundant parentheses with return * #2673 Removed the redundant parentheses with return * #2673 Removed the redundant parentheses with return * #2673 restored return statement * #2673 Added std namespace * #2673 Moved down 'using namespace' statement. Removed trailing spaces * #2673 Moved down 'using namespace' statement. * #2673 Moved down 'using namespace' statement. * #2673 Moved down 'using namespace' statement. * #2673 Moved down 'using namespace' statement. * #2673 Added std namespace * #2673 Added std namespace * #2673 Added std namespace * #2673 Changed literal 1 to boolean value, true * Feature #2673 enum_to_string (#2835) * Feature #2583 ecnt (#2825) * Unrelated to #2583, fix typo in code comments. * Per #2583, add hooks write 3 new ECNT columns for observation error data. * Per #2583, make error messages about mis-matched array lengths more informative. * Per #2583, switch to more concise variable naming conventions of ign_oerr_cnv, ign_oerr_cor, and dawid_seb. 
* Per #2583, fix typo to enable compilation * Per #2583, define the 5 new ECNT column names. * Per #2583, add 5 new columns to the ECNT table in the Ensemble-Stat chapter * Per #2583, update stat_columns.cc to write these 5 new ECNT columns * Per #2583, update ECNTInfo class to compute the 5 new ECNT statistics. * Per #2583, update stat-analysis to parse the 5 new ECNT columns. * Per #2583, update aggregate_stat logic for 5 new ECNT columns. * Per #2583, update PairDataEnsemble logic for 5 new ECNT columns * Per #2583, update vx_statistics library with obs_error handling logic for the 5 new ECNT columns * Per #2583, changes to make it compile * Per #2583, changes to make it compile * Per #2583, switch to a consistent ECNT column naming convention with OERR at the end. Using IGN_CONV_OERR and IGN_CORR_OERR. * Per #2583, define ObsErrorEntry::variance() with a call to the dist_var() utility function. * Per #2583, update PairDataEnsemble::compute_pair_vals() to compute the 5 new stats with the correct inputs. * Per #2583, add DEBUG(10) log messages about computing these new stats. * Per #2583, update Stat-Analysis to compute these 5 new stats from the ORANK line type. * Per #2583, whitespace and comments. * Per #2583, update the User's Guide. * Per #2583, remove the DS_ADD_OERR and DS_MULT_OERR ECNT columns and rename DS_OERR as DSS, since observation error is not actually involved in its computation. * Per #2583, minor update to Appendix C * Per #2583, rename ECNT line type statistic DSS to IDSS. * Per #2583, fix a couple of typos * Per #2583, more error checking. * Per #2583, remove the ECNT IDSS column since its just 2*pi*IGN, the existing ignorance score, and only provides meaningful information when combined with the other Dawid-Sebastiani statistics that have already been removed. * Per #2583, add Eric's documentation of these new stats to Appendix C. Along the way, update the DOI links in the references based on this APA style guide: https://apastyle.apa.org/style-grammar-guidelines/references/dois-urls#:~:text=Include%20a%20DOI%20for%20all,URL%2C%20include%20only%20the%20DOI. * Per #2583, fix new equations with embedded underscores for PDF by defining both html and pdf formatting options. * Per #2583, update the ign_conv_oerr equation to include a 2 *pi multiplier for consistency with the existing ignorance score. Also, fix the documented equations. * Per #2583, remove log file that was inadvertently added on this branch. * Per #2583, simplify ObsErrorEntry::variance() implementation. For the distribution type of NONE, return a variance of 0.0 rather than bad data, as discussed with @michelleharrold and @JeffBeck-NOAA on 3/8/2024. --------- Co-authored-by: MET Tools Test Account * Revert #2825 since more documentation and testing is needed (#2837) This reverts commit 108a8958b206d6712197823a083666ab039bf818. * Feature #2583 ecnt fix IGN_OERR_CORR (#2838) * Unrelated to #2583, fix typo in code comments. * Per #2583, add hooks write 3 new ECNT columns for observation error data. * Per #2583, make error messages about mis-matched array lengths more informative. * Per #2583, switch to more concise variable naming conventions of ign_oerr_cnv, ign_oerr_cor, and dawid_seb. * Per #2583, fix typo to enable compilation * Per #2583, define the 5 new ECNT column names. * Per #2583, add 5 new columns to the ECNT table in the Ensemble-Stat chapter * Per #2583, update stat_columns.cc to write these 5 new ECNT columns * Per #2583, update ECNTInfo class to compute the 5 new ECNT statistics. 
* Per #2583, update stat-analysis to parse the 5 new ECNT columns. * Per #2583, update aggregate_stat logic for 5 new ECNT columns. * Per #2583, update PairDataEnsemble logic for 5 new ECNT columns * Per #2583, update vx_statistics library with obs_error handling logic for the 5 new ECNT columns * Per #2583, changes to make it compile * Per #2583, changes to make it compile * Per #2583, switch to a consistent ECNT column naming convention with OERR at the end. Using IGN_CONV_OERR and IGN_CORR_OERR. * Per #2583, define ObsErrorEntry::variance() with a call to the dist_var() utility function. * Per #2583, update PairDataEnsemble::compute_pair_vals() to compute the 5 new stats with the correct inputs. * Per #2583, add DEBUG(10) log messages about computing these new stats. * Per #2583, update Stat-Analysis to compute these 5 new stats from the ORANK line type. * Per #2583, whitespace and comments. * Per #2583, update the User's Guide. * Per #2583, remove the DS_ADD_OERR and DS_MULT_OERR ECNT columns and rename DS_OERR as DSS, since observation error is not actually involved in its computation. * Per #2583, minor update to Appendix C * Per #2583, rename ECNT line type statistic DSS to IDSS. * Per #2583, fix a couple of typos * Per #2583, more error checking. * Per #2583, remove the ECNT IDSS column since its just 2*pi*IGN, the existing ignorance score, and only provides meaningful information when combined with the other Dawid-Sebastiani statistics that have already been removed. * Per #2583, add Eric's documentation of these new stats to Appendix C. Along the way, update the DOI links in the references based on this APA style guide: https://apastyle.apa.org/style-grammar-guidelines/references/dois-urls#:~:text=Include%20a%20DOI%20for%20all,URL%2C%20include%20only%20the%20DOI. * Per #2583, fix new equations with embedded underscores for PDF by defining both html and pdf formatting options. * Per #2583, update the ign_conv_oerr equation to include a 2 *pi multiplier for consistency with the existing ignorance score. Also, fix the documented equations. * Per #2583, remove log file that was inadvertently added on this branch. * Per #2583, simplify ObsErrorEntry::variance() implementation. For the distribution type of NONE, return a variance of 0.0 rather than bad data, as discussed with @michelleharrold and @JeffBeck-NOAA on 3/8/2024. * Per #2583, updates to ensemble-stat.rst recommended by @michelleharrold and @JeffBeck-NOAA. * Per #2583, implement changes to the IGN_CORR_OERR corrected as directed by @ericgilleland. --------- Co-authored-by: MET Tools Test Account * Update the pull request template to include a question about expected impacts to existing METplus Use Cases. * #2830 Changed enum Builtin to enum class * #2830 Converted enum to enum class at config_constants.h * Feature #2830 bootstrap enum (#2843) * Bugfix #2833 develop azimuth (#2840) * Per #2833, fix n-1 bug when defining the azimuth delta for range/azimuth grids. * Per #2833, when definng TcrmwData:range_max_km, divide by n_range - 1 since the range values start at 0. * Per #2833, remove max_range_km from the TC-RMW config file. Set the default rmw_scale to NA so that its not used by default. And update the documentation. Still actually need to make the logic of the code work as it should. * Per #2833, update tc_rmw to define the range as either a function of rmw or using explicit spacing in km. 
* Per #2833, update the TCRMW Config files to remove the max_range_km entry, and update the unit test for one call to use RMW ranges and the other to use ranges defined in kilometers. * Per #2833, just correct code comments. * Per #2833, divide by n - 1 when computing the range delta, rather than n (see the grid spacing sketch at the end of this list). * Per #2833, correct the handling of the maximum range in the tc-rmw tool. For fixed delta km, need to define the max range when setting up the grid at the beginning. --------- Co-authored-by: MET Tools Test Account * #2830 Changed enum PadSize to enum class * #2830 Removed redundant parentheses * #2830 Removed commented out code * #2830 Use auto * #2830 Changed enum to enum class for DistType, InterpMthd, GridTemplates, and NormalizeType * #2830 Moved enum_class_as_integer from header file to cc files * #2830 Added enum_as_int.hpp * #2830 Added enum_as_int.hpp * Deleted enum_class_as_integer and renamed it to enum_class_as_int * Removed redundant parentheses * #2830 Changed enum to enum class * #2830 Changed enum_class_as_integer to enum_class_as_int (see the enum helper sketch at the end of this list) * Feature #2379 sonarqube gha (#2847) * Per #2379, testing initial GHA SonarQube setup. * Per #2379, switch to only analyzing the src directory. * Per #2379, move more config logic from sonar-project.properties into the workflow. #ci-skip-all * Per #2379, try removing + symbols * Per #2379, move projectKey into xml workflow and remove sonar-project.properties. * Per #2379, try following the instructions at https://github.com/sonarsource-cfamily-examples/linux-autotools-gh-actions-sq/blob/main/.github/workflows/build.yml ci-skip-all * Per #2379, see details of progress described in this issue comment: https://github.com/dtcenter/MET/issues/2379#issuecomment-2000242425 * Unrelated to #2379, just removing spurious space that gets flagged as a diff when re-running enum_to_string on seneca. * Per #2379, try running SonarQube through GitHub. * Per #2379, remove empty env section and also disable the testing workflow temporarily during sonarqube development. * Per #2379, fix docker image name. * Per #2379, delete unneeded script. * Per #2379, update GHA to scan Python code and push to the correct SonarQube projects. * Per #2379, update GHA SonarQube project names * Per #2379, update the build job name * Per #2379, update the compile step name * Per #2379, switch to consistent SONAR variable names. * Per #2379, fix typo in sed expressions. * Per #2379, just rename the log artifact * Per #2379, use time_command wrapper instead of run_command. * Per #2379, fix bad env var name * Per #2379, switch from egrep to grep. * Per #2379, just try cat-ting the logfile * Per #2379, test whether cat-ting the log file actually works. * Per #2379, revert back * Per #2379, mention SonarQube in the PR template. Make workflow name more succinct. * Per #2379, add SONAR_REFERENCE_BRANCH setting to define the sonar.newCode.referenceBranch property. The goal is to define the comparison reference branch for each SonarQube scan. * Per #2379, have the sonarqube.yml job print the reference branch it's using * Per #2379, intentionally introduce a new code smell to see if SonarQube correctly flags it as appearing in new code. * Per #2379, try adding the SonarQube quality gate check. * Per #2379, add logic for using the report-task.txt output files to check the quality gate status for both the python and cxx scans. * Per #2379 must use unique GHA IDs * Per #2379, working on syntax for quality gate checks * Per #2379, try again. 
* Per #2379, try again * Per #2379, try again * Per #2379, try again * Per #2379, try again * Per #2379, try again * Per #2379, try yet again * Per #2379 * Per #2379, add more debug * Per #2379, remove -it option from docker run commands * Per #2379, again * Per #2379, now that the scan works as expected, remove the intentional SonarQube code smell as well as debug logging. * Hotfix related to #2379. The sonar.newCode.referenceBranch and sonar.branch.name cannot be set to the same string! Only add the newCode definition when they differ. * #2830 Changed enum STATJobType to enum class * #2830 Changed STATLineType to enum class * #2830 Changed Action to enum class * #2830 Changed ModeDataType to enum class * #2830 Changed StepCase to enum class * #2830 Changed enum to enum class * #2830 Changed GenesisPairCategory to enum class * #2830 Removed redundant parentheses * #2830 Reduced duplicate if checks * #2830 Cleanup * #2830 Use empty() instead of length checking * #2830 Adjusted indentations * Feature #2379 develop sonarqube updates (#2850) * Per #2379, move rgb2ctable.py into the python utility scripts directory for better organization and to enable convenient SonarQube scanning. * Per #2379, remove point.py from the vx_python3_utils directory which clearly was inadvertently added during development 4 years ago. As far as I can tell it isn't being called by any other code and doesn't belong in the repository. Note that scripts/python/met/point.py has the same name but is entirely different. * Per #2379, update the GHA SonarQube scan to do a single one with Python and C++ combined. The nightly build script is still doing 2 separate scans for now. If this all works well, they could also be combined into a single one. * Per #2379, eliminate MET_CONFIG_OPTIONS from the SonarQube workflow since it doesn't need to be and probably shouldn't be configurable. * Per #2379, trying to copy report-task.txt out of the image * Per #2379, update build_met_sonarqube.sh to check the scan return status * Per #2379, fix bash assignment syntax * Per #2379, remove unused SCRIPT_DIR envvar * Per #2379, switch to a single SonarQube scan for MET's nightly build as well * Feature 2654 ascii2nc polar buoy support (#2846) * Added iabp data type, and modified file_handler to filter based on time range, which was added as a command line option * handle time using input year, hour, min, and doy * cleanup and switch to position day of year for time computations * Added an ascii2nc unit test for iabp data * Added utility scripts to pull iabp data from the web and find files in a time range * Modified iabp_handler to always output a placeholder 'location' observation with value 1 * added description of IABP data python utility scripts * Fixed syntax error * Fixed another syntax error. * Slight reformat of documentation * Per #2654, update the Makefiles in scripts/python/utility to include all the python scripts that should be installed. * Per #2654, remove unused code from get_iabp_from_web.py that is getting flagged as a bug by SonarQube. * Per #2654, fix typo in docs --------- Co-authored-by: John Halley Gotway Co-authored-by: MET Tools Test Account * Feature #2786 rpss_from_prob (#2861) * Per #2786, small change to an error message unrelated to this development. * Per #2786, add RPSInfo::set_climo_prob() function to derive the RPS line type from climatology probability bins. And update Ensemble-Stat to call it. * Per #2786, minor change to clarify error log message. 
* Per #2786, for is_prob = TRUE input, the RPS line type is the only output option. Still need to update docs! * Per #2786, add new call to Ensemble-Stat to test computing RPS from climo probabilities * Per #2786, use name rps_climo_bin_prob to be very explicit. * Per #2786, redefine logic of RPSInfo::set_climo_bin_prob() to match the CPC definition. Note that reliability, resolution, uncertainty, and RPSS based on the sample climatology are all set to bad data. Need to investigate whether they can be computed using these inputs. * Per #2786, remove the requirement that any fcst.prob_cat_thresh thresholds must be defined. If they are defined, pass them through to the FCST_THRESH output column. If not, write NA. Add check to make sure the event occurs in exactly 1 category. * Per #2786, don't enforce fcst.prob_cat_thresh == obs.prob_cat_thresh for probabilistic inputs. And add more is_prob checks so that only the RPS line type can be written when given probabilistic inputs. * updated documentation * Per #2786, call rescale_probability() function to convert from 0-100 probs to 0-1 probs. --------- Co-authored-by: j-opatz * Feature #2862 v12.0.0-beta4 (#2864) * Feature #2379 develop single_sq_project (#2865) * Hotfix to the documentation in the develop branch. Issue #2858 was closed as a duplicate of #2857. I had included it in the MET-12.0.0-beta4 release notes, but the work is not yet actually complete. * Feature 2842 ugrid config (#2852) * #2842 Removed UGrid related setting * #2842 Corrected vertical level for data_plane_array * #2842 Do not allow the time range * #2842 The UGridConfig file can be passed as ugrid_dataset * #2842 Changed -config option to -ugrid_config * #2842 Deleted UGrid configurations * 2842 Fix a compile error when UGrid is disabled * #2842 Cleanup * #2842 Added an unittest point_stat_ugrid_mpas_config * #2842 Added a PointStatConfig without UGrid dataset. * #2842 Corrected ty[po at the variable name * Switched from time_centered to time_instant. I think time_centered is the center of the forecast lead window and time_instant is the time the forecast is valid (end of forecast window). * #2842 Removed ugrid_max_distance_km and unused metadata names * #2842 Restored time variable time_instant for LFric * #2842 Adjust lon between -180 and 180 * #2842 Adjust lon between -180 and 180 * #2842 Adjust lon between -180 and 180 * #2842 Adjusted lon to between -180 to 180 * #2842 Changed variable names * Per #2842, switch from degrees east to west right when the longitudes are read. 
* #2842, switch from degrees east to west right when the longitudes are read * #2842 Cleanup debug messages --------- Co-authored-by: Howard Soh Co-authored-by: Daniel Adriaansen Co-authored-by: John Halley Gotway * Feature 2753 comp script config (#2868) * set dynamic library file extension to .dylib if running on MacOS and .so otherwise * Added disabling of jasper documentation for compiliation on Hera * Updated * remove extra export of compiler env vars * include full path to log file so it is easier to file the log file to examine when a command fails * send cmake output to a log file * remove redundant semi-colon * use full path to log file so it is easier to examine on failure * use run_cmd to catch if rm command fails * Modifications for compilation on hera, gaea, and orion * Updating * fixed variable name * clean up if/else statements * set TIFF_LIBRARY_RELEASE argument to use full path to dynamic library file to prevent failure installing proj library * set LDFLAGS so that LDFLAGS value set in the user's environment will also be used * Updated based on gaea, orion, and hera installs * Updated * change extension of dynamic library files only if architecture is arm64 because older Macs still use .so * added netcdf library to args to prevent error installing NetCDF-CXX when PROJ has been installed in the same run of the script -- PATH is set in the COMPILE_PROJ if block that causes this flag from being added automatically * clean up how rpath and -L are added to LDFLAGS so that each entry is separate -- prevents errors installing on Mac arm64 because multiple rpath values aren't read using :. Also use MET_PROJLIB * Updated * removed -ltiff from MET libs * only add path to rpath and -L arguments if they are not already included in LDFLAGS * changed from using LIB_TIFF (full path to tiff lib file) to use TIFF_LIB_DIR (dir containing tiff lib file). Added TIFF_INCLUDE_DIR to proj compilation and -DJAS_ENABLE_DOC to jasper compliation taken from @jprestop branch * update comments * ensure all MET_* and MET_*LIB variables are added to the rpath for consistency * remove unnecessary if block and only export LDFLAGS at the end of setting locally * Updated * Added section for adding /lib64 and rearranged placement of ADDTL_DIR * Commenting out the running of the Jasper lib tests * Updating and/or removing files * Updating and/or removing files * Latest udpates which include the addition of the tiff library for proj * Remove commented out line. Co-authored-by: John Halley Gotway * Make indentation consistent. Co-authored-by: John Halley Gotway * Make indentation consistent. Co-authored-by: John Halley Gotway * Make indentation consistent. 
Co-authored-by: John Halley Gotway * Per 2753, added -lm to configure_lib_args for NetCDF-CXX * Per #2753 updating acorn files * Per #2753, update wcoss2 files * Per #2753, updating acorn file to include MET_PYTHON_EXE * Per #2753, updated files for 12.0.0 for derecho * Per #2753, updated derecho file adding MET_PYTHON_EXE and made corrections * Updating config files * Updating orion files * Updates for gaea's files * Updating gaea modulefile * Removing modulefile for cheyenne * Added MET_PYTHON_EXE * Added MET_PYTHON_EXE to hera too * Adding file for hercules * Removing equals sign from setenv * Adding file for hercules * Updated script to add libjpeg installation for grib2c * Per #2753, Adding file for casper --------- Co-authored-by: George McCabe <23407799+georgemccabe@users.noreply.github.com> Co-authored-by: John Halley Gotway * Feature #2795 level_mismatch_warning (#2873) * Per #2795, move the warning message about level mismatch from the config validation step to when the forecast files are being processed. Only check this when the number of forecast fields is greater than 1, but no longer limit the check to pressure levels only. * Per #2795, add comments * Whitespace * Per #2795, port level mismatch fix over to Ensemble-Stat. Check it for each verification task, but only print it once for each task, rather than once for each task * ensemble member. * Feature #2870 removing_MISSING_warning (#2872) * Per #2870, define utility functions for parsing the file type from a file list and for logging missing files, checking for the MISSING keyword. Also, update Ensemble-Stat and Gen-Ens-Prod to call these functions. * Per #2870, update the gen_ens_prod tests to demonstrate the use of the MISSING keyword for missing files. METplus uses this keyword for Ensemble-Stat and Gen-Ens-Prod. * Feature 2842 ugrid config (#2875) * #2842 Removed UGrid related setting * #2842 Corrected vertical level for data_plane_array * #2842 Do not allow the time range * #2842 The UGridConfig file can be passed as ugrid_dataset * #2842 Changed -config option to -ugrid_config * #2842 Deleted UGrid configurations * 2842 Fix a compile error when UGrid is disabled * #2842 Cleanup * #2842 Added an unittest point_stat_ugrid_mpas_config * #2842 Added a PointStatConfig without UGrid dataset. * #2842 Corrected ty[po at the variable name * Switched from time_centered to time_instant. I think time_centered is the center of the forecast lead window and time_instant is the time the forecast is valid (end of forecast window). * #2842 Removed ugrid_max_distance_km and unused metadata names * #2842 Restored time variable time_instant for LFric * #2842 Adjust lon between -180 and 180 * #2842 Adjust lon between -180 and 180 * #2842 Adjust lon between -180 and 180 * #2842 Adjusted lon to between -180 to 180 * #2842 Changed variable names * Per #2842, switch from degrees east to west right when the longitudes are read. * #2842, switch from degrees east to west right when the longitudes are read * #2842 Cleanup debug messages * #2842 Disabled output types except STAT for sl1l2 * #2842 Disabled output types except STAT for sl1l2 and MPR * #2842 Reduced output files for UGrid --------- Co-authored-by: Howard Soh Co-authored-by: Daniel Adriaansen Co-authored-by: John Halley Gotway * Hotfix to develop branch to remove duplicate test named 'point_stat_ugrid_mpas_config'. That was causing unit_ugrid.xml to fail because it was still looking for .txt output files that are no longer being generated. 
* Feature 2748 document ugrid (#2869) * Initial documentation of the UGRID capability. * Fixes error in references, adds appendix to index, and adds sub-section for configuration entries and a table for metadata map items. * Corrects LFRic, rewords section on UGRID conventions, updates description of using GridStat, and removes mention of nodes. * Forgot one more mention of UGRID conventions. * Incorporates more suggestions from @willmayfield. * Switches to numerical table reference. * Feature #2781 Convert MET NetCDF point obs to Pandas DataFrame (#2877) * Per #2781, added function to convert MET NetCDF point observation data to pandas so it can be read and modified in a python embedding script. Added example python embedding script * ignore python cache files * fixed function call * reduce cognitive complexity to satisfy SonarQube and add boolean return value to catch if function fails to read data * clean up script and add comments * replace call to object function that doesn't exist, handle exception when file passed to script cannot be read by the NetCDF library * rename example script * add new example script to makefiles * fix logic to build pandas DataFrame to properly get header information from observation header IDs * Per #2781, add unit test to demonstrate python embedding script that reads MET NetCDF point observation file and converts it to a pandas DataFrame * Per #2781, added init function for nc_point_obs to take an input filename. Also raise TypeError exception from nc_point_obs.read_data() if input file cannot be read * call parent class init function to properly initialize nc_point_obs * Feature #2833 pcp_combine_missing (#2886) * Per #2883, add -input_thresh command line option to configure allowable missing input files. * Per #2883, update pcp_combine usage statement. * Per #2883, update existing pcp_combine -derive unit test example by adding 3 new missing file inputs at the beginning, middle, and end of the file list. The first two are ignored since they include the MISSING keyword, but the third without that keyword triggers a warning message as desired. The -input_thresh option is added to only require 70% of the input files be present. This should produce the exact same output data. * Per #2883, update the pcp_combine logic for the sum command to allow missing data files based on the -input_thresh threshold. Add a test in unit_pcp_combine.xml to demonstrate. * Update docs/Users_Guide/reformat_grid.rst Co-authored-by: George McCabe <23407799+georgemccabe@users.noreply.github.com> * Per #2883, update pcp_combine usage statement in the code to be more simliar to the User's Guide. * Per #2883, switch to using derive_file_list_missing as the one containing missing files and recreate derive_file_list as it had existed for the test named pcp_combine_derive_VLD_THRESH. * Per #2883, move initialization inside the same loop to resolve SonarQube issues. * Per #2883, update sum_data_files() to switch from allocating memory to using STL vectors to satisfy SonarQube. * Per #2883, changes to declarations of variables to satisfy SonarQube. * Per #2883, address more SonarQube issues * Per #2883, backing out an unintended change I made to tcrmw_grid.cc. This change belongs on a different branch. * Per #2883, update logic of parse_file_list_type() function to handle python input strings. Also update pcp_combine to parse the type of input files being read and log non-missing python input files expected. 
--------- Co-authored-by: George McCabe <23407799+georgemccabe@users.noreply.github.com> * Per #2888, update STATAnalysisJob::dump_stat_line() to support dumping stat line types VCNT, RPS, DMAP, and SSIDX. (#2891) * Per #2659, making updates as proposed at the 20240516 MET Eng. Mtg. (#2895) * Feature #2395 TOTAL_DIR (#2892) * Per #2395, remove the n_dir_undef and n_dira_undef variables that are superseded by the new dcount and dacount VL1L2Info members to keep track of the number of valid wind direction vectors. * Per #2395, add TOTAL_DIR columns to the VL1L2, VAL1L2, and VCNT line types and update the header column tables. * Per #2395, update the User's Guide to list the new TOTAL_DIR columns in the VL1L2, VAL1L2, and VCNT line types. * Per #2395, update stat_analysis to parse the new TOTAL_DIR columns and use the values to aggregate results when needed. * Per #2395, for SonarQube change 'const char *' to 'const char * const' to satisfy the finding that 'Global variables should be const.' Should probably switch from 'const char *' to strings eventually. But for now, I'm just making up for some SonarQube technical debt. * Per #2395, fix typo in placement of the DIR_ME column name in the met_header_columns_V12.0.txt file * Per #2395, add 2 new Stat-Analysis jobs to demonstrate the processing of VL1L2 lines. * Per #2395, update logic of is_vector_dir_stat(). Instead of just checking 'DIR_', check 'DIR_ME', 'DIR_MAE', and 'DIR_MSE' to avoid a false positive match for the 'DIR_ERR' column which is computed from the vector partial sums rather than the individual direction differences. * Bugfix #2897 develop python_valid_time (#2899) * Per #2897, fix typos in 2 log messages. Also fix the bug in storing the valid time strings. The time string in vld_array should exactly correspond to the numeric unixtime values in vld_num_array. Therefore they need to be updated inside the same if block (see the valid-time sketch at the end of this list). The bug is that we were storing only the unique unixtime values but storing ALL of the valid time strings, not just the unique ones. * Per #2897, minor change to formatting of log message * MET #2897, don’t waste time searching, just set the index to n - 1 * Per #2897, remove unused add_prec_point_obs(...) function * Per #2897, update add_point_obs(...) logic for DEBUG(9) to print very detailed log messages about what obs are being rejected and which are being used for each verification task. * Per #2897, refine the 'using' log message to make the wording consistent with the summary rejection reason counts log message * Per #2897, update the User's Guide about -v 9 for Point-Stat --------- Co-authored-by: j-opatz Co-authored-by: MET Tools Test Account * Bugfix 2867 point2grid qc flag (#2890) * #2867 Added compute_adp_qc_flag and adjusted ADP QC flags * #2867 Added point2grid_GOES_16_ADP_Enterprise_high. Changed AOD QC flags to 0,1,2 (was 1,2,3) * #2867 Added get_nc_att_values_ * #2867 Added get_nc_att_values. Added the argument allow_conversion to get_nc_data(netCDF::NcVar *, uchar *data) * #2867 Read the ADP QC flag values and meanings attributes from the DQF variable and set the QC high, medium, low values to support the Enterprise algorithm. 
Adjusted the ADP QC values by using AOD qc values * #2867 Cleanup * #2867 Corrected indent * #2867 Changed log message * #2867 Removed unused argument * #2867 Removed unused argument * Cleanup * #2867 Fix SonarQube findings * #2867 Deleted protected section with no members * #2867 Cleanup * #2867 Fixed SonarQube findings: unused local variables, declare as const, etc. * #2867 Moved include directives to top * #2867 Changed some arguments to references to avoid copying objects * #2867 Do not filter by QC flag if -qc is not given * #2867 Use enum class for GOES QC: HIGH, MEDIUM, and LOW (see the QC filtering sketch at the end of this list) * #2867 Added log messages back which were deleted accidentally * #2867 Changed static const to constexpr * #2867 Initial release. Separated from nc_utils.h * #2867 Added nc_utils_core.h * #2867 Moved some blocks to nc_utils_core.h * #2867 Include nc_utils_core.h * #2867 Added const references * Per #2867, fixing typo in comments. --------- Co-authored-by: Howard Soh Co-authored-by: j-opatz * Hotfix to develop to fix the update_truth.yml workflow logic. This testing workflow run failed (https://github.com/dtcenter/MET/actions/runs/9209471209). Here we switch to a unique update truth branch name to avoid conflicts. * Avoid pushing directly to the develop or main_vX.Y branches since that is not necessary for the automation logic in MET. * #2904 Changed R path to R-4.4.0 (#2905) Co-authored-by: Howard Soh * Feature #2912 pb2nc error (#2914) * Feature 2717 convert unit.pl to unit.py (#2871) * created unit.py module in new internal/test_unit/python directory * added xml parsing to unit.py * added repl_env function * added reading of the remaining xml tags in build_tests function * progress on main function (putting together test commands) * a few more lines in the main function * minor updates * fixed how the test command was being run * added if name/main and command line parsing * fixed handling of no 'env' in cmd_only mode * handle params from xml that have \ after filename without space in between * added logging * added some more pieces to unit * more updates to unit.py, including running checks on output files * bug fixes, improved handling of output file names, improved handling of env vars, improved logging output * fixed how shell commands are run, and other minor fixes * added last bits from the perl script, fixed some bugs * created unit.py module in new internal/test_unit/python directory * added xml parsing to unit.py * added repl_env function * added reading of the remaining xml tags in build_tests function * progress on main function (putting together test commands) * a few more lines in the main function * minor updates * update scripts to call python unit test script instead of the old perl script * fix she-bang line to allow script to be run without python3 before it * add missing test_dir and exit_on_fail tags that are found in the rest of the unit test xml files * fix call to logger.warning * change tags named 'exists' to 'exist' to match the rest of the xml files * added logger to function * removed tab at end of line that was causing output file path to be excluded from the command * fix broken checks for output files * incorporated George's recommended changes * changed default to overwrite logs; allow for more than one xml file to be passed in command --------- Co-authored-by: Natalie babij Co-authored-by: Natalie babij Co-authored-by: Natalie babij Co-authored-by: Natalie Babij Co-authored-by: John Halley Gotway Co-authored-by: George McCabe <23407799+georgemccabe@users.noreply.github.com> 
Co-authored-by: j-opatz * Bugfix 2867 point2grid qc unittest (#2913) * #2867 Added compute_adp_qc_flag and adjusted ADP QC flags * #2867 Added point2grid_GOES_16_ADP_Enterprise_high. Changed AOD QC flags to 0,1,2 (was 1,2,3) * #2867 Added get_nc_att_values_ * #2867 Added get_nc_att_values. Added the argument allow_conversion to get_nc_data(netCDF::NcVar *, uchar *data) * #2867 Read the ADP QC flag values and meanings attributes from the DQF variable and set the QC high, medium, low values to support the Enterprise algorithm. Adjusted the ADP QC values by using AOD qc values * #2867 Cleanup * #2867 Corrected indent * #2867 Changed log message * #2867 Removed unused argument * #2867 Removed unused argument * Cleanup * #2867 Fix SonarQube findings * #2867 Deleted protected section with no members * #2867 Cleanup * #2867 Fixed SonarQube findings: unused local variables, declare as const, etc. * #2867 Moved include directives to top * #2867 Changed some arguments to references to avoid copying objects * #2867 Do not filter by QC flag if -qc is not given * #2867 Use enum class for GOES QC: HIGH, MEDIUM, and LOW * #2867 Added log messages back which were deleted accidentally * #2867 Changed static const to constexpr * #2867 Initial release. Separated from nc_utils.h * #2867 Added nc_utils_core.h * #2867 Moved some blocks to nc_utils_core.h * #2867 Include nc_utils_core.h * #2867 Added const references * #2867 Some 'static const' were changed to constexpr * #2867 Changed -qc options (1,2,3 to 0,1 - high & medium) for AOD * #2867 Merged develop branch * #2867 Corrected the unit test name --------- Co-authored-by: Howard Soh * Feature #2911 tc_stat_set_hdr (#2916) * Per #2911, no real changes for Stat-Analysis. Just changing order of variables for consistency. * Per #2911, add StatHdrColumns::apply_set_hdr_opts(...) function to be used by TC-Stat. * Per #2911, move ByColumn to the TCStatJob base class and add HdrName and HdrValue to support the -set_hdr job command. * Per #2911, update GSI tools to call the newly added StatHdrColumns::apply_set_hdr_opts(...) function. * Per #2911, update logic of Stat-Analysis for consistency to make use of common apply_set_hdr_opts() function. * Per #2911, add DataLine::set_item() function to support -set_hdr options. * Per #2911, just update contents of error message * Per #2911, add TCStatLine member functions for has() and get_offset(). * Per #2911, update tc_stat to support applying -set_hdr to TC-Stat filter jobs. * Per #2911, revise TC-Stat config files to exercise the -set_hdr job command option * Per #2911, update TC-Stat documentation to mention the -set_hdr job command option * Per #2911, add note * Per #2911, as recommended by SonarQube, make some of these member functions const. * Bugfix #2856 develop ens_climo (#2918) * Per #2856, port over fixes from main_v11.1 to develop. * Per #2856, correct conditionals in set_job_controls.sh and tweak existing Ensemble-Stat configuration file to exercise the logic that's being impacted here. * Bugfix #2841 develop tang_rad_winds (#2921) * Per #2841, port over fixes from bugfix_2841_main_v11.1_tang_rad_winds for the develop branch * Per #2841, clarify in the docs that azimuths are defined in degrees counter-clockwise from due east. * Per #2841, just updating with output from enum_to_string. * Per #2841, tweak the documentation. * Per #2841, correct the location of using namespace lines. * Per #2841, update compute_tc_diag.py to no longer skip writing the radial and tangential wind diagnostics. 
* Per #2841, update compute_tc_diag.py to no longer skip writing radial and tangential wind diagnostics. * Revert "Per #2841, update compute_tc_diag.py to no longer skip writing radial and tangential wind diagnostics." This reverts commit f097345bedcfcca663e8fb4322eed5b5e00e19fd. * Revert "Per #2841, update compute_tc_diag.py to no longer skip writing the radial and tangential wind diagnostics." This reverts commit c0402151b038c59efab99c060cc5c390edf002f6. * Per #2841, update comp_dir.sh logic to include .dat in the files that are diffed * Replace tab with spaces * Per #2841, correct the units for the azimuth netcdf output variable * Per #2841, reverse the x dimension of the rotated latlon grid to effectively switch from counterclockwise rotation to clockwise. --------- Co-authored-by: MET Tools Test Account * Feature #2601 seeps climo config (#2927) * #2601 Added seeps_grid_climo_name and seeps_point_climo_name * #2601 Added seeps_grid_climo_name * #2601 Removed SEEPS settings * #2601 Initial release * #2601 Changed to set the SEEPS climo by using the configuration * #2601 Removed SEEPS settings at PointStatConfig_APCP and use PointStatConfig_SEEPS for SEEPS testing * #2601 Updated description for seeps_grid_climo_name * #2601 Added an argument for the SEEPS climo file * #2601 Added conf_key_seeps_grid_climo_name and conf_key_seeps_point_climo_name * #2601 Support the climo filename from the configuration * #2601 Corrected key for climo name * Removing duplicate word --------- Co-authored-by: Howard Soh Co-authored-by: Julie Prestopnik * Feature 2673 sonarqube beta5 redundant parentheses (#2930) * #2673 Removed redundant_parentheses * #2673 Removed redundant_parentheses * #2673 Removed redundant parentheses * #2673 Removed redundant parentheses --------- Co-authored-by: Howard Soh * Fix release checksum action (#2929) * Feature 2857 tripolar coordinates (#2928) * #2857 Added MetNcCFDataFile::build_grid_from_lat_lon_vars * #2857 Added NcCfFile::build_grid_from_lat_lon_vars * #2857 Check the coordinates attribute to find latitude, longitude, and time variables * #2857 Get the lat/lon variables from the coordinates attribute if it exists * #2857 Added two constants * #2857 Deleted debug messages * #2857 Added lat_vname and lon_vname for var_name_map * #2857 Added two unit tests: point2grid_sea_ice_tripolar and point2grid_sea_ice_tripolar_config * #2857 Initial release * #2857 Correct dictionary to get file_type * #2857 Do not check the time variable for point2grid * #2857 Added point2grid_tripolar_rtofs --------- Co-authored-by: Howard Soh * Feature 2932 v12.0.0-beta5 (#2933) * Per #2932, updating version and release notes * Per #2932, updating date on release notes * Per #2932, fixed formatting and links * Update release-notes.rst * Update release-notes.rst Removing inline backticks since they do not format the way I expected, especially when put inside bolded release notes. --------- Co-authored-by: John Halley Gotway * Feature fix release notes (#2934) * Fixing up release notes * Update release-notes.rst --------- Co-authored-by: John Halley Gotway * Per dtcenter/METplus#2643 discussion, add more detail about the budget interpolation method. * Feature #2924 fcst climo, PR 1 of 2 (#2939) * Per #2924, Update the MPR and ORANK output line types to just write duplicate existing climo values, update the header tables and MPR/ORANK documentation tables. * Per #2924, update get_n_orank_columns() logic * Per #2924, update the Stat-Analysis parsing logic to parse the new MPR and ORANK climatology columns. 
* Per #2924, making some changes to the vx_statistics library to store climo data... but more work to come. Committing this first set of changes that are incomplete but do compile. * Per #2924, this big set of changes does compile but make test produces a segfault for ensemble-stat * Per #2924, fix return value for is_keeper_obs() * Per #2924, move fcst_info/obs_info into the VxPairBase pointer. * Per #2924, update Ensemble-Stat to set the VxPairBase::fcst_info pointer * Per #2924 udpate handling of fcst_info and obs_info pointers in Ensemble-Stat * Per #2924, update the GSI tools to handle the new fcst climo columns. * Per #2924, add backward compatibility logic so that when old climo column names are requested, the new ones are used. * Per #2924, print a DEBUG(2) log message if old column names are used. * Per #2924, switch the unit tests to reference the updated MPR column names rather than the old ones. * Per #2924, working progress. Not fully compiling yet * Per #2924, another round of changes. Removing MPR:FCST_CLIMO_CDF output column. This compiles but not sure if it actually runs yet * Per #2924, work in progress * Per #2924, work in progress. Almost compiling again. * Per #2924, get it compiling * Per #2924, add back in support for SCP and CDP which are interpreted as SOCP and OCDP, resp * Per #2924, update docs about SCP and CDP threshold types * Per #2924, minor whitespace changes * Per #2924, fix an uninitialized pointer bug by defining/calling SeepsClimoGrid::init_from_scratch() member function. The constructor had been calling clear() to delete pointers that weren't properly initialized to nullptr. Also, simplify some map processing logic. * Per #2924, rename SeepsAggScore from seeps to seeps_agg for clarity and to avoid conflicts in member function implementations. * Per #2924, fix seeps compilation error in Point-Stat * Per #2924, fix bug in the boolean logic for handling the do_climo_cdp NetCDF output option. * Per #2924, add missing exit statement. * Per #2924, tweak threshold.h * Per #2924, define one perc_thresh_info entry for each enumerated PercThreshType value * Per #2924, simplify the logic for handling percentile threshold types and print a log message once when the old versions are still used. * Per #2924, update the string comparison return value logic * Per #2924, fix the perc thresh string parsing logic by calling ConcatString::startswith() * Per #2924, switch all instances of CDP to OCDP. Gen-Ens-Prod was writing NetCDF files with OCDP in the output variable names, but Grid-Stat was requesting that the wrong variable name be read. So the unit tests failed. * Per #2924, add more doc details * Per #2924, update default config file to indicate when climo_mean and climo_stdev can be set seperately in the fcst and obs dictionaries. * Per #2924, update the MET tools to parse climo_mean and climo_stdev separately from the fcst and obs dictionaries. * Per #2924, backing out new/modified columns to minimize reg test diffs * Per #2924, one more section to be commented out later. * Per #2924, replace several calls to strncmp() with ConcatString::startswith() to simplify the code * Per #2924, strip out some more references to OBS_CLIMO_... in the unit tests. * Per #2924, delete accidental file * Per #2924 fix broken XML comments * Per #2924, fix comments * Per #2924, address SonarQube findings * Per #2924, tweak a Point-Stat and Grid-Stat unit test config file to make the output more comparable to develop. 
* Per #2924, fix bug in the logic of PairDataPoint and PairDataEnsemble, when looping over the 3-dim array do not return when checking the climo and fcst values. Instead we need to continue to the next loop iteration. * Per #2924, address more SonarQube code smells to reduce the overall number in MET for this PR. * Per #2924, correct the logic for parsing climo data from MPR lines. * Per #2924, cleanup grid_stat.cc source code by making calls to DataPlane::is_empty() and Grid::nxy(). * Per #2924, remove unneeded ==0 * Hotfix to the develop branch for a copy/paste bug introduced by PR #2939 * Feature #2924 sal1l2_mae, PR 3 of 3 (#2943) * Per #2924, track SL1L2 and SAL1L2 MAE scores with separate variables since they are no longer the same value. I renamed the existing 'mae' as 'smae' and added a new 'samae' variable. Renaming the existing lets me use the compiler help find all references to it throughout the code. * Per #2924, update the User's Guide climatology details and equations. * Per #2924, some changes to aggr_stat_line.cc and series_analysis.cc to satisfy some SonarQube code smells. * Update develop to clarify masking poly options based on METplus Discussion dtcenter/METplus#2650 * Remove two semi-colons that are not actually necessary to avoid confusion. * Per dtcenter/METplus#2653 discussion, update the MTD usage statement to clarify that data specified in the fcst dictionary is read from the -single input files. * Feature #2924 fcst climo, PR 2 of 3 (#2942) * Per #2924, Update the MPR and ORANK output line types to just write duplicate existing climo values, update the header tables and MPR/ORANK documentation tables. * Per #2924, update get_n_orank_columns() logic * Per #2924, update the Stat-Analysis parsing logic to parse the new MPR and ORANK climatology columns. * Per #2924, making some changes to the vx_statistics library to store climo data... but more work to come. Committing this first set of changes that are incomplete but do compile. * Per #2924, this big set of changes does compile but make test produces a segfault for ensemble-stat * Per #2924, fix return value for is_keeper_obs() * Per #2924, move fcst_info/obs_info into the VxPairBase pointer. * Per #2924, update Ensemble-Stat to set the VxPairBase::fcst_info pointer * Per #2924 udpate handling of fcst_info and obs_info pointers in Ensemble-Stat * Per #2924, update the GSI tools to handle the new fcst climo columns. * Per #2924, add backward compatibility logic so that when old climo column names are requested, the new ones are used. * Per #2924, print a DEBUG(2) log message if old column names are used. * Per #2924, switch the unit tests to reference the updated MPR column names rather than the old ones. * Per #2924, working progress. Not fully compiling yet * Per #2924, another round of changes. Removing MPR:FCST_CLIMO_CDF output column. This compiles but not sure if it actually runs yet * Per #2924, work in progress * Per #2924, work in progress. Almost compiling again. * Per #2924, get it compiling * Per #2924, add back in support for SCP and CDP which are interpreted as SOCP and OCDP, resp * Per #2924, update docs about SCP and CDP threshold types * Per #2924, minor whitespace changes * Per #2924, fix an uninitialized pointer bug by defining/calling SeepsClimoGrid::init_from_scratch() member function. The constructor had been calling clear() to delete pointers that weren't properly initialized to nullptr. Also, simplify some map processing logic. 
* Per #2924, rename SeepsAggScore from seeps to seeps_agg for clarity and to avoid conflicts in member function implementations. * Per #2924, fix seeps compilation error in Point-Stat * Per #2924, fix bug in the boolean logic for handling the do_climo_cdp NetCDF output option. * Per #2924, add missing exit statement. * Per #2924, tweak threshold.h * Per #2924, define one perc_thresh_info entry for each enumerated PercThreshType value * Per #2924, simplify the logic for handling percentile threshold types and print a log message once when the old versions are still used. * Per #2924, update the string comparison return value logic * Per #2924, fix the perc thresh string parsing logic by calling ConcatString::startswith() * Per #2924, switch all instances of CDP to OCDP. Gen-Ens-Prod was writing NetCDF files with OCDP in the output variable names, but Grid-Stat was requesting that the wrong variable name be read. So the unit tests failed. * Per #2924, add more doc details * Per #2924, update default config file to indicate when climo_mean and climo_stdev can be set seperately in the fcst and obs dictionaries. * Per #2924, update the MET tools to parse climo_mean and climo_stdev separately from the fcst and obs dictionaries. * Per #2924, backing out new/modified columns to minimize reg test diffs * Per #2924, one more section to be commented out later. * Per #2924, replace several calls to strncmp() with ConcatString::startswith() to simplify the code * Per #2924, strip out some more references to OBS_CLIMO_... in the unit tests. * Per #2924, delete accidental file * Per #2924 fix broken XML comments * Per #2924, fix comments * Per #2924, address SonarQube findings * Per #2924, tweak a Point-Stat and Grid-Stat unit test config file to make the output more comparable to develop. * Per #2924, fix bug in the logic of PairDataPoint and PairDataEnsemble, when looping over the 3-dim array do not return when checking the climo and fcst values. Instead we need to continue to the next loop iteration. * Per #2924, address more SonarQube code smells to reduce the overall number in MET for this PR. * Per #2924, correct the logic for parsing climo data from MPR lines. * Per #2924, update MPR and ORANK line types to update/add FCST/OBS_CLIMO_MEAN/STDEV/CDF columns. * Per #2924, cleanup grid_stat.cc source code by making calls to DataPlane::is_empty() and Grid::nxy(). * Per #2924, remove unneeded ==0 * Per #2924, working on PR2. * Per #2924, update User's Guide with notional example of specifying climo_mean and climo_stdev separately in the fcst and obs dicts. * Per #2924, adding a new unit test. It does NOT yet run as expected. Will debug on seneca * Per #2924, pass the description string to the read_climo_data_plane*() function to provide better log messages * Per #2924, more work on consistent log messages * Per #2924, tweak the configuration to define both field, climo_mean, and climo_stdev in both the fcst and obs dictionaries * Per #2924, tweak the unit_climatology_mixed.xml test * Per #2924, only whitespace changes. * Per #2924, missed swapping MET #2924 changes in 3 test files * Per #2924, delete accidentally committed file * Per #2924, delete accidentally committed files * Per #2924, add support for GRIB1 time range indicator value of 123 used for the corresponding METplus Use Case. Note that there are 22 other TRI values not currently supported. 
* Adds caveat regarding longitudes appearing in DEBUG statements with a… (#2947) * Adds caveat regarding longitudes appearing in DEBUG statements with a different sign to the FAQ. * Update appendixA.rst Missing paren * Create install_met_env.cactus * Adding special script for installing beta5 on wcoss2 * Modifying script, including updates to eckit and atlas * Corrected version of bufr being used * Feature #2938 pb2nc_center_time (#2954) * Per #2938, define CRC_Array::add_uniq(...) member function which is now used in PB2NC * Per #2938, replace n_elements() with n() to make the code more concise. Refine log/warning message when multiple message center times are encountered. * Feature #1371 series_analysis (#2951) * Per #1371, add -input command line argument and add support for ALL for the CTC, MCTC, SL1L2, and PCT line types. * Per #1371, rename the -input command line option as -aggregate instead * Per #1371, work in progress * Per #1371, just comments * Per #1371, working on aggregating CTC counts * Per #1371, work in progress * Per #1371, update timing info using time stamps in the aggr file * Per #1371, close the aggregate data file * Per #1371, define set_event() and set_nonevent() member functions * Per #1371, add logic to aggregate MCTC and PCT counts * Merging changes from develop * Per #1371, work in progress aggregating all the line statistics types. Still have several issues to address * Per #1371, switch to using get_stat() functions * Per #1371, work in progress. More consolidation * Per #1371, correct expected output file name * Per #1371, consistent regridding log messages and fix the Series-Analysis PairDataPoint object handling logic. * Per #1371, check the return status when opening the aggregate file. * Per #1371, fix prc/pjc typo * Per #1371, fix the series_analysis PCT aggregation logic and add a test to unit_series_analysis.xml to demonstrate. * Per #1371, resolve a few SonarQube findings * Per #1371, make use of range-based for loop, as recommeded by SonarQube * Per #1371, update series-analysis to apply the valid data threshold properly using the old aggregate data and the new pair data. * Per #1371, update series_analysis to buffer data and write it all at once instead of storing data value by value for each point. * Per #1371, add useful error message when required aggregation variables are not present in the input -aggr file. * Per #1371, print a Debug(2) message listing the aggregation fields being read. * Per #1371, correct operator+= logic in met_stats.cc for SL1L2Info, VL1L2Info, and NBRCNTInfo. The metadata settings, like fthresh and othresh, were not being passed to the output. * Per #1371, the DataPlane for the computed statistics should be initialized to a field of bad data values rather than the default value of 0. Otherwise, 0's are reported for stats a grid points with no data when they should really be reported as bad data! * Per #1371, update logic of the compute_cntinfo() function so that CNT statistics can be derived from a single SL1L2Info object containing both scalar and scalar anomaly partial sums. These changes enable CNT:ANOM_CORR to be aggregated in the Series-Analysis tool. * Per #1371, fix logic of climo log message. * Per #1371, this is actually related to MET #2924. In compute_pctinfo() used obs climo data first, if provided. And if not, use fcst climo data. * Per #1371, fix indexing bug (+i instead of +1) when check the valid data count. Also update the logic of read_aggr_total() to return a count of 0 for bad data. 
* Per #1371, add logic to aggregate the PSTD BRIERCL and BSS statistics in the do_climo_brier() function. Tested manually to confirm that it works. * Per #1371, switch to using string literals to satisfy SonarQube * Per #1371, update series_analysis tests in unit_climatology_1.0deg.xml to demonstrate aggregating climo-based stats. * Per #1371, remove extra comment * Per #1371, skip writing the PCT THRESH_i columns to the Series-Analysis output since they are not used * Per #1371, fix the R string literals to remove \t and \n escape sequences. * Per #1371, update the read_aggr_data_plane() suggestion strings. * Per #1371, ignore unneeded PCT 'THRESH_' variables both when reading and writing ALL PCT columns. * Per #1371, update the test named series_analysis_AGGR_CMD_LINE to include data for the F42 lead time that had previously been included for the same run in the develop branch. Note, however, that the timestamps in the output file for the develop branch (2012040900_to_2012041100) were wrong and have been corrected here (2012040900_to_2012041018) to match the actual data. * Per #1371, update the -aggr note to warn users about slow runtimes * Feature 2948 cxx17 (#2953) * Per #2948, updating versions of ecbuild, eckit, and atlas * Per #2948, Adding MET_CXX_STANDARD * Per #2948, updated wording for MET_CXX_STANDARD description * Per #2948, updating script to work with two versions of ecbuild, eckit, and atlas * Per #2948, without this change, there are compilation problems if the user wants to compile without Python * Per #2948, fixing logic for MET_CXX_STANDARD * Per #2928, adding missing end bracket * Per #2948, fixed the logic for compiling versions of ecbuild, eckit, and atlas * Per #2948, fixed syntax for setting CXXFLAGS * Per #2948, adding new Makefile.in files and configure and changing METbaseimage 3.2 to 3.3. 
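As a rough illustration of the new build option described in the #2948 bullets above, the CI jobs pass the C++ standard directly to configure; this is only a sketch, the --enable-all and MET_CXX_STANDARD=11 arguments are the ones quoted above, and other platform-specific configure arguments (compilers, library paths) are omitted:

    ./configure --enable-all MET_CXX_STANDARD=11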
* Per #2948, updating version of met base tag from 3.2 to 3.3 * Per #2948, adding --enable-all MET_CXX_STANDARD=11 job * Update compilation_options.yml * Per #2948, added a job10 for MET_CXX_STANDARD=14 * Per #2948, added brief documentation for the MET_CXX_STANDARD option --------- Co-authored-by: Julie Prestopnik Co-authored-by: John Halley Gotway * Feature 1729 set attr grid (#2955) * #1729 Allow changing to a different grid size if the raw size is 0 * Added build_grid_by_grid_string and build_grid_by_grid_string * #1729 Calls build_grid_by_grid_string * #1729 Added set_attr_grid to the -field option * #1729 Set obs_type to TYPE_NCCF if the file_type is given in the config file * #1729 Support set_attr_grid and changed Error messages to Warning * #1729 Fixed SonarQube findings * #1729 Initial release for unit test * #1729 Added update_missing_values * #1729 Deleted a shadowed local variable * #2673 Added more is_eq * #2673 Added get_exe_duration * #2673 Reduced nested statements * #2673 Fixed SonarQube findings * #2673 Fixed SonarQube findings * #2673 Fixed SonarQube findings * #1729 Added an unittest plot_data_plane_set_attr_grid * #1729 Added an unittest point2grid_cice_set_attr_grid * #1729 Changed back the verbose level * #1729 Corrected typo --------- Co-authored-by: Howard Soh * Bugfix #2958 develop BAGSS SEDI CI (#2959) * Bugfix 2936 point2grid gfs (#2964) * #2936 Support 1D lat/lon values * #2936 Initial release * #2936 Cast the data type to avoid a compile warning * #2936 Added an unittest point2grid_gfs_1D_lat_lon --------- Co-authored-by: Howard Soh * Bugfix 2968 point2grid set attr grid (#2969) * #2968 Corrected set_attr_grid for point2grid_cice_set_attr_grid * #2968 Compare the DataPlane size and the variable data size * #2968 nx and ny are not ignored with set_attr_grid * #2968 Compare the DataPlane size and the variable data size --------- Co-authored-by: Howard Soh * Feature 2937 update unit (#2944) * added single quotes around env var/val pairs in export statements in cmd only mode * updated logic in unit() to check exec return value against expected return value; created TEST xml file to test this feature * deleted TEST_ xml, added test with retval 1 to unit_ascii2nc --------- Co-authored-by: Natalie Babij * Feature #2887 categorical weights PR 1 of 2 (#2967) * Per #2887, update NumArray::vals() to return a reference to the vector rather than a pointer to doubles. * Per #2887, switch over the whole ContingencyTable class hierarchy from storing integer counts to storing double-precision weights. * Add ContingencyTable::is_integer() member function to check whether the table contains all integers * Per #2887, update parse_stat_line.cc to get it to compile after changing PCT to store thresholds in a std::vector. * Per #2887, update PCTInfo::clear() logic. * Per #2887, update ctc_by_row() logic to create reproducible results with the develop branch. * Per #2887, update logic of define_prob_bins() to add a final >=1.0 threshold if needed. While ==0.1 works fine, I found that ==0.05 did not because the last >=1.0 threshold was missing, likely due to floating point precision issues. This change should fix that problem. * Per #2887, update roc_auc() function to match the develop branch * Per #2887, fix bug in computation of far() * Per #2887, replaced all ==0 integer equality checks with calls to is_eq() instead and fix a couple of equations to snuff out diffs in some CTS statistics. * Per #2887, address some of the 34 SonarQube code smells flagged for this PR. 
Note that the compute_ci.h/.cc changes are necessary and good since we should be computing CI's using doubles instead of integer counts. * Per #2887, update run_sonarqube.sh to specify the target CXX standard as 11. The hope is that that will limit the findings to only those features available in the C++11 standard. * Per #2887, update to SonarQube version 6.1.0.4477 released on 6/27/2024. * Per #2887, updating build_met_sonarqube.sh to specify --std=c++11 since c++17 is used by default * Hotfix to develop to fix a bug introduced for MET #2887. Refine the define_prob_bins() utility function so that ==n probability thresholds result in the correct number of probability thresholds. We were adding an unncessary 10-th bin (from 1.07143 to 1.0) for the ==7 probability threshold type. * Fix typo in tc-pairs.rst * Update build_docker_and_trigger_metplus.yml The docs directory was moved up to the top-level of the repository but this workflow was not updated. Changing the ignore setting so that doc-only updates do not trigger the full METplus testing workflow. * Feature 2023 remove double quotes around keywords (#2974) * testing AREA and AUTO changes * Keywords B thru L * thru R * adding quotes back in for lower case items * S thru the end of the document * Removing double quotes around 3 key words * Per #2023, adding a label name for the Attributes section * Per #2023, adding an internal link for the MODE tool Attributes section. * Adding quotes around Valid basins entries * more double quote updates * more complex updates with Julie P help * removing double quotes * fixing typos * removing double quotes * unbolding SURFACE and putting it in double quotes * fixing grammar * grammar * fixing typo * fixing typo --------- Co-authored-by: Julie Prestopnik * Feature #2924 parse_config (#2963) * Per #2924, remove GenEnsProd config file comment about parsing desc separately from each obs.field entry because the obs dictionary does not exist in the GenEnsProd config file. * Per #2924, update list of needed config entry names * Per #2924, remove const from the parent() member function so that we can perform lookups for the parent. * Per #2924, update the signature for and logic of the utility functions that retrieve the climatology data. Rather than requiring all the climo_mean and climo_stdev dictionary entries to be defined at the same config file context level, parse each one individually. This enables the METplus wrappers to only partially override this dictionary and still rely on the default values provided in MET's default configuration files. * Per #2924, update all calls to the climatology utility functions based on the new function signature. Also update the tools to check the number of climo fields separately for the forecast and observation climos. * Per #2924, update the parsing logic for the climatology regrid dictionary. Use config.fcst.climo_mean.regrid first, config.fcst.regrid second, and config.climo_mean.regrid third. Notably, DO NOT use config.regrid. This is definitely the problem with having regrid specified at mutliple config file context levels. It makes the logic for which to use when very messy. * Per #2924, forgot to add an else to print an error * Per #2924, remove extraneous semicolon * Per #2924, move 'fcst.regrid' into 'fcst.climo_mean.regrid'. Defining the climatology regridding logic inside fcst is problematic because it applies to the forecast data as well and you end up with the verification grid being undefined. 
So the climo regridding logic must be defined in 'climo_mean.regrid' either within the 'fcst' and 'obs' dictionaries or at the top-level config context. * Per #2924, based on PR feedback from @georgemccabe, add the Upper_Left, Upper_Right, Lower_Right, and Lower_Left interpolation methods to the list of valid options for regridding, as already indicated in the MET User's Guide. * Per #2924, update the logic of parse_conf_regrid() to (hopefully) make it work the way @georgemccabe expects it to. It now uses pointers to both the primary and default dictionaries and parses each entry individually. * Per #2924, need to check for non-null pointer before using it * Per #2924, revise the climo_name dictionary lookup logic when parsing the regrid dictionary. * Per #2924, update logic for handling RegridInfo * Per #2924, remove the default regridding information from the 'Searching' log message to avoid confusion. --------- Co-authored-by: MET Tools Test Account * Feature #2924 parse_config PR 2 (#2975) * Per #2924, remove GenEnsProd config file comment about parsing desc separately from each obs.field entry because the obs dictionary does not exist in the GenEnsProd config file. * Per #2924, update list of needed config entry names * Per #2924, remove const from the parent() member function so that we can perform lookups for the parent. * Per #2924, update the signature for and logic of the utility functions that retrieve the climatology data. Rather than requiring all the climo_mean and climo_stdev dictionary entries to be defined at the same config file context level, parse each one individually. This enables the METplus wrappers to only partially override this dictionary and still rely on the default values provided in MET's default configuration files. * Per #2924, update all calls to the climatology utility functions based on the new function signature. Also update the tools to check the number of climo fields separately for the forecast and observation climos. * Per #2924, update the parsing logic for the climatology regrid dictionary. Use config.fcst.climo_mean.regrid first, config.fcst.regrid second, and config.climo_mean.regrid third. Notably, DO NOT use config.regrid. This is definitely the problem with having regrid specified at mutliple config file context levels. It makes the logic for which to use when very messy. * Per #2924, forgot to add an else to print an error * Per #2924, remove extraneous semicolon * Per #2924, move 'fcst.regrid' into 'fcst.climo_mean.regrid'. Defining the climatology regridding logic inside fcst is problematic because it applies to the forecast data as well and you end up with the verification grid being undefined. So the climo regridding logic must be defined in 'climo_mean.regrid' either within the 'fcst' and 'obs' dictionaries or at the top-level config context. * Per #2924, based on PR feedback from @georgemccabe, add the Upper_Left, Upper_Right, Lower_Right, and Lower_Left interpolation methods to the list of valid options for regridding, as already indicated in the MET User's Guide. * Per #2924, update the logic of parse_conf_regrid() to (hopefully) make it work the way @georgemccabe expects it to. It now uses pointers to both the primary and default dictionaries and parses each entry individually. * Per #2924, need to check for non-null pointer before using it * Per #2924, revise the climo_name dictionary lookup logic when parsing the regrid dictionary. 
* Per #2924, update logic for handling RegridInfo * Per #2924, remove the default regridding information from the 'Searching' log message to avoid confusion. * Per #2924, escape sequences, like \n, cannot be used inside R-string literals. * Per #2924, update the logic of check_climo_n_vx() * Per #2924, revise logic in read_climo_data_plane_array(). Check the number of climo fields provided. If there's 0, just return since no data has been requested. If there's 1, use it regardless of the number of input fields. If there's more than 1, just use the requested i_vx index value. * Per #2924, update Series-Analysis to set both i_fcst and i_obs when looping over the series entries. * Per #2924, no real change. Just whitespace. * Unrelated to #2924, superficial changes to formatting of method_name strings for consistency. * Per #2924, add a new series_analysis test that ERRORS OUT prior to this PR but works after the changes in this PR. --------- Co-authored-by: MET Tools Test Account * Feature 2949 cxx11 doc (#2973) * Per #2949, updating installation instructions * Per #2949, adding missing colon from note directive * Per #2949, third attempt to get the new note to show up * Per #2949, modifying text and format * Per #2949, removing images in favor of code blocks for easier modification * Per #2949, modified wording for clarity * Per #2929, corrected typo * Update installation.rst No changes to content, only whitespace for consistency, mostly removing tabs. * Update docs/Users_Guide/installation.rst Co-authored-by: John Halley Gotway * Per #2949, testing variable replacement, expect failures * Per #2949, reverting to orignal state after testing --------- Co-authored-by: John Halley Gotway * Bugfix #2979 develop MTD Grid (#2981) * Per #2979, remove nc_grid.h/.cc and replace it with calls to the read_netcdf_grid(...) and write_netcdf_proj(...) library utility functions. Note that these changes do compile but I haven't tested whether they actually fix the underlying problem. Also note that nc_utils_local.h/.cc can also likely be replaced with calls to common library functions. * Per #2979, remove references to nc_grid.o from the MTD test code. * Per #2979, insert a newline in unit.py output between the env vars and the command. * Per #2979, insert a newline in unit.py output between the env vars and the command. * Per #2979, the write_netcdf_proj(...) utility function adds the lat and lon dimensions. Update mtd to NOT define those dimensions prior to calling write_netcdf_proj(...). * Per #2979, minor changes to is_eq() calls to fix compiler warning messages * Per #2979, for the develop branch, also replace nc_utils_local.h/.cc with calls to common library code. Also remove commented out code. * Per #2979, delete commented out code and make error/warning message formatting consistent. * Fixes for SonarQube --------- Co-authored-by: MET Tools Test Account * Feature #2880 point2grid qc (#2984) * Per #2880, remove Point2Grid quality_mark_thresh config option and add obs_quality_inc and obs_quality_exc from the config file. * Per #2880, update point2grid docs and reformat whitespace throughout. * Per #2880, remove quality_mark_thresh and add obs_quality_inc and obs_quality_exc. * Unrelated to #2880, fix formatting of this R-string which cannot include any special formatting, such as \n. * Per #2880, not working quite right yet but this is progress * Per #2880, add a write_css(IntArray) utility function. * Per #2880, update NcPointObsData class to read the obs quality values from the input file. 
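To make the new Point2Grid quality-control filtering concrete, here is a minimal, notional configuration fragment; the quality mark values themselves are placeholders (see the obs_quality_inc and obs_quality_exc descriptions in config_options.rst below):

    // Keep only these quality mark values ...
    obs_quality_inc = [ "0", "1", "2" ];

    // ... and exclude none explicitly.
    obs_quality_exc = [];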
* Per #2880, update the log message about the quality control filter options applied. * Per #2880, tweak log messages. * Per #2880, tweak log messages. * Per #2880, add a Point2Grid unit test to demonstrate using the obs_quality_inc and obs_quality_exc options. * Per #2880, declare these get accessor functions as const to avoid SonarQube code smell. * Per #2880, many changes to the vx_nc_obs library and point2grid application to replace dynamically allocated memory with STL vectors to satisfy SonarQube code smells. * Per #2880, fix compilation error. * Per #2880, fix compilation error * Per #2880, revert skip_times back to vector since I wasn't positive that casting (int *) to (bool *) would actually work in the way I expect. This is safer. * Per #2880, rename the -qc command line option as -goes_qc, but still quietly support -qc * Per #2880, based on guidance from @hsoh, reset the var_cell_mapping vector for each loop iteration. * Spelling correction * Feature #2882 seeps qa (#2987) * Update seeps.h Change variable names to reduce ambiguity for interpretation and aid usability. * Update seeps.cc Pull through variable name changes and renaming of functions to aid legibility and clarity. Introduced some additional debug print statements. * Update grid-stat.rst Add documentation about the location of the gridded climatology files for SEEPS and which environment variable to use. * Replace read_seeps_scores() with get_seeps_climo_grid() * Manually merging Rachel's patch-1 changes. * Getting close to getting these seeps changes to compile. But it's failing in pair_data_point.cc * Per #2882, get branch feature_2882_seeps_qa compiling again. Recommend revisiting the volume of SEEPS-related Debug log messages and reducing them once it's fully tested. * Per #2882, need to update the handling of the PPT24_seepsweights_grid.nc file name. Rename as _v12.0.nc for the updated version with the new names so that the existing regression tests and nightly builds for main_v11.1 and develop continue to work. We can remove the _v12.0 once this feature branch is merged into develop but for the time being, we need both versions to exist. * Per #2882, rename the SEEPS columns from S12, S13, S21, S23, S31, S32 to the more descriptive ODFL, ODFH, OLFD, OLFH, OHFD, OHFL names. * Per #2882, update SEEPS details * Per #2882, store and report the weighted mean fcst and mean obs, just like the SEEPS score itself, so that they're handled in a consistent manner. Note, however, that it's hard-coded to NOT write the weighted means/score, only the unweighted ones. * Per #2882, change SEEPS debug log levels and correct the storage of mean_fcst and mean_obs values. * Per #2882, correct SEEPS column name lookups * Per #2882, call is_bad_data() instead of is_eq(..., -9999.0) to get rid of compiler warning message. * Per #2882, add 2 more variations of the is_eq() function with mixed float and double inputs to satisfy pb2nc compiler warnings. * Per #2882, switch from dynamically allocated arrays to std::vector * Per #2882, enhance Stat-Analysis to write the SEEPS line type to an output .stat file. * Per #2882, update the aggregated seeps computation to use better-initialized vectors. * Per #2882, resolve a few more SonarQube code smells. 
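As a hypothetical illustration of the Stat-Analysis enhancement mentioned above (writing the SEEPS line type to an output .stat file), something along these lines should work; the -lookin directory and output file name are placeholders, and support for SEEPS in the aggregate job is assumed from the bullets above:

    stat_analysis -lookin point_stat_out \
                  -job aggregate -line_type SEEPS \
                  -out_stat point_stat_seeps_agg.stat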
* Per #2882, now that this PR is ready to merge, remove the v12.0 version number from the gridded SEEPS climo file name ci-skip-all --------- Co-authored-by: mpm-meto <64001904+mpm-meto@users.noreply.github.com> * Hotfix to the develop branch for unit_grid_stat.xml to use the updated gridded seeps climo file name. * Feature #2887 categorical weights (#2988) * Per #2887, update NumArray::vals() to return a reference to the vector rather than a pointer to doubles. * Per #2887, switch over the whole ContingencyTable class hierarchy from storing integer counts to storing double-precision weights. * Add ContingencyTable::is_integer() member function to check whether the table contains all integers * Per #2887, update parse_stat_line.cc to get it to compile after changing PCT to store thresholds in a std::vector. * Per #2887, update PCTInfo::clear() logic. * Per #2887, update ctc_by_row() logic to create reproducible results with the develop branch. * Per #2887, update logic of define_prob_bins() to add a final >=1.0 threshold if needed. While ==0.1 works fine, I found that ==0.05 did not because the last >=1.0 threshold was missing, likely due to floating point precision issues. This change should fix that problem. * Per #2887, update roc_auc() function to match the develop branch * Per #2887, fix bug in computation of far() * Per #2887, replaced all ==0 integer equality checks with calls to is_eq() instead and fix a couple of equations to snuff out diffs in some CTS statistics. * Per #2887, address some of the 34 SonarQube code smells flagged for this PR. Note that the compute_ci.h/.cc changes are necessary and good since we should be computing CI's using doubles instead of integer counts. * Per #2887, update run_sonarqube.sh to specify the target CXX standard as 11. The hope is that this will limit the findings to only those features available in the C++11 standard. * Per #2887, update to SonarQube version 6.1.0.4477 released on 6/27/2024. * Per #2887, updating build_met_sonarqube.sh to specify --std=c++11 since c++17 is used by default * Per #2887, swap in a much simpler implementation of the ORSS statistic to match the equation listed in the MET User's Guide. * Per #2887, update grid_stat and library code to actually apply the grid_weight_flag settings to the computation of contingency table counts and statistics. * Per #2887, fix the handling of bad data in the ORSS equation. * Per #2887, add Npairs member to the ContingencyTable class, eliminate the n() accessor function, and carefully replace references to n() with n_pairs() for the integer number of matched pairs or total() with the double-precision sum of the weights. * Per #2887, reset Npairs = 0 for ContingencyTable::zero_out() * Per #2883, need to call set_n_pairs() in a few spots to set the ECLV TOTAL column correctly ci-run-unit * Per #2887, call set_n_pairs() when aggregating PCT data in Series-Analysis ci-run-unit * Per #2887, update stat_analysis to parse the TOTAL column for the PCT and MCTC line types. * Per #2882, call set_n_pairs() after set_size() ci-run-unit * Per #2887, reconfigure existing Ensemble-Stat unit test to request probabilistic output to see that it's impacted by the grid_weight_flag setting. * Per #2887, update Ensemble-Stat test to provide climo stdev data * Per #2887, add grid_weight_flag to the list of config options for Grid-Stat and Ensemble-Stat. * Per #2887, disable FHO output if grid_weight_flag != NONE. 
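To tie the grid weighting changes above together, a minimal, notional Grid-Stat configuration fragment is shown below; it assumes the usual lowercase output_flag entry names and requests CTC/CTS output in place of FHO, consistent with the FHO warning described above:

    grid_weight_flag = COS_LAT;

    output_flag = {
       fho = NONE;   // FHO is not compatible with grid weighting
       ctc = STAT;
       cts = STAT;
    }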
* Per #2887, revise the existing unit_grid_weight.xml unit tests for Grid-Stat to write CTC/CTS/MCTC/MCTS output and for the DESC column to be populated to indicate the type of grid weighting that was applied. * Per #2887, relatively small changes to drive down SonarQube code smells. Also, switch from total() to n_pairs() when computing confidence intervals. * Per #2887, more SonarQube tweaks * Per #2887, more SonarQube tweaks. * Per #2887, more SonarQube tweaks. * Per #2887, whitespace only changes. * Per #2287, fix path the seeps climo grid. * Per #2887, update the grid_weight_flag documentation. * Per #2887, tweak the wording. --------- Co-authored-by: Howard Soh Co-authored-by: John Halley Gotway Co-authored-by: Howard Soh Co-authored-by: MET Tools Test Account Co-authored-by: davidalbo Co-authored-by: j-opatz Co-authored-by: Daniel Adriaansen Co-authored-by: Julie Prestopnik Co-authored-by: George McCabe <23407799+georgemccabe@users.noreply.github.com> Co-authored-by: natalieb-noaa <146213121+natalieb-noaa@users.noreply.github.com> Co-authored-by: Natalie babij Co-authored-by: Natalie babij Co-authored-by: Natalie babij Co-authored-by: Natalie Babij Co-authored-by: Julie Prestopnik Co-authored-by: lisagoodrich <33230218+lisagoodrich@users.noreply.github.com> Co-authored-by: mpm-meto <64001904+mpm-meto@users.noreply.github.com> Co-authored-by: metplus-bot <97135045+metplus-bot@users.noreply.github.com> --- docs/Users_Guide/appendixF.rst | 2 +- docs/Users_Guide/config_options.rst | 590 ++++---- docs/Users_Guide/ensemble-stat.rst | 41 +- docs/Users_Guide/grid-stat.rst | 51 +- .../config/EnsembleStatConfig_grid_weight | 19 +- .../config/GridStatConfig_grid_weight | 14 +- internal/test_unit/t | 114 ++ internal/test_unit/unit_test.log | 1225 +++++++++++++++++ internal/test_unit/xml/unit_grid_weight.xml | 21 +- src/libcode/vx_stat_out/stat_columns.cc | 180 ++- src/libcode/vx_statistics/compute_ci.cc | 12 +- src/libcode/vx_statistics/compute_ci.h | 6 +- src/libcode/vx_statistics/compute_stats.cc | 12 +- src/libcode/vx_statistics/contable.cc | 51 +- src/libcode/vx_statistics/contable.h | 41 +- src/libcode/vx_statistics/contable_nx2.cc | 73 +- src/libcode/vx_statistics/contable_stats.cc | 71 +- src/libcode/vx_statistics/met_stats.cc | 91 +- src/libcode/vx_statistics/met_stats.h | 6 +- src/tools/core/ensemble_stat/ensemble_stat.cc | 1 + src/tools/core/grid_stat/grid_stat.cc | 33 +- .../core/grid_stat/grid_stat_conf_info.cc | 13 + src/tools/core/mode/mode_exec.cc | 2 +- src/tools/core/point_stat/point_stat.cc | 12 +- .../core/series_analysis/series_analysis.cc | 12 +- .../core/stat_analysis/aggr_stat_line.cc | 315 ++--- .../core/stat_analysis/parse_stat_line.cc | 88 +- src/tools/core/wavelet_stat/wavelet_stat.cc | 1 + src/tools/tc_utils/tc_stat/tc_stat_job.cc | 13 +- 29 files changed, 2235 insertions(+), 875 deletions(-) create mode 100755 internal/test_unit/t create mode 100644 internal/test_unit/unit_test.log diff --git a/docs/Users_Guide/appendixF.rst b/docs/Users_Guide/appendixF.rst index bc051f5a14..17628c7ee5 100644 --- a/docs/Users_Guide/appendixF.rst +++ b/docs/Users_Guide/appendixF.rst @@ -355,7 +355,7 @@ The first argument for the Plot-Data-Plane tool is the gridded data file to be r 'level': 'Surface', 'units': 'None', 'init': '20050807_000000', 'valid': '20050807_120000', 'lead': '120000', 'accum': '120000' - 'grid': {...} } + 'grid': { ... 
} } DEBUG 1: Creating postscript file: fcst.ps Special Case for Ensemble-Stat, Series-Analysis, and MTD diff --git a/docs/Users_Guide/config_options.rst b/docs/Users_Guide/config_options.rst index c9a2ba01f6..25515fc524 100644 --- a/docs/Users_Guide/config_options.rst +++ b/docs/Users_Guide/config_options.rst @@ -21,42 +21,42 @@ which are dictionaries themselves. The configuration file language supports the following data types: * Dictionary: - + * Grouping of one or more entries enclosed by curly braces {}. * Array: - + * List of one or more entries enclosed by square braces []. - + * Array elements are separated by commas. * String: - + * A character string enclosed by double quotation marks "". - + * Integer: - + * A numeric integer value. - + * Float: - + * A numeric float value. - + * Boolean: - + * A boolean value (TRUE or FALSE). - + * Threshold: - + * A threshold type (<, <=, ==, !-, >=, or >) followed by a numeric value. - + * The threshold type may also be specified using two letter abbreviations (lt, le, eq, ne, ge, gt). - + * Multiple thresholds may be combined by specifying the logic type of AND (&&) or OR (||). For example, ">=5&&<=10" defines the numbers between 5 and 10 and "==1||==2" defines numbers exactly equal to 1 or 2. - + * Percentile Thresholds: * A threshold type (<, <=, ==, !=, >=, or >), followed by a percentile @@ -65,34 +65,34 @@ The configuration file language supports the following data types: * Note that the two letter threshold type abbreviations (lt, le, eq, ne, ge, gt) are not supported for percentile thresholds. - + * Thresholds may be defined as percentiles of the data being processed in several places: - + * In Point-Stat and Grid-Stat when setting "cat_thresh", "wind_thresh" and "cnt_thresh". - + * In Wavelet-Stat when setting "cat_thresh". - + * In MODE when setting "conv_thresh" and "merge_thresh". - + * In Ensemble-Stat when setting "obs_thresh". - + * When using the "censor_thresh" config option. - + * In the Stat-Analysis "-out_fcst_thresh" and "-out_obs_thresh" job command options. - + * In the Gen-Vx-Mask "-thresh" command line option. - + * The following percentile threshold types are supported: - + * SFP for a percentile of the sample forecast values. e.g. ">SFP33.3" means greater than the 33.3-rd forecast percentile. - + * SOP for a percentile of the sample observation values. e.g. ">SOP75" means greater than the 75-th observation percentile. - + * SFCP for a percentile of the sample forecast climatology values. e.g. ">SFCP90" means greater than the 90-th forecast climatology percentile. @@ -101,11 +101,11 @@ The configuration file language supports the following data types: e.g. ">SOCP90" means greater than the 90-th observation climatology percentile. For backward compatibility, the "SCP" threshold type is processed the same as "SOCP". - + * USP for a user-specified percentile threshold. e.g. "5.0 threshold to the observations and then chooses a forecast threshold which results in a frequency bias of 1. The frequency bias can be any float value > 0.0. - + * FCDP for forecast climatological distribution percentile thresholds. These thresholds require that the forecast climatological mean and standard deviation be defined using the "climo_mean" and "climo_stdev" @@ -130,7 +130,7 @@ The configuration file language supports the following data types: However these thresholds are defined using the observation climatological mean and standard deviation rather than the forecast climatological data. 
For backward compatibility, the "CDP" threshold type is processed the - same as "OCDP". + same as "OCDP". * When percentile thresholds of type SFP, SOP, SFCP, SOCP, FCDP, or OCDP are requested for continuous filtering thresholds (cnt_thresh), wind speed @@ -140,13 +140,13 @@ The configuration file language supports the following data types: bins which span the values from 0 to 100. For example, ==OCDP25 is automatically expanded to 4 percentile bins: >=OCDP0&&=OCDP25&&=OCDP50&&=OCDP75&&<=OCDP100 - + * When sample percentile thresholds of type SFP, SOP, SFCP, SOCP, or FBIAS are requested, MET recomputes the actual percentile that the threshold represents. If the requested percentile and actual percentile differ by more than 5%, a warning message is printed. This may occur when the sample size is small or the data values are not truly continuous. - + * When percentile thresholds of type SFP, SOP, SFCP, SOCP, or USP are used, the actual threshold value is appended to the FCST_THRESH and OBS_THRESH output columns. For example, if the 90-th percentile of the current set @@ -167,20 +167,20 @@ The configuration file language supports the following data types: Users are encouraged to replace the deprecated SCP and CDP threshold types with the updated SOCP and OCDP types, respectively. - + * Piecewise-Linear Function (currently used only by MODE): - + * A list of (x, y) points enclosed in parenthesis (). - + * The (x, y) points are *NOT* separated by commas. - + * User-defined function of a single variable: - + * Left side is a function name followed by variable name in parenthesis. - + * Right side is an equation which includes basic math functions (+,-,*,/), built-in functions (listed below), or other user-defined functions. - + * Built-in functions include: sin, cos, tan, sind, cosd, tand, asin, acos, atan, asind, acosd, atand, atan2, atan2d, arg, argd, log, exp, log10, exp10, sqrt, abs, min, max, @@ -401,7 +401,7 @@ References: | `Office Note 388 GRIB1 `_ | `A Guide to the Code Form FM 92-IX Ext. GRIB Edition 1 `_ -| +| GRIB2 table files begin with "grib2" prefix and end with a ".txt" suffix. The first line of the file must contain GRIB2. @@ -418,7 +418,7 @@ The following lines consist of 8 integers followed by 3 strings. | Column 9: variable name | Column 10: variable description | Column 11: units -| +| References: @@ -502,7 +502,7 @@ parallelization: * :code:`grid_ens_prod` * :code:`mode` -**Thread Binding** +**Thread Binding** It is normally beneficial to bind threads to particular cores, sometimes called *affinitization*. There are a few reasons for this, but at the very least it @@ -618,7 +618,7 @@ writing of NetCDF files within MET significantly. output_precision ---------------- - + The "output_precision" entry in ConfigConstants defines the precision (number of significant decimal places) to be written to the ASCII output files. Setting this option in the config file of one of the tools will @@ -632,7 +632,7 @@ override the default value set in ConfigConstants. tmp_dir ------- - + The "tmp_dir" entry in ConfigConstants defines the directory for the temporary files. The directory must exist and be writable. The environment variable MET_TMP_DIR overrides the default value at the configuration file. @@ -669,7 +669,7 @@ used. message_type_map ---------------- - + The "message_type_map" entry is an array of dictionaries, each containing a "key" string and "val" string. This defines a mapping of input strings to output message types. 
This mapping is applied in ASCII2NC when @@ -693,7 +693,7 @@ types. model ----- - + The "model" entry specifies a name for the model being verified. This name is written to the MODEL column of the ASCII output generated. If you're verifying multiple models, you should choose descriptive model names (no @@ -706,7 +706,7 @@ e.g. model = "GFS"; desc ---- - + The "desc" entry specifies a user-specified description for each verification task. This string is written to the DESC column of the ASCII output generated. It may be set separately in each "obs.field" verification task @@ -736,10 +736,10 @@ the configuration file obtype value is written. obtype = "ANALYS"; .. _regrid: - + regrid ------ - + The "regrid" entry is a dictionary containing information about how to handle input gridded data files. The "regrid" entry specifies regridding logic using the following entries: @@ -747,17 +747,17 @@ using the following entries: * The "to_grid" entry may be set to NONE, FCST, OBS, a named grid, the path to a gridded data file defining the grid, or an explicit grid specification string. - + * to_grid = NONE; To disable regridding. - + * to_grid = FCST; To regrid observations to the forecast grid. - + * to_grid = OBS; To regrid forecasts to the observation grid. - + * to_grid = "G218"; To regrid both to a named grid. - + * to_grid = "path"; To regrid both to a grid defined by a file. - + * to_grid = "spec"; To define a grid specification string, as described in :ref:`appendixB`. @@ -768,29 +768,29 @@ using the following entries: write bad data for the current point. * The "method" entry defines the regridding method to be used. - + * Valid regridding methods: - + * MIN for the minimum value - + * MAX for the maximum value - + * MEDIAN for the median value - + * UW_MEAN for the unweighted average value - + * DW_MEAN for the distance-weighted average value (weight = distance^-2) - + * AW_MEAN for an area-weighted mean when regridding from high to low resolution grids (width = 1) - + * LS_FIT for a least-squares fit - + * BILIN for bilinear interpolation (width = 2) - + * NEAREST for the nearest grid point (width = 1) - + * BUDGET for the mass-conserving budget interpolation * The budget interpolation method is often used for precipitation @@ -806,15 +806,15 @@ using the following entries: * FORCE to compare gridded data directly with no interpolation as long as the grid x and y dimensions match. - + * UPPER_LEFT for the upper left grid point (width = 1) - + * UPPER_RIGHT for the upper right grid point (width = 1) - + * LOWER_RIGHT for the lower right grid point (width = 1) - + * LOWER_LEFT for the lower left grid point (width = 1) - + * MAXGAUSS to compute the maximum value in the neighborhood and apply a Gaussian smoother to the result @@ -842,7 +842,7 @@ using the following entries: regridding step. The conversion operation is applied first, followed by the censoring operation. Note that these operations are limited in scope. They are only applied if defined within the regrid dictionary itself. - Settings defined at higher levels of config file context are not applied. + Settings defined at higher levels of config file context are not applied. .. code-block:: none @@ -861,7 +861,7 @@ using the following entries: fcst ---- - + The "fcst" entry is a dictionary containing information about the field(s) to be verified. This dictionary may include the following entries: @@ -1046,7 +1046,7 @@ to be verified. This dictionary may include the following entries: the analysis. 
For example, the following settings exclude matched pairs where the observation value differs from the forecast or climatological mean values by more than 10: - + .. code-block:: none mpr_column = [ "ABS(OBS-FCST)", "ABS(OBS-CLIMO_MEAN)" ]; @@ -1144,70 +1144,70 @@ File-format specific settings for the "field" entry: * `GRIB1 Product Definition Section `_ * `GRIB2 Product Definition Section `_ - + * The "level" entry specifies a level type and value: - + * ANNN for accumulation interval NNN - + * ZNNN for vertical level NNN - + * ZNNN-NNN for a range of vertical levels - + * PNNN for pressure level NNN in hPa - + * PNNN-NNN for a range of pressure levels in hPa - + * LNNN for a generic level type - + * RNNN for a specific GRIB record number - + * The "GRIB_lvl_typ" entry is an integer specifying the level type. - + * The "GRIB_lvl_val1" and "GRIB_lvl_val2" entries are floats specifying the first and second level values. - + * The "GRIB_ens" entry is a string specifying NCEP's usage of the extended PDS for ensembles. Set to "hi_res_ctl", "low_res_ctl", "+n", or "-n", for the n-th ensemble member. - + * The GRIB1_ptv entry is an integer specifying the GRIB1 parameter table version number. - + * The GRIB1_code entry is an integer specifying the GRIB1 code (wgrib kpds5 value). - + * The GRIB1_center is an integer specifying the originating center. - + * The GRIB1_subcenter is an integer specifying the originating subcenter. - + * The GRIB1_tri is an integer specifying the time range indicator. - + * The GRIB2_mtab is an integer specifying the master table number. - + * The GRIB2_ltab is an integer specifying the local table number. - + * The GRIB2_disc is an integer specifying the GRIB2 discipline code. - + * The GRIB2_parm_cat is an integer specifying the parameter category code. - + * The GRIB2_parm is an integer specifying the parameter code. - + * The GRIB2_pdt is an integer specifying the product definition template (Table 4.0). - + * The GRIB2_process is an integer specifying the generating process (Table 4.3). - + * The GRIB2_cntr is an integer specifying the originating center. - + * The GRIB2_ens_type is an integer specifying the ensemble type (Table 4.6). - + * The GRIB2_der_type is an integer specifying the derived product type (Table 4.7). - + * The GRIB2_stat_type is an integer specifying the statistical processing type (Table 4.10). @@ -1234,13 +1234,13 @@ File-format specific settings for the "field" entry: template values are 1 and 2, respectively: GRIB2_ipdtmpl_index=[8, 26]; GRIB2_ipdtmpl_val=[1, 2]; - + * NetCDF (from MET tools, CF-compliant, p_interp, and wrf_interp): - + * The "name" entry specifies the NetCDF variable name. - + * The "level" entry specifies the dimensions to be used: - + * (i,...,j,*,*) for a single field, where i,...,j specifies fixed dimension values and *,* specifies the two dimensions for the gridded field. @ specifies the vertical level value or time value @@ -1271,10 +1271,10 @@ File-format specific settings for the "field" entry: ]; * Python (using PYTHON_NUMPY or PYTHON_XARRAY): - + * The Python interface for MET is described in Appendix F of the MET User's Guide. - + * Two methods for specifying the Python command and input file name are supported. For tools which read a single gridded forecast and/or observation file, both options work. However, only the second option @@ -1282,13 +1282,13 @@ File-format specific settings for the "field" entry: as Ensemble-Stat, Series-Analysis, and MTD. 
Option 1: - + * On the command line, replace the path to the input gridded data file with the constant string PYTHON_NUMPY or PYTHON_XARRAY. - + * Specify the configuration "name" entry as the Python command to be executed to read the data. - + * The "level" entry is not required for Python. For example: @@ -1303,14 +1303,14 @@ File-format specific settings for the "field" entry: * On the command line, leave the path to the input gridded data as is. - + * Set the configuration "file_type" entry to the constant PYTHON_NUMPY or PYTHON_XARRAY. - + * Specify the configuration "name" entry as the Python command to be executed to read the data, but replace the input gridded data file with the constant MET_PYTHON_INPUT_ARG. - + * The "level" entry is not required for Python. For example: @@ -1337,7 +1337,7 @@ File-format specific settings for the "field" entry: init_time = "20120619_12"; valid_time = "20120620_00"; lead_time = "12"; - + field = [ { name = "APCP"; @@ -1453,16 +1453,16 @@ or that filtering by station ID may also be accomplished using the "mask.sid" option. However, when using the "sid_inc" option, statistics are reported separately for each masking region. - + * The "sid_exc" entry is an array of station ID groups indicating which station ID's should be excluded from the verification task. - + * Each element in the "sid_inc" and "sid_exc" arrays is either the name of a single station ID or the full path to a station ID group file name. A station ID group file consists of a name for the group followed by a list of station ID's. All of the station ID's indicated will be concatenated into one long list of station ID's to be included or excluded. - + * As with "message_type" above, the "sid_inc" and "sid_exc" settings can be placed in the in the "field" array element to control which station ID's are included or excluded for each verification task. @@ -1473,7 +1473,7 @@ or climo_mean ---------- - + The "climo_mean" dictionary specifies climatology mean data to be read by the Grid-Stat, Point-Stat, Ensemble-Stat, and Series-Analysis tools. It can be set inside the "fcst" and "obs" dictionaries to specify separate forecast and @@ -1496,7 +1496,7 @@ the climatology file names and fields to be used. * The "time_interp_method" entry specifies how the climatology data should be interpolated in time to the forecast valid time: - + * NEAREST for data closest in time * UW_MEAN for average of data before and after * DW_MEAN for linear interpolation in time of data before and after @@ -1519,16 +1519,16 @@ the climatology file names and fields to be used. .. code-block:: none climo_mean = { - + file_name = [ "/path/to/climatological/mean/files" ]; field = []; - + regrid = { method = NEAREST; width = 1; vld_thresh = 0.5; } - + time_interp_method = DW_MEAN; day_interval = 31; hour_interval = 6; @@ -1536,7 +1536,7 @@ the climatology file names and fields to be used. climo_stdev ----------- - + The "climo_stdev" dictionary specifies climatology standard deviation data to be read by the Grid-Stat, Point-Stat, Ensemble-Stat, and Series-Analysis tools. It can be set inside the "fcst" and "obs" dictionaries to specify @@ -1591,7 +1591,7 @@ dictionaries, as shown below. 
climo_cdf --------- - + The "climo_cdf" dictionary specifies how the the observation climatological mean ("climo_mean") and standard deviation ("climo_stdev") data are used to evaluate model performance relative to where the observation value falls @@ -1723,11 +1723,11 @@ The "mask_missing_flag" entry specifies how missing data should be handled in the Wavelet-Stat and MODE tools: * NONE to perform no masking of missing data - + * FCST to mask the forecast field with missing observation data - + * OBS to mask the observation field with missing forecast data - + * BOTH to mask both fields with missing data from the other .. code-block:: none @@ -1770,7 +1770,7 @@ in the following ways: three digit grid number. Supplying a value of "FULL" indicates that the verification should be performed over the entire grid on which the data resides. - See: `ON388 - TABLE B, GRID IDENTIFICATION (PDS Octet 7), MASTER LIST OF NCEP STORAGE GRIDS, GRIB Edition 1 (FM92) `_. + See: `ON388 - TABLE B, GRID IDENTIFICATION (PDS Octet 7), MASTER LIST OF NCEP STORAGE GRIDS, GRIB Edition 1 (FM92) `_. The "grid" entry can be the gridded data file defining grid. * The "poly" entry contains a comma-separated list of files that define @@ -1850,7 +1850,7 @@ in the following ways: * The "sid" entry is an array of strings which define groups of observation station ID's over which to compute statistics. Each entry in the array is either a filename of a comma-separated list. - + * For a filename, the strings are whitespace-separated. The first string is the mask "name" and the remaining strings are the station ID's to be used. @@ -1929,10 +1929,10 @@ bootstrap confidence intervals. The interval variable indicates what method should be used for computing bootstrap confidence intervals: * The "interval" entry specifies the confidence interval method: - + * BCA for the BCa (bias-corrected percentile) interval method is highly accurate but computationally intensive. - + * PCTILE uses the percentile method which is somewhat less accurate but more efficient. @@ -1965,7 +1965,7 @@ should be used for computing bootstrap confidence intervals: documentation of the `GNU Scientific Library `_ for a listing of the random number generators available for use. - + * The "seed" entry may be set to a specific value to make the computation of bootstrap confidence intervals fully repeatable. When left empty the random number generator seed is chosen automatically which will lead @@ -1994,11 +1994,11 @@ This dictionary may include the following entries: * The "field" entry specifies to which field(s) the interpolation method should be applied. This does not apply when doing point verification with the Point-Stat or Ensemble-Stat tools: - + * FCST to interpolate/smooth the forecast field. - + * OBS to interpolate/smooth the observation field. - + * BOTH to interpolate/smooth both the forecast and the observation. * The "vld_thresh" entry specifies a number between 0 and 1. 
When @@ -2033,38 +2033,38 @@ This dictionary may include the following entries: * The "method" entry is an array of interpolation procedures to be applied to the points in the box: - + * MIN for the minimum value - + * MAX for the maximum value - + * MEDIAN for the median value - + * UW_MEAN for the unweighted average value - + * DW_MEAN for the distance-weighted average value where weight = distance^-2 * LS_FIT for a least-squares fit - + * BILIN for bilinear interpolation (width = 2) - + * NEAREST for the nearest grid point (width = 1) - + * BEST for the value closest to the observation - + * UPPER_LEFT for the upper left grid point (width = 1) * UPPER_RIGHT for the upper right grid point (width = 1) - + * LOWER_RIGHT for the lower right grid point (width = 1) - + * LOWER_LEFT for the lower left grid point (width = 1) * GAUSSIAN for the Gaussian kernel * MAXGAUSS for the maximum value followed by a Gaussian smoother - + * GEOG_MATCH for the nearest grid point where the land/sea mask and geography criteria are satisfied @@ -2096,7 +2096,7 @@ This dictionary may include the following entries: land_mask --------- - + The "land_mask" dictionary defines the land/sea mask field used when verifying at the surface. The "flag" entry enables/disables this logic. When enabled, the "message_type_group_map" dictionary must contain entries @@ -2124,7 +2124,7 @@ The "land_mask.flag" entry may be set separately in each "obs.field" entry. topo_mask --------- - + The "topo_mask" dictionary defines the model topography field used when verifying at the surface. The flag entry enables/disables this logic. When enabled, the "message_type_group_map" dictionary must contain an entry @@ -2154,7 +2154,7 @@ The "topo_mask.flag" entry may be set separately in each "obs.field" entry. hira ---- - + The "hira" entry is a dictionary that is very similar to the "interp" and "nbrhd" entries. It specifies information for applying the High Resolution Assessment (HiRA) verification logic in Point-Stat. HiRA is analogous to @@ -2207,15 +2207,15 @@ This dictionary may include the following entries: output_flag ----------- - + The "output_flag" entry is a dictionary that specifies what verification methods should be applied to the input data. Options exist for each output line type from the MET tools. Each line type may be set to one of: * NONE to skip the corresponding verification method - + * STAT to write the verification output only to the ".stat" output file - + * BOTH to write to the ".stat" output file as well the optional "_type.txt" file, a more readable ASCII file sorted by line type. @@ -2289,7 +2289,7 @@ netcdf output will be generated. nc_pairs_var_name ----------------- - + The "nc_pairs_var_name" entry specifies a string for each verification task in Grid-Stat. This string is parsed from each "obs.field" dictionary entry and is used to construct variable names for the NetCDF matched pairs output @@ -2302,14 +2302,14 @@ For example: | nc_pairs_var_name = "TMP"; | - + .. code-block:: none nc_pairs_var_name = ""; nc_pairs_var_suffix ------------------- - + The "nc_pairs_var_suffix" entry is similar to the "nc_pairs_var_name" entry described above. It is also parsed from each "obs.field" dictionary entry. However, it defines a suffix to be appended to the output variable name. @@ -2334,7 +2334,7 @@ For example: ps_plot_flag ------------ - + The "ps_plot_flag" entry is a boolean value for Wavelet-Stat and MODE indicating whether a PostScript plot should be generated summarizing the verification. 
@@ -2345,23 +2345,47 @@ the verification. grid_weight_flag ---------------- - + The "grid_weight_flag" specifies how grid weighting should be applied -during the computation of continuous statistics and partial sums. It is -meant to account for grid box area distortion and is often applied to global -Lat/Lon grids. It is only applied for grid-to-grid verification in Grid-Stat -and Ensemble-Stat and is not applied for grid-to-point verification. +during the computation of contingency tables (CTC, MCTC, PCT, and +NBRCTC), partial sums (SL1L2, SAL1L2, VL1L2, and VAL1L2), and statistics +(CNT, CTS, MCTS, PSTD, PRC, PJC, ECLV, NBRCNT, and NBRCTS). +It is meant to account for grid box area distortion and is often applied +to global Lat/Lon grids. It is only applied for grid-to-grid verification +in Grid-Stat and Ensemble-Stat and is not applied for grid-to-point +verification. It can only be defined once at the highest level of config +file context and applies to all verification tasks for that run. + Three grid weighting options are currently supported: -* NONE to disable grid weighting using a constant weight (default). - +* NONE to disable grid weighting using a constant weight of 1.0 (default). + * COS_LAT to define the weight as the cosine of the grid point latitude. This an approximation for grid box area used by NCEP and WMO. - + * AREA to define the weight as the true area of the grid box (km^2). -The weights are ultimately computed as the weight at each grid point divided -by the sum of the weights for the current masking region. +If requested in the config file, the raw grid weights can be written to +the NetCDF output from Grid-Stat and Ensemble-Stat. + +When computing partial sums and continuous statistics, the weights are +first normalized by dividing by the sum of the weights for the current +masking region. When computing contingency tables and deriving statistics, +each contingency table cell contains the sum of the weights of the matching +grid points rather than the integer count of those grid points. Statistics +are derived using these sums of weights rather than the raw counts. + +When no grid weighting is requested (**NONE**), contingency tables are +populated using a default constant weight of 1.0 and the corresponding cells +are written to the output as integer counts for consistency with earlier +versions of MET. + +.. note:: + + The FHO line type is not compatible with grid weighting. If requested + with grid weighting enabled, Grid-Stat prints a warning message and + automatically disables the FHO line type. Users are advised to request the + CTC line type instead. .. code-block:: none @@ -2404,7 +2428,7 @@ The "duplicate_flag" entry specifies how to handle duplicate point observations in Point-Stat and Ensemble-Stat: * NONE to use all point observations (legacy behavior) - + * UNIQUE only use a single observation if two or more observations match. Matching observations are determined if they contain identical latitude, longitude, level, elevation, and time information. @@ -2428,21 +2452,21 @@ in Point-Stat and Ensemble-Stat. 
Eight techniques are currently supported: * NONE to use all point observations (legacy behavior) - + * NEAREST use only the observation that has the valid time closest to the forecast valid time - + * MIN use only the observation that has the lowest value - + * MAX use only the observation that has the highest value - + * UW_MEAN compute an unweighted mean of the observations - + * DW_MEAN compute a weighted mean of the observations based on the time of the observation - + * MEDIAN use the median observation - + * PERC use the Nth percentile observation where N = obs_perc_value The reporting mechanism for this feature can be activated by specifying @@ -2457,14 +2481,14 @@ in those cases. obs_perc_value -------------- - + Percentile value to use when obs_summary = PERC .. code-block:: none obs_perc_value = 50; - + obs_quality_inc --------------- @@ -2480,7 +2504,7 @@ Note "obs_quality_inc" replaces the older option "obs_quality". obs_quality_inc = [ "1", "2", "3", "9" ]; - + obs_quality_exc --------------- @@ -2495,7 +2519,7 @@ an array of strings, even if the values themselves are numeric. obs_quality_exc = [ "1", "2", "3", "9" ]; - + met_data_dir ------------ @@ -2685,7 +2709,7 @@ entries. This dictionary may include the following entries: censor_val = []; ens_thresh = 1.0; vld_thresh = 1.0; - + field = [ { name = "APCP"; @@ -2746,37 +2770,37 @@ combination of the categorical threshold (cat_thresh), neighborhood width ensemble_flag ^^^^^^^^^^^^^ - + The "ensemble_flag" entry is a dictionary of boolean value indicating which ensemble products should be generated: * "latlon" for a grid of the Latitude and Longitude fields * "mean" for the simple ensemble mean - + * "stdev" for the ensemble standard deviation - + * "minus" for the mean minus one standard deviation - + * "plus" for the mean plus one standard deviation - + * "min" for the ensemble minimum - + * "max" for the ensemble maximum - + * "range" for the range of ensemble values - + * "vld_count" for the number of valid ensemble members - + * "frequency" for the ensemble relative frequency meeting a threshold - + * "nep" for the neighborhood ensemble probability - + * "nmep" for the neighborhood maximum ensemble probability - + * "rank" to write the rank for the gridded observation field to separate NetCDF output file. - + * "weight" to write the grid weights specified in grid_weight_flag to the rank NetCDF output file. @@ -2798,7 +2822,7 @@ which ensemble products should be generated: rank = TRUE; weight = FALSE; } - + EnsembleStatConfig_default -------------------------- @@ -2831,7 +2855,7 @@ data is provided, the climo_cdf thresholds will be used instead. ens_ssvar_bin_size = 1; ens_phist_bin_size = 0.05; prob_cat_thresh = []; - + field = [ { name = "APCP"; @@ -2916,7 +2940,7 @@ CHISQUARED distributions are defined by a single parameter. The GAMMA, UNIFORM, and BETA distributions are defined by two parameters. See the `GNU Scientific Library Reference Manual `_ for more information on these distributions. - + The inst_bias_scale and inst_bias_offset entries specify bias scale and offset values that should be applied to observation values prior to @@ -3211,85 +3235,85 @@ MET User's Guide for a description of these attributes. 
// centroid_x_min = 0.0; // centroid_x_max = 0.0; - + // centroid_y_min = 0.0; // centroid_y_max = 0.0; - + // centroid_lat_min = 0.0; // centroid_lat_max = 0.0; - + // centroid_lon_min = 0.0; // centroid_lon_max = 0.0; - + // axis_ang_min = 0.0; // axis_ang_max = 0.0; - + // length_min = 0.0; // length_max = 0.0; - + // width_min = 0.0; // width_max = 0.0; - + // aspect_ratio_min = 0.0; // aspect_ratio_max = 0.0; - + // curvature_min = 0.0; // curvature_max = 0.0; - + // curvature_x_min = 0.0; // curvature_x_max = 0.0; - + // curvature_y_min = 0.0; // curvature_y_max = 0.0; - + // complexity_min = 0.0; // complexity_max = 0.0; - + // intensity_10_min = 0.0; // intensity_10_max = 0.0; - + // intensity_25_min = 0.0; // intensity_25_max = 0.0; // intensity_50_min = 0.0; // intensity_50_max = 0.0; - + // intensity_75_min = 0.0; // intensity_75_max = 0.0; - + // intensity_90_min = 0.0; // intensity_90_max = 0.0; - + // intensity_user_min = 0.0; // intensity_user_max = 0.0; - + // intensity_sum_min = 0.0; // intensity_sum_max = 0.0; - + // centroid_dist_min = 0.0; // centroid_dist_max = 0.0; - + // boundary_dist_min = 0.0; // boundary_dist_max = 0.0; - + // convex_hull_dist_min = 0.0; // convex_hull_dist_max = 0.0; - + // angle_diff_min = 0.0; // angle_diff_max = 0.0; - + // area_ratio_min = 0.0; // area_ratio_max = 0.0; - + // intersection_over_area_min = 0.0; // intersection_over_area_max = 0.0; - + // complexity_ratio_min = 0.0; // complexity_ratio_max = 0.0; - + // percentile_intensity_ratio_min = 0.0; // percentile_intensity_ratio_max = 0.0; - + // interest_min = 0.0; // interest_max = 0.0; @@ -3370,14 +3394,14 @@ The object definition settings for MODE are contained within the "fcst" and merge_thresh = [ >=1.0, >=2.0, >=3.0 ]; * The "merge_flag" entry specifies the merging methods to be applied: - + * NONE for no merging - + * THRESH for the double-threshold merging method. Merge objects that would be part of the same object at the lower threshold. - + * ENGINE for the fuzzy logic approach comparing the field to itself - + * BOTH for both the double-threshold and engine merging methods .. code-block:: none @@ -3387,7 +3411,7 @@ The object definition settings for MODE are contained within the "fcst" and name = "APCP"; level = "A03"; } - + censor_thresh = []; censor_val = []; conv_radius = 60.0/grid_res; in grid squares @@ -3418,13 +3442,13 @@ match_flag The "match_flag" entry specifies the matching method to be applied: * NONE for no matching between forecast and observation objects - + * MERGE_BOTH for matching allowing additional merging in both fields. If two objects in one field match the same object in the other field, those two objects are merged. - + * MERGE_FCST for matching allowing only additional forecast merging - + * NO_MERGE for matching with no additional merging in either field .. code-block:: none @@ -3445,7 +3469,7 @@ skip unreasonable object comparisons. weight ^^^^^^ - + The weight variables control how much weight is assigned to each pairwise attribute when computing a total interest value for object pairs. The weights need not sum to any particular value but must be non-negative. When the @@ -3479,23 +3503,23 @@ mathematical functions. .. 
code-block:: none interest_function = { - + centroid_dist = ( ( 0.0, 1.0 ) ( 60.0/grid_res, 1.0 ) ( 600.0/grid_res, 0.0 ) ); - + boundary_dist = ( ( 0.0, 1.0 ) ( 400.0/grid_res, 0.0 ) ); - + convex_hull_dist = ( ( 0.0, 1.0 ) ( 400.0/grid_res, 0.0 ) ); - + angle_diff = ( ( 0.0, 1.0 ) ( 30.0, 1.0 ) @@ -3508,24 +3532,24 @@ mathematical functions. ( corner, 1.0 ) ( 1.0, 1.0 ) ); - + area_ratio = ratio_if; - + int_area_ratio = ( ( 0.00, 0.00 ) ( 0.10, 0.50 ) ( 0.25, 1.00 ) ( 1.00, 1.00 ) ); - + complexity_ratio = ratio_if; - + inten_perc_ratio = ratio_if; } total_interest_thresh ^^^^^^^^^^^^^^^^^^^^^ - + The total_interest_thresh variable should be set between 0 and 1. This threshold is applied to the total interest values computed for each pair of objects and is used in determining matches. @@ -3574,7 +3598,7 @@ lines in the grid. ct_stats_flag ^^^^^^^^^^^^^ - + The ct_stats_flag can be set to TRUE or FALSE to produce additional output, in the form of contingency table counts and statistics. @@ -3604,16 +3628,16 @@ The PB2NC tool filters out observations from PREPBUFR or BUFR files using the following criteria: (1) by message type: supply a list of PREPBUFR message types to retain - + (2) by station id: supply a list of observation stations to retain - + (3) by valid time: supply the beginning and ending time offset values in the obs_window entry described above. (4) by location: use the "mask" entry described below to supply either an NCEP masking grid, a masking lat/lon polygon or a file to a mask lat/lon polygon - + (5) by elevation: supply min/max elevation values (6) by report type: supply a list of report types to retain using @@ -3621,15 +3645,15 @@ following criteria: (7) by instrument type: supply a list of instrument type to retain - + (8) by vertical level: supply beg/end vertical levels using the level_range entry described below - + (9) by variable type: supply a list of observation variable types to retain using the obs_bufr_var entry described below - + (10) by quality mark: supply a quality mark threshold - + (11) Flag to retain values for all quality marks, or just the first quality mark (highest): use the event_stack_flag described below @@ -3637,24 +3661,24 @@ following criteria: retain. 0 - Surface level (mass reports only) - + 1 - Mandatory level (upper-air profile reports) - + 2 - Significant temperature level (upper-air profile reports) - + 2 - Significant temperature and winds-by-pressure level (future combined mass and wind upper-air reports) - + 3 - Winds-by-pressure level (upper-air profile reports) - + 4 - Winds-by-height level (upper-air profile reports) - + 5 - Tropopause level (upper-air profile reports) - + 6 - Reports on a single level (e.g., aircraft, satellite-wind, surface wind, precipitable water retrievals, etc.) - + 7 - Auxiliary levels generated via interpolation from spanning levels (upper-air profile reports) @@ -3665,14 +3689,14 @@ In the PB2NC tool, the "message_type" entry is an array of message types to be retained. An empty list indicates that all should be retained. 
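As a sketch of how the "message_type" entry is typically combined with the other
filtering criteria described above, consider the following illustrative settings.
The values shown are examples only, not recommended defaults, and the numbers in
the comments refer to the criteria listed above:

.. code-block:: none

   message_type        = [ "ADPUPA", "ADPSFC" ];         // (1) message types to retain
   level_range         = { beg = 1; end = 255; }         // (8) vertical level range
   level_category      = [ 0, 1, 4, 5, 6 ];              // level categories to retain
   obs_bufr_var        = [ "QOB", "TOB", "UOB", "VOB" ]; // (9) observation variables
   quality_mark_thresh = 2;                              // (10) quality mark threshold
   event_stack_flag    = TOP;                            // (11) use the top of the event stack
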
| List of valid message types: -| “ADPUPA”, “AIRCAR”, “AIRCFT”, “ADPSFC”, “ERS1DA”, “GOESND”, “GPSIPW”, -| “MSONET”, “PROFLR”, “QKSWND”, “RASSDA”, “SATEMP”, +| “ADPUPA”, “AIRCAR”, “AIRCFT”, “ADPSFC”, “ERS1DA”, “GOESND”, “GPSIPW”, +| “MSONET”, “PROFLR”, “QKSWND”, “RASSDA”, “SATEMP”, | “SATWND”, “SFCBOG”, “SFCSHP”, “SPSSMI”, “SYNDAT”, “VADWND” For example: | message_type[] = [ "ADPUPA", "AIRCAR" ]; -| +| `Current Table A Entries in PREPBUFR mnemonic table `_ @@ -3782,12 +3806,12 @@ categories should be retained: | 1 = Mandatory level (upper-air profile reports) -| 2 = Significant temperature level (upper-air profile reports) +| 2 = Significant temperature level (upper-air profile reports) | 2 = Significant temperature and winds-by-pressure level (future combined mass -| and wind upper-air reports) +| and wind upper-air reports) -| 3 = Winds-by-pressure level (upper-air profile reports) +| 3 = Winds-by-pressure level (upper-air profile reports) | 4 = Winds-by-height level (upper-air profile reports) @@ -3799,7 +3823,7 @@ categories should be retained: | 7 = Auxiliary levels generated via interpolation from spanning levels | (upper-air profile reports) -| +| An empty list indicates that all should be retained. @@ -3870,7 +3894,7 @@ abbreviations to the output. quality_mark_thresh ^^^^^^^^^^^^^^^^^^^ - + The "quality_mark_thresh" entry specifies the maximum quality mark value to be retained. Observations with a quality mark LESS THAN OR EQUAL TO this threshold will be retained, while observations with a quality mark @@ -3959,12 +3983,12 @@ job to be performed. The format for an analysis job is as follows: | -job job_name | OPTIONAL ARGS -| +| Where "job_name" is set to one of the following: * "filter" - + To filter out the STAT lines matching the job filtering criteria specified below and using the optional arguments below. The output STAT lines are written to the file specified using the @@ -3983,7 +4007,7 @@ Where "job_name" is set to one of the following: | * "summary" - + To compute summary information for a set of statistics. The summary output includes the mean, standard deviation, percentiles (0th, 10th, 25th, 50th, 75th, 90th, and 100th), range, @@ -3993,10 +4017,10 @@ Where "job_name" is set to one of the following: logic: * simple arithmetic mean (default) - + * square root of the mean of the statistic squared (applied to columns listed in "wmo_sqrt_stats") - + * apply fisher transform (applied to columns listed in "wmo_fisher_stats") @@ -4004,9 +4028,9 @@ Where "job_name" is set to one of the following: The columns of data to be summarized are specified in one of two ways: - + * Specify the -line_type option once and specify one or more column names. - + * Format the -column option as LINE_TYPE:COLUMN. | @@ -4020,7 +4044,7 @@ Where "job_name" is set to one of the following: processing them separately. For TCStat, the "-column" argument may be set to: - + * "TRACK" for track, along-track, and cross-track errors. * "WIND" for all wind radius errors. * "TI" for track and maximum wind intensity errors. @@ -4046,7 +4070,7 @@ Where "job_name" is set to one of the following: To summarize multiple columns. * "aggregate" - + To aggregate the STAT data for the STAT line type specified using the "-line_type" argument. The output of the job will be in the same format as the input line type specified. 
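For example, an illustrative job specification for aggregating scalar partial
sums might be:

.. code-block:: none

   -job aggregate -line_type SL1L2 -fcst_var TMP -vx_mask FULL

Here, the "-fcst_var" and "-vx_mask" options are optional filters that limit
which SL1L2 lines are combined into the aggregated output; they are shown only
as an example of the filtering arguments described earlier.
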
The following line @@ -4058,13 +4082,13 @@ Where "job_name" is set to one of the following: SL1L2, SAL1L2, VL1L2, VAL1L2, PCT, NBRCNT, NBRCTC, GRAD, ISC, ECNT, RPS, RHIST, PHIST, RELP, SSVAR - + Required Args: -line_type | * "aggregate_stat" - + To aggregate the STAT data for the STAT line type specified using the "-line_type" argument. The output of the job will be the line type specified using the "-out_line_type" argument. The valid @@ -4156,11 +4180,11 @@ Where "job_name" is set to one of the following: Optionally, specify other filters for each term, -fcst_thresh. * "go_index" - + The GO Index is a special case of the skill score index consisting of a predefined set of variables, levels, lead times, statistics, and weights. - + For lead times of 12, 24, 36, and 48 hours, it contains RMSE for: .. code-block:: none @@ -4178,7 +4202,7 @@ Where "job_name" is set to one of the following: | * "ramp" - + The ramp job operates on a time-series of forecast and observed values and is analogous to the RIRW (Rapid Intensification and Weakening) job supported by the tc_stat tool. The amount of change @@ -4486,17 +4510,17 @@ wavelet decomposition should be performed: See: `Discrete Wavelet Transforms (DWT) initialization `_ * Valid combinations of the two are listed below: - + * HAAR for Haar wavelet (member = 2) - + * HAAR_CNTR for Centered-Haar wavelet (member = 2) * DAUB for Daubechies wavelet (member = 4, 6, 8, 10, 12, 14, 16, 18, 20) - + * DAUB_CNTR for Centered-Daubechies wavelet (member = 4, 6, 8, 10, 12, 14, 16, 18, 20) - + * BSPLINE for Bspline wavelet (member = 103, 105, 202, 204, 206, 208, 301, 303, 305, 307, 309) diff --git a/docs/Users_Guide/ensemble-stat.rst b/docs/Users_Guide/ensemble-stat.rst index 73e0b799be..8a2502c525 100644 --- a/docs/Users_Guide/ensemble-stat.rst +++ b/docs/Users_Guide/ensemble-stat.rst @@ -160,29 +160,30 @@ ____________________ .. code-block:: none - model = "FCST"; - desc = "NA"; - obtype = "ANALYS"; - regrid = { ... } - climo_mean = { ... } - climo_stdev = { ... } - climo_cdf = { ... } - obs_window = { beg = -5400; end = 5400; } - mask = { grid = [ "FULL" ]; poly = []; sid = []; } - ci_alpha = [ 0.05 ]; - interp = { field = BOTH; vld_thresh = 1.0; shape = SQUARE; - type = [ { method = NEAREST; width = 1; } ]; } - eclv_points = []; - sid_inc = []; - sid_exc = []; - duplicate_flag = NONE; + model = "FCST"; + desc = "NA"; + obtype = "ANALYS"; + regrid = { ... } + climo_mean = { ... } + climo_stdev = { ... } + climo_cdf = { ... } + obs_window = { beg = -5400; end = 5400; } + mask = { grid = [ "FULL" ]; poly = []; sid = []; } + ci_alpha = [ 0.05 ]; + interp = { field = BOTH; vld_thresh = 1.0; shape = SQUARE; + type = [ { method = NEAREST; width = 1; } ]; } + eclv_points = []; + sid_inc = []; + sid_exc = []; + duplicate_flag = NONE; obs_quality_inc = []; obs_quality_exc = []; - obs_summary = NONE; - obs_perc_value = 50; + obs_summary = NONE; + obs_perc_value = 50; message_type_group_map = [...]; - output_prefix = ""; - version = "VN.N"; + grid_weight_flag = NONE; + output_prefix = ""; + version = "VN.N"; The configuration options listed above are common to many MET tools and are described in :numref:`config_options`. diff --git a/docs/Users_Guide/grid-stat.rst b/docs/Users_Guide/grid-stat.rst index b10b1b3431..631afbdaf2 100644 --- a/docs/Users_Guide/grid-stat.rst +++ b/docs/Users_Guide/grid-stat.rst @@ -241,31 +241,32 @@ __________________________ .. code-block:: none - model = "FCST"; - desc = "NA"; - obtype = "ANALYS"; - fcst = { ... } - obs = { ... 
} - regrid = { ... } - climo_mean = { ... } - climo_stdev = { ... } - climo_cdf = { ... } - mask = { grid = [ "FULL" ]; poly = []; } - ci_alpha = [ 0.05 ]; - boot = { interval = PCTILE; rep_prop = 1.0; n_rep = 1000; - rng = "mt19937"; seed = ""; } - interp = { field = BOTH; vld_thresh = 1.0; shape = SQUARE; - type = [ { method = NEAREST; width = 1; } ]; } - censor_thresh = []; - censor_val = []; - mpr_column = []; - mpr_thresh = []; - eclv_points = 0.05; - hss_ec_value = NA; - rank_corr_flag = TRUE; - tmp_dir = "/tmp"; - output_prefix = ""; - version = "VN.N"; + model = "FCST"; + desc = "NA"; + obtype = "ANALYS"; + fcst = { ... } + obs = { ... } + regrid = { ... } + climo_mean = { ... } + climo_stdev = { ... } + climo_cdf = { ... } + mask = { grid = [ "FULL" ]; poly = []; } + ci_alpha = [ 0.05 ]; + boot = { interval = PCTILE; rep_prop = 1.0; n_rep = 1000; + rng = "mt19937"; seed = ""; } + interp = { field = BOTH; vld_thresh = 1.0; shape = SQUARE; + type = [ { method = NEAREST; width = 1; } ]; } + censor_thresh = []; + censor_val = []; + mpr_column = []; + mpr_thresh = []; + eclv_points = 0.05; + hss_ec_value = NA; + rank_corr_flag = TRUE; + grid_weight_flag = NONE; + tmp_dir = "/tmp"; + output_prefix = ""; + version = "VN.N"; The configuration options listed above are common to multiple MET tools and are described in :numref:`config_options`. diff --git a/internal/test_unit/config/EnsembleStatConfig_grid_weight b/internal/test_unit/config/EnsembleStatConfig_grid_weight index 9915c3fa37..12994a3a5b 100644 --- a/internal/test_unit/config/EnsembleStatConfig_grid_weight +++ b/internal/test_unit/config/EnsembleStatConfig_grid_weight @@ -15,7 +15,7 @@ model = "FCST"; // Output description to be written // May be set separately in each "obs.field" entry // -desc = "NA"; +desc = "${DESC}"; // // Output observation type to be written @@ -62,7 +62,7 @@ prob_pct_thresh = [ ==0.25 ]; nc_var_str = ""; eclv_points = 0.05; -tmp_field = [ { name = "TMP"; level = [ "Z2" ]; } ]; +tmp_field = [ { name = "TMP"; level = [ "Z2" ]; prob_cat_thresh = [ <=273, >273 ]; } ]; // // Forecast and observation fields to be verified @@ -139,6 +139,11 @@ climo_mean = { hour_interval = 6; } +climo_stdev = climo_mean; +climo_stdev = { + file_name = [ "${CLIMO_STDEV_FILE}" ]; +} + //////////////////////////////////////////////////////////////////////////////// // @@ -200,11 +205,11 @@ output_flag = { orank = NONE; ssvar = STAT; relp = STAT; - pct = NONE; - pstd = NONE; - pjc = NONE; - prc = NONE; - eclv = NONE; + pct = STAT; + pstd = STAT; + pjc = STAT; + prc = STAT; + eclv = STAT; } //////////////////////////////////////////////////////////////////////////////// diff --git a/internal/test_unit/config/GridStatConfig_grid_weight b/internal/test_unit/config/GridStatConfig_grid_weight index 1efce2f152..27b266dfa8 100644 --- a/internal/test_unit/config/GridStatConfig_grid_weight +++ b/internal/test_unit/config/GridStatConfig_grid_weight @@ -15,7 +15,7 @@ model = "GFS"; // Output description to be written // May be set separately in each "obs.field" entry // -desc = "NA"; +desc = "${DESC}"; // // Output observation type to be written @@ -54,7 +54,7 @@ nc_pairs_var_suffix = ""; hss_ec_value = NA; rank_corr_flag = FALSE; -tmp_field = [ { name = "TMP"; level = [ "P500" ]; } ]; +tmp_field = [ { name = "TMP"; level = [ "P500" ]; cat_thresh = [ >245, >255 ]; } ]; // // Forecast and observation fields to be verified @@ -179,11 +179,11 @@ distance_map = { // Statistical output types // output_flag = { - fho = NONE; - ctc = NONE; - cts = 
NONE; - mctc = NONE; - mcts = NONE; + fho = NONE; + ctc = STAT; + cts = STAT; + mctc = STAT; + mcts = STAT; cnt = STAT; sl1l2 = STAT; sal1l2 = STAT; diff --git a/internal/test_unit/t b/internal/test_unit/t new file mode 100755 index 0000000000..8df021c329 --- /dev/null +++ b/internal/test_unit/t @@ -0,0 +1,114 @@ +export 'CLIMO_MEAN_FILE=${MET_TEST_INPUT}/climatology_data/NCEP_1.0deg/cmean_1d.19790410' +export 'DESC=NO_WEIGHT' +export 'GRID_WEIGHT=NONE' +export 'OUTPUT_PREFIX=NO_WEIGHT' +/d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/../../share/met/../../bin/grid_stat \ + /d1/projects/MET/MET_test_data/unit_test/model_data/grib2/gfs/gfs_2012040900_F024.grib2 \ + /d1/projects/MET/MET_test_data/unit_test/model_data/grib2/gfsanl/gfsanl_4_20120410_0000_000.grb2 \ + /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/config/GridStatConfig_grid_weight \ + -outdir /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/../../test_output/grid_weight -v 1 +unset CLIMO_MEAN_FILE +unset DESC +unset GRID_WEIGHT +unset OUTPUT_PREFIX + + +export 'CLIMO_MEAN_FILE=${MET_TEST_INPUT}/climatology_data/NCEP_1.0deg/cmean_1d.19790410' +export 'DESC=COS_LAT_WEIGHT' +export 'GRID_WEIGHT=COS_LAT' +export 'OUTPUT_PREFIX=COS_LAT_WEIGHT' +/d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/../../share/met/../../bin/grid_stat \ + /d1/projects/MET/MET_test_data/unit_test/model_data/grib2/gfs/gfs_2012040900_F024.grib2 \ + /d1/projects/MET/MET_test_data/unit_test/model_data/grib2/gfsanl/gfsanl_4_20120410_0000_000.grb2 \ + /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/config/GridStatConfig_grid_weight \ + -outdir /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/../../test_output/grid_weight -v 1 +unset CLIMO_MEAN_FILE +unset DESC +unset GRID_WEIGHT +unset OUTPUT_PREFIX + + +export 'CLIMO_MEAN_FILE=${MET_TEST_INPUT}/climatology_data/NCEP_1.0deg/cmean_1d.19790410' +export 'DESC=AREA_WEIGHT' +export 'GRID_WEIGHT=AREA' +export 'OUTPUT_PREFIX=AREA_WEIGHT' +/d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/../../share/met/../../bin/grid_stat \ + /d1/projects/MET/MET_test_data/unit_test/model_data/grib2/gfs/gfs_2012040900_F024.grib2 \ + /d1/projects/MET/MET_test_data/unit_test/model_data/grib2/gfsanl/gfsanl_4_20120410_0000_000.grb2 \ + /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/config/GridStatConfig_grid_weight \ + -outdir /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/../../test_output/grid_weight -v 1 +unset CLIMO_MEAN_FILE +unset DESC +unset GRID_WEIGHT +unset OUTPUT_PREFIX + + +export 'CLIMO_MEAN_FILE=${MET_TEST_INPUT}/climatology_data/NCEP_1.0deg/cmean_1d.19790410' +export 'CLIMO_STDEV_FILE=${MET_TEST_INPUT}/climatology_data/NCEP_1.0deg/cstdv_1d.19790410' +export 'DESC=NO_WEIGHT' +export 'GRID_WEIGHT=NONE' +export 'OUTPUT_PREFIX=NO_WEIGHT' +/d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/../../share/met/../../bin/ensemble_stat \ + 6 \ + /d1/projects/MET/MET_test_data/unit_test/model_data/grib1/arw-fer-gep1/arw-fer-gep1_2012040912_F024.grib \ + /d1/projects/MET/MET_test_data/unit_test/model_data/grib1/arw-fer-gep5/arw-fer-gep5_2012040912_F024.grib \ + 
/d1/projects/MET/MET_test_data/unit_test/model_data/grib1/arw-sch-gep2/arw-sch-gep2_2012040912_F024.grib \ + /d1/projects/MET/MET_test_data/unit_test/model_data/grib1/arw-sch-gep6/arw-sch-gep6_2012040912_F024.grib \ + /d1/projects/MET/MET_test_data/unit_test/model_data/grib1/arw-tom-gep3/arw-tom-gep3_2012040912_F024.grib \ + /d1/projects/MET/MET_test_data/unit_test/model_data/grib1/arw-tom-gep7/arw-tom-gep7_2012040912_F024.grib \ + /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/config/EnsembleStatConfig_grid_weight \ + -grid_obs /d1/projects/MET/MET_test_data/unit_test/obs_data/laps/laps_2012041012_F000.grib \ + -outdir /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/../../test_output/grid_weight -v 1 +unset CLIMO_MEAN_FILE +unset CLIMO_STDEV_FILE +unset DESC +unset GRID_WEIGHT +unset OUTPUT_PREFIX + + +export 'CLIMO_MEAN_FILE=${MET_TEST_INPUT}/climatology_data/NCEP_1.0deg/cmean_1d.19790410' +export 'CLIMO_STDEV_FILE=${MET_TEST_INPUT}/climatology_data/NCEP_1.0deg/cstdv_1d.19790410' +export 'DESC=COS_LAT_WEIGHT' +export 'GRID_WEIGHT=COS_LAT' +export 'OUTPUT_PREFIX=COS_LAT_WEIGHT' +/d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/../../share/met/../../bin/ensemble_stat \ + 6 \ + /d1/projects/MET/MET_test_data/unit_test/model_data/grib1/arw-fer-gep1/arw-fer-gep1_2012040912_F024.grib \ + /d1/projects/MET/MET_test_data/unit_test/model_data/grib1/arw-fer-gep5/arw-fer-gep5_2012040912_F024.grib \ + /d1/projects/MET/MET_test_data/unit_test/model_data/grib1/arw-sch-gep2/arw-sch-gep2_2012040912_F024.grib \ + /d1/projects/MET/MET_test_data/unit_test/model_data/grib1/arw-sch-gep6/arw-sch-gep6_2012040912_F024.grib \ + /d1/projects/MET/MET_test_data/unit_test/model_data/grib1/arw-tom-gep3/arw-tom-gep3_2012040912_F024.grib \ + /d1/projects/MET/MET_test_data/unit_test/model_data/grib1/arw-tom-gep7/arw-tom-gep7_2012040912_F024.grib \ + /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/config/EnsembleStatConfig_grid_weight \ + -grid_obs /d1/projects/MET/MET_test_data/unit_test/obs_data/laps/laps_2012041012_F000.grib \ + -outdir /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/../../test_output/grid_weight -v 1 +unset CLIMO_MEAN_FILE +unset CLIMO_STDEV_FILE +unset DESC +unset GRID_WEIGHT +unset OUTPUT_PREFIX + + +export 'CLIMO_MEAN_FILE=${MET_TEST_INPUT}/climatology_data/NCEP_1.0deg/cmean_1d.19790410' +export 'CLIMO_STDEV_FILE=${MET_TEST_INPUT}/climatology_data/NCEP_1.0deg/cstdv_1d.19790410' +export 'DESC=AREA_WEIGHT' +export 'GRID_WEIGHT=AREA' +export 'OUTPUT_PREFIX=AREA_WEIGHT' +/d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/../../share/met/../../bin/ensemble_stat \ + 6 \ + /d1/projects/MET/MET_test_data/unit_test/model_data/grib1/arw-fer-gep1/arw-fer-gep1_2012040912_F024.grib \ + /d1/projects/MET/MET_test_data/unit_test/model_data/grib1/arw-fer-gep5/arw-fer-gep5_2012040912_F024.grib \ + /d1/projects/MET/MET_test_data/unit_test/model_data/grib1/arw-sch-gep2/arw-sch-gep2_2012040912_F024.grib \ + /d1/projects/MET/MET_test_data/unit_test/model_data/grib1/arw-sch-gep6/arw-sch-gep6_2012040912_F024.grib \ + /d1/projects/MET/MET_test_data/unit_test/model_data/grib1/arw-tom-gep3/arw-tom-gep3_2012040912_F024.grib \ + /d1/projects/MET/MET_test_data/unit_test/model_data/grib1/arw-tom-gep7/arw-tom-gep7_2012040912_F024.grib \ + 
/d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/config/EnsembleStatConfig_grid_weight \ + -grid_obs /d1/projects/MET/MET_test_data/unit_test/obs_data/laps/laps_2012041012_F000.grib \ + -outdir /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/../../test_output/grid_weight -v 1 +unset CLIMO_MEAN_FILE +unset CLIMO_STDEV_FILE +unset DESC +unset GRID_WEIGHT +unset OUTPUT_PREFIX + + diff --git a/internal/test_unit/unit_test.log b/internal/test_unit/unit_test.log new file mode 100644 index 0000000000..ef1c7b19b5 --- /dev/null +++ b/internal/test_unit/unit_test.log @@ -0,0 +1,1225 @@ +export MET_BASE=/d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/../../share/met +export MET_BUILD_BASE=/d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/../.. +export MET_TEST_BASE=/d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit +export MET_TEST_INPUT=/d1/projects/MET/MET_test_data/unit_test +export MET_TEST_OUTPUT=/d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/../../test_output + +CALLING: /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/python/unit.py /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/xml/unit_ascii2nc.xml + +TEST: ascii2nc_TRMM_3hr + - pass - 17.767 sec +TEST: ascii2nc_GAGE_24hr + - pass - 1.165 sec +TEST: ascii2nc_GAGE_24hr_badfile + - pass - 0.53 sec +TEST: ascii2nc_duplicates + - pass - 0.532 sec +TEST: ascii2nc_SURFRAD1 + - pass - 0.975 sec +TEST: ascii2nc_insitu_turb + - pass - 3.119 sec +TEST: ascii2nc_by_var_name_PB + - pass - 146.208 sec +TEST: ascii2nc_rain_01H_sum + - pass - 0.582 sec +TEST: ascii2nc_airnow_daily_v2 + - pass - 0.799 sec +TEST: ascii2nc_airnow_hourly_aqobs + - pass - 0.886 sec +TEST: ascii2nc_airnow_hourly + - pass - 3.847 sec +TEST: ascii2nc_ndbc + - pass - 8.243 sec +TEST: ascii2nc_ismn_SNOTEL + - pass - 14.834 sec +TEST: ascii2nc_iabp + - pass - 0.542 sec + +CALLING: /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/python/unit.py /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/xml/unit_ascii2nc_indy.xml + +TEST: ascii2nc_TRMM_12hr + - pass - 20.969 sec +TEST: ascii2nc_LITTLE_R + - pass - 0.605 sec +TEST: ascii2nc_LITTLE_R_BAD_RECORD + - pass - 0.551 sec +TEST: ascii2nc_SURFRAD + - pass - 0.976 sec +TEST: ascii2nc_SURFRAD_summary1 + - pass - 2.007 sec +TEST: ascii2nc_SURFRAD_summary2 + - pass - 1.483 sec +TEST: ascii2nc_SURFRAD_summary3 + - pass - 1.224 sec +TEST: ascii2nc_SURFRAD_summary4 + - pass - 1.205 sec +TEST: ascii2nc_insitu_turb_mask_sid + - pass - 1.241 sec +TEST: ascii2nc_insitu_turb_mask_grid_data + - pass - 2.994 sec +TEST: ascii2nc_insitu_turb_mask_named_grid + - pass - 2.963 sec +TEST: ascii2nc_MASK_GRID + - pass - 3.539 sec +TEST: ascii2nc_MASK_POLY + - pass - 1.298 sec +TEST: ascii2nc_WWSIS_clear_pvwatts_one_min + - pass - 18.383 sec +TEST: ascii2nc_WWSIS_clear_pvwatts_five_min + - pass - 2.641 sec +TEST: ascii2nc_WWSIS_clear_pvwatts_ten_min + - pass - 1.509 sec +TEST: ascii2nc_WWSIS_clear_pvwatts_sixty_min + - pass - 0.694 sec +TEST: ascii2nc_WWSIS_HA_pvwatts_sixty_min + - pass - 0.735 sec +TEST: ascii2nc_WWSIS_pvwatts_one_min + - pass - 18.646 sec +TEST: ascii2nc_WWSIS_pvwatts_sixty_min 
+ - pass - 0.7 sec +TEST: ascii2nc_by_var_name + - pass - 0.534 sec + +CALLING: /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/python/unit.py /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/xml/unit_madis2nc.xml + +TEST: madis2nc_METAR + - pass - 12.385 sec +TEST: madis2nc_METAR_time_summary + - pass - 16.391 sec +TEST: madis2nc_METAR_mask_sid + - pass - 0.608 sec +TEST: madis2nc_METAR_mask_grid + - pass - 1.052 sec +TEST: madis2nc_RAOB + - pass - 3.295 sec +TEST: madis2nc_PROFILER_MASK_POLY + - pass - 0.57 sec +TEST: madis2nc_MARITIME + - pass - 0.811 sec +TEST: madis2nc_MESONET_MASK_GRID + - pass - 6.653 sec +TEST: madis2nc_MESONET_optional_vars + - pass - 4.856 sec +TEST: madis2nc_ACARS_PROFILES + - pass - 2.095 sec +TEST: madis2nc_buf_handle + - pass - 2.626 sec +TEST: madis2nc_multiple_inputs + - pass - 2.167 sec + +CALLING: /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/python/unit.py /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/xml/unit_trmm2nc.xml + +TEST: trmm2nc_3hr + - pass - 0.334 sec +TEST: trmm2nc_12hr + - pass - 0.331 sec + +CALLING: /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/python/unit.py /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/xml/unit_pb2nc.xml + +TEST: pb2nc_GDAS_mask_grid_G212 + - pass - 8.247 sec +TEST: pb2nc_NDAS_no_mask + - pass - 9.265 sec +TEST: pb2nc_NDAS_mask_poly_conus + - pass - 3.676 sec + +CALLING: /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/python/unit.py /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/xml/unit_pb2nc_indy.xml + +TEST: pb2nc_NDAS_mask_sid_list + - pass - 1.602 sec +TEST: pb2nc_NDAS_mask_sid_file + - pass - 1.765 sec +TEST: pb2nc_NDAS_mask_grid_data_cfg + - pass - 4.469 sec +TEST: pb2nc_compute_pbl_cape + - pass - 13.715 sec +TEST: pb2nc_NDAS_var_all + - pass - 19.439 sec +TEST: pb2nc_vertical_level_500 + - pass - 1.392 sec +TEST: pb2nc_NDAS_summary + - pass - 6.866 sec + +CALLING: /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/python/unit.py /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/xml/unit_gen_vx_mask.xml + +TEST: gen_vx_mask_POLY_GFS_LATLON + - pass - 11.973 sec +TEST: gen_vx_mask_POLY_GFS_MERCATOR + - pass - 0.607 sec +TEST: gen_vx_mask_POLY_NAM_LAMBERT + - pass - 4.092 sec +TEST: gen_vx_mask_POLY_HMT_STEREO + - pass - 1.026 sec +TEST: gen_vx_mask_POLY_GFS_LATLON_NAK + - pass - 1.476 sec +TEST: gen_vx_mask_POLY_LATLON_RECTANGLE + - pass - 0.552 sec +TEST: gen_vx_mask_POLY_XY_RECTANGLE + - pass - 0.544 sec +TEST: gen_vx_mask_GRID_NAM_HMT_STEREO + - pass - 0.732 sec +TEST: gen_vx_mask_GRID_NAMED_GRIDS + - pass - 0.618 sec +TEST: gen_vx_mask_GRID_SPEC_STRINGS + - pass - 0.598 sec +TEST: gen_vx_mask_CIRCLE + - pass - 0.685 sec +TEST: gen_vx_mask_CIRCLE_MASK + - pass - 0.642 sec +TEST: gen_vx_mask_CIRCLE_COMPLEMENT + - pass - 0.649 sec +TEST: gen_vx_mask_TRACK + - pass - 1.096 sec +TEST: gen_vx_mask_TRACK_MASK + - pass - 1.098 sec +TEST: gen_vx_mask_DATA_APCP_24 + - pass - 1.067 sec +TEST: gen_vx_mask_POLY_PASS_THRU + - pass - 0.633 sec +TEST: gen_vx_mask_POLY_INTERSECTION + - pass - 0.633 sec +TEST: gen_vx_mask_POLY_UNION + - pass - 0.634 sec 
+TEST: gen_vx_mask_POLY_SYMDIFF + - pass - 0.627 sec +TEST: gen_vx_mask_DATA_INPUT_FIELD + - pass - 1.211 sec +TEST: gen_vx_mask_BOX + - pass - 0.536 sec +TEST: gen_vx_mask_SOLAR_ALT + - pass - 0.557 sec +TEST: gen_vx_mask_SOLAR_AZI + - pass - 0.673 sec +TEST: gen_vx_mask_LAT + - pass - 0.543 sec +TEST: gen_vx_mask_LON + - pass - 0.548 sec +TEST: gen_vx_mask_SHAPE + - pass - 0.549 sec +TEST: gen_vx_mask_SHAPE_STR + - pass - 0.695 sec +TEST: gen_vx_mask_SHAPE_STR_MULTI + - pass - 0.605 sec +TEST: gen_vx_mask_PYTHON + - pass - 1.605 sec +TEST: gen_vx_mask_DATA_TWO_FILE_TYPES + - pass - 1.22 sec + +CALLING: /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/python/unit.py /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/xml/unit_gen_ens_prod.xml + +TEST: gen_ens_prod_NO_CTRL + - pass - 9.428 sec +TEST: gen_ens_prod_WITH_CTRL + - pass - 8.858 sec +TEST: gen_ens_prod_SINGLE_FILE_NC_NO_CTRL + - pass - 1.23 sec +TEST: gen_ens_prod_SINGLE_FILE_NC_WITH_CTRL + - pass - 1.144 sec +TEST: gen_ens_prod_SINGLE_FILE_GRIB_NO_CTRL + - pass - 1.206 sec +TEST: gen_ens_prod_SINGLE_FILE_GRIB_WITH_CTRL + - pass - 1.213 sec +TEST: gen_ens_prod_NORMALIZE + - pass - 6.058 sec +TEST: gen_ens_prod_CLIMO_ANOM_ENS_MEMBER_ID + - pass - 0.88 sec + +CALLING: /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/python/unit.py /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/xml/unit_pcp_combine.xml + +TEST: pcp_combine_sum_GRIB1 + - pass - 28.664 sec +TEST: pcp_combine_sum_GRIB1_MISSING + - pass - 14.362 sec +TEST: pcp_combine_sum_GRIB1_MULTIPLE_FIELDS + - pass - 42.993 sec +TEST: pcp_combine_sum_GRIB2 + - pass - 1.445 sec +TEST: pcp_combine_add_GRIB1 + - pass - 1.686 sec +TEST: pcp_combine_add_GRIB2 + - pass - 0.601 sec +TEST: pcp_combine_add_STAGEIV + - pass - 1.207 sec +TEST: pcp_combine_add_ACCUMS + - pass - 1.414 sec +TEST: pcp_combine_sub_GRIB1 + - pass - 1.089 sec +TEST: pcp_combine_sub_GRIB1_run2 + - pass - 0.675 sec +TEST: pcp_combine_sub_GRIB2 + - pass - 0.553 sec +TEST: pcp_combine_sub_NC_MET_06 + - pass - 0.631 sec +TEST: pcp_combine_sub_P_INTERP + - pass - 0.811 sec +TEST: pcp_combine_add_VARNAME + - pass - 0.891 sec +TEST: pcp_combine_sub_DIFFERENT_INIT + - pass - 0.669 sec +TEST: pcp_combine_sub_NEGATIVE_ACCUM + - pass - 0.694 sec +TEST: pcp_combine_sub_SUBTRACT_MULTIPLE_FIELDS + - pass - 1.045 sec +TEST: pcp_combine_derive_LIST_OF_FILES + - pass - 1.104 sec +TEST: pcp_combine_derive_MULTIPLE_FIELDS + - pass - 3.137 sec +TEST: pcp_combine_derive_VLD_THRESH + - pass - 1.208 sec +TEST: pcp_combine_derive_CUSTOM_NAMES + - pass - 0.747 sec +TEST: pcp_combine_sub_ROT_LL + - pass - 1.096 sec +TEST: pcp_combine_LAEA_GRIB2 + - pass - 1.428 sec + +CALLING: /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/python/unit.py /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/xml/unit_wwmca_regrid.xml + +TEST: wwmca_regrid_G003_NO_AGE + - pass - 2.472 sec +TEST: wwmca_regrid_G003_AGE_60 + - pass - 1.532 sec +TEST: wwmca_regrid_G003_AGE_120 + - pass - 1.531 sec +TEST: wwmca_regrid_G003_AGE_240 + - pass - 1.575 sec +TEST: wwmca_regrid_G003_WRITE_PIXEL_AGE + - pass - 1.547 sec +TEST: wwmca_regrid_GFS_LATLON + - pass - 4.694 sec +TEST: wwmca_regrid_GFS_MERCATOR + - pass - 0.699 sec +TEST: wwmca_regrid_NAM_LAMBERT + - pass - 5.288 sec +TEST: 
wwmca_regrid_HMT_STEREO + - pass - 0.963 sec + +CALLING: /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/python/unit.py /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/xml/unit_point_stat.xml + +TEST: point_stat_GRIB1_NAM_GDAS + - pass - 49.029 sec +TEST: point_stat_GRIB1_NAM_GDAS_WINDS + - pass - 11.773 sec +TEST: point_stat_GRIB1_NAM_GDAS_MASK_SID + - pass - 43.827 sec +TEST: point_stat_GRIB2_NAM_NDAS + - pass - 36.403 sec +TEST: point_stat_GRIB2_SREF_GDAS + - pass - 28.574 sec +TEST: point_stat_GRIB1_NAM_TRMM + - pass - 15.594 sec +TEST: point_stat_GRIB2_SREF_TRMM + - pass - 15.265 sec +TEST: point_stat_NCMET_NAM_HMTGAGE + - pass - 1.756 sec +TEST: point_stat_NCMET_NAM_NDAS_SEEPS + - pass - 9.972 sec +TEST: point_stat_NCPINT_TRMM + - pass - 15.005 sec +TEST: point_stat_NCPINT_NDAS + - pass - 7.636 sec +TEST: point_stat_GRIB2_SREF_TRMM_prob + - pass - 2.426 sec +TEST: point_stat_GTG_lc + - pass - 60.144 sec +TEST: point_stat_GTG_latlon + - pass - 43.921 sec +TEST: point_stat_SID_INC_EXC + - pass - 6.573 sec +TEST: point_stat_SID_INC_EXC_CENSOR + - pass - 7.393 sec +TEST: point_stat_GRIB1_NAM_GDAS_INTERP_OPTS + - pass - 5.202 sec +TEST: point_stat_GRIB1_NAM_GDAS_INTERP_OPTS_name + - pass - 20.228 sec +TEST: point_stat_LAND_TOPO_MASK + - pass - 36.319 sec +TEST: point_stat_MPR_THRESH + - pass - 57.575 sec + +CALLING: /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/python/unit.py /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/xml/unit_stat_analysis_ps.xml + +TEST: stat_analysis_CONFIG_POINT_STAT + - pass - 44.426 sec +TEST: stat_analysis_POINT_STAT_SUMMARY + - pass - 34.47 sec +TEST: stat_analysis_POINT_STAT_SUMMARY_UNION + - pass - 19.293 sec +TEST: stat_analysis_POINT_STAT_FILTER_OBS_SID + - pass - 1.695 sec +TEST: stat_analysis_POINT_STAT_FILTER_TIMES + - pass - 8.536 sec +TEST: stat_analysis_POINT_STAT_SEEPS + - pass - 3.168 sec +TEST: stat_analysis_RAMPS + - pass - 3.089 sec + +CALLING: /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/python/unit.py /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/xml/unit_duplicate_flag.xml + +TEST: point_stat_DUP_NONE + - pass - 0.694 sec +TEST: point_stat_DUP_UNIQUE + - pass - 0.679 sec + +CALLING: /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/python/unit.py /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/xml/unit_obs_summary.xml + +TEST: ascii2nc_obs_summary + - pass - 0.571 sec +TEST: point_stat_OS_NONE + - pass - 0.684 sec +TEST: point_stat_OS_NEAREST + - pass - 0.728 sec +TEST: point_stat_OS_MIN + - pass - 0.676 sec +TEST: point_stat_OS_MAX + - pass - 0.686 sec +TEST: point_stat_OS_UW_MEAN + - pass - 0.681 sec +TEST: point_stat_OS_DW_MEAN + - pass - 0.676 sec +TEST: point_stat_OS_MEDIAN + - pass - 0.677 sec +TEST: point_stat_OS_PERC + - pass - 0.682 sec +TEST: point_stat_OS_UNIQUE_ALL + - pass - 1.406 sec + +CALLING: /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/python/unit.py /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/xml/unit_grid_stat.xml + +TEST: grid_stat_GRIB_lvl_typ_val + - pass - 102.432 sec +TEST: grid_stat_GRIB_set_attr + - pass - 26.404 sec +TEST: 
grid_stat_GRIB2_NAM_RTMA + - pass - 21.167 sec +TEST: grid_stat_GRIB2_NAM_RTMA_NP2 + - pass - 19.316 sec +TEST: grid_stat_GRIB1_NAM_STAGE4 + - pass - 35.211 sec +TEST: grid_stat_GRIB1_NAM_STAGE4_CENSOR + - pass - 3.779 sec +TEST: grid_stat_GTG_lc + - pass - 2.261 sec +TEST: grid_stat_GTG_latlon + - pass - 3.405 sec +TEST: grid_stat_GRIB2_SREF_STAGE4_prob_as_scalar + - pass - 2.357 sec +TEST: grid_stat_APPLY_MASK_TRUE + - pass - 5.515 sec +TEST: grid_stat_APPLY_MASK_FALSE + - pass - 5.392 sec +TEST: grid_stat_GFS_FOURIER + - pass - 8.642 sec +TEST: grid_stat_MPR_THRESH + - pass - 59.558 sec +TEST: grid_stat_UK_SEEPS + - pass - 4.344 sec +TEST: grid_stat_WRF_pres + - pass - 1.113 sec +TEST: grid_stat_GEN_ENS_PROD + - pass - 3.09 sec + +CALLING: /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/python/unit.py /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/xml/unit_stat_analysis_gs.xml + +TEST: stat_analysis_CONFIG_GRID_STAT + - pass - 0.878 sec + +CALLING: /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/python/unit.py /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/xml/unit_wavelet_stat.xml + +TEST: wavelet_stat_GRIB1_NAM_STAGE4 + - pass - 27.062 sec +TEST: wavelet_stat_GRIB1_NAM_STAGE4_NO_THRESH + - pass - 15.383 sec + +CALLING: /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/python/unit.py /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/xml/unit_stat_analysis_ws.xml + +TEST: stat_analysis_AGG_ISC + - pass - 0.64 sec + +CALLING: /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/python/unit.py /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/xml/unit_ensemble_stat.xml + +TEST: ensemble_stat_CMD_LINE + - pass - 5.766 sec +TEST: ensemble_stat_FILE_LIST + - pass - 5.425 sec +TEST: ensemble_stat_MASK_SID + - pass - 1.392 sec +TEST: ensemble_stat_MASK_SID_CTRL + - pass - 1.345 sec +TEST: ensemble_stat_MASK_SID_CENSOR + - pass - 1.732 sec +TEST: ensemble_stat_SKIP_CONST + - pass - 5.18 sec +TEST: ensemble_stat_OBSERR + - pass - 13.099 sec +TEST: ensemble_stat_SINGLE_FILE_NC_NO_CTRL + - pass - 3.044 sec +TEST: ensemble_stat_SINGLE_FILE_NC_WITH_CTRL + - pass - 3.158 sec +TEST: ensemble_stat_SINGLE_FILE_GRIB_NO_CTRL + - pass - 2.072 sec +TEST: ensemble_stat_SINGLE_FILE_GRIB_WITH_CTRL + - pass - 2.109 sec +TEST: ensemble_stat_RPS_CLIMO_BIN_PROB + - pass - 0.581 sec + +CALLING: /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/python/unit.py /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/xml/unit_stat_analysis_es.xml + +TEST: stat_analysis_AGG_RHIST + - pass - 0.843 sec +TEST: stat_analysis_AGG_PHIST + - pass - 0.843 sec +TEST: stat_analysis_AGG_RELP + - pass - 1.007 sec +TEST: stat_analysis_AGG_ECNT + - pass - 1.031 sec +TEST: stat_analysis_AGG_STAT_ORANK_RHIST_PHIST + - pass - 5.348 sec +TEST: stat_analysis_AGG_STAT_ORANK_RELP + - pass - 4.691 sec +TEST: stat_analysis_AGG_STAT_ORANK_SSVAR + - pass - 5.3 sec +TEST: stat_analysis_AGG_STAT_ORANK_ECNT + - pass - 11.662 sec +TEST: stat_analysis_AGG_SSVAR + - pass - 1.132 sec + +CALLING: 
/d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/python/unit.py /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/xml/unit_mode.xml + +TEST: mode_NO_MATCH_MERGE + - pass - 2.319 sec +TEST: mode_NO_MERGE + - pass - 1.94 sec +TEST: mode_MERGE_BOTH + - pass - 3.724 sec +TEST: mode_MASK_POLY + - pass - 1.96 sec +TEST: mode_QUILT + - pass - 5.595 sec +TEST: mode_CONFIG_MERGE + - pass - 3.549 sec + +CALLING: /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/python/unit.py /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/xml/unit_mode_multivar.xml + +TEST: mode_multivar_snow + - pass - 36.544 sec +TEST: mode_multivar_snow_3_2 + - pass - 19.229 sec +TEST: mode_multivar_snow_super + - pass - 31.774 sec +TEST: mode_multivar_FAKE_DATA + - pass - 4.152 sec +TEST: mode_multivar_FAKE_DATA_with_intensities + - pass - 6.589 sec + +CALLING: /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/python/unit.py /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/xml/unit_mode_analysis.xml + +TEST: mode_analysis_BYCASE_SIMPLE + - pass - 0.758 sec +TEST: mode_analysis_BYCASE_CLUSTER + - pass - 0.55 sec +TEST: mode_analysis_MET-644_LOOKIN_BY_DIR + - pass - 0.593 sec +TEST: mode_analysis_MET-644_LOOKIN_BY_FILE + - pass - 0.553 sec + +CALLING: /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/python/unit.py /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/xml/unit_plot_point_obs.xml + +TEST: plot_point_obs_G218 + - pass - 4.803 sec +TEST: plot_point_obs_TMP_ADPUPA + - pass - 4.345 sec +TEST: plot_point_obs_CONFIG + - pass - 4.309 sec +TEST: plot_point_obs_CONFIG_REGRID + - pass - 4.069 sec + +CALLING: /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/python/unit.py /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/xml/unit_plot_data_plane.xml + +TEST: plot_data_plane_GRIB1 + - pass - 1.385 sec +TEST: plot_data_plane_GRIB1_REC + - pass - 0.934 sec +TEST: plot_data_plane_GRIB1_CODE + - pass - 0.916 sec +TEST: plot_data_plane_GRIB1_ENS + - pass - 0.825 sec +TEST: plot_data_plane_GRIB1_ENS_HI + - pass - 0.793 sec +TEST: plot_data_plane_GRIB1_rotlatlon + - pass - 0.721 sec +TEST: plot_data_plane_GRIB2 + - pass - 0.858 sec +TEST: plot_data_plane_GRIB2_ENS + - pass - 0.715 sec +TEST: plot_data_plane_GRIB2_ENS_LOW + - pass - 0.725 sec +TEST: plot_data_plane_GRIB2_PROB + - pass - 0.645 sec +TEST: plot_data_plane_NC_PINTERP + - pass - 0.755 sec +TEST: plot_data_plane_NC_MET + - pass - 0.804 sec +TEST: plot_data_plane_NCCF_lc_0 + - pass - 0.746 sec +TEST: plot_data_plane_NCCF_lc_25 + - pass - 0.815 sec +TEST: plot_data_plane_NCCF_lc_50 + - pass - 0.802 sec +TEST: plot_data_plane_NCCF_latlon_0 + - pass - 1.003 sec +TEST: plot_data_plane_NCCF_latlon_12 + - pass - 0.976 sec +TEST: plot_data_plane_NCCF_latlon_25 + - pass - 0.976 sec +TEST: plot_data_plane_NCCF_latlon_by_value + - pass - 0.956 sec +TEST: plot_data_plane_NCCF_north_to_south + - pass - 4.35 sec +TEST: plot_data_plane_NCCF_time + - pass - 0.681 sec +TEST: plot_data_plane_NCCF_time_int64 + - pass - 0.73 sec +TEST: plot_data_plane_NCCF_rotlatlon + - pass - 0.934 sec +TEST: plot_data_plane_TRMM_3B42_3hourly_nc + - pass - 
1.041 sec +TEST: plot_data_plane_TRMM_3B42_daily_nc + - pass - 1.233 sec +TEST: plot_data_plane_TRMM_3B42_daily_packed + - pass - 1.209 sec +TEST: plot_data_plane_TRMM_3B42_daily_packed_CONVERT + - pass - 1.71 sec +TEST: plot_data_plane_EaSM_CMIP5_rcp85 + - pass - 0.752 sec +TEST: plot_data_plane_EaSM_CMIP5_rcp85_time_slice + - pass - 0.76 sec +TEST: plot_data_plane_EaSM_CESM + - pass - 0.771 sec +TEST: plot_data_plane_GRIB2_NBM_CWASP_L0 + - pass - 3.072 sec +TEST: plot_data_plane_GRIB2_NBM_CWASP_PERC_5 + - pass - 3.146 sec +TEST: plot_data_plane_GRIB2_NBM_CWASP_PROB_50 + - pass - 2.262 sec +TEST: plot_data_plane_GRIB2_NBM_WETBT_MIXED_LEVELS + - pass - 3.377 sec +TEST: plot_data_plane_GRIB2_NBM_FICEAC_A48_PERC_10 + - pass - 2.102 sec +TEST: plot_data_plane_LAEA_GRIB2 + - pass - 1.631 sec +TEST: plot_data_plane_LAEA_NCCF + - pass - 1.595 sec +TEST: plot_data_plane_LAEA_MET_NC + - pass - 1.634 sec +TEST: plot_data_plane_NCCF_POLAR_STEREO + - pass - 1.747 sec +TEST: plot_data_plane_NCCF_POLAR_ELLIPSOIDAL + - pass - 0.719 sec +TEST: plot_data_plane_GRIB2_TABLE_4.48 + - pass - 1.514 sec +TEST: plot_data_plane_WRF_west_east_stag + - pass - 0.864 sec +TEST: plot_data_plane_WRF_south_north_stag + - pass - 0.863 sec +TEST: plot_data_plane_WRF_num_press_levels_stag + - pass - 0.761 sec +TEST: plot_data_plane_WRF_num_z_levels_stag + - pass - 0.741 sec +TEST: plot_data_plane_WRF_bottom_top + - pass - 0.866 sec +TEST: plot_data_plane_WRF_bottom_top_stag + - pass - 0.852 sec +TEST: plot_data_plane_set_attr_grid + - pass - 14.222 sec + +CALLING: /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/python/unit.py /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/xml/unit_wwmca_plot.xml + +TEST: wwmca_plot_NH_SH_AGE_240 + - pass - 2.607 sec + +CALLING: /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/python/unit.py /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/xml/unit_series_analysis.xml + +TEST: series_analysis_CMD_LINE + - pass - 8.969 sec +TEST: series_analysis_AGGR_CMD_LINE + - pass - 10.585 sec +TEST: series_analysis_FILE_LIST + - pass - 6.317 sec +TEST: series_analysis_AGGR_FILE_LIST + - pass - 7.913 sec +TEST: series_analysis_UPPER_AIR + - pass - 3.876 sec +TEST: series_analysis_CONDITIONAL + - pass - 3.963 sec + +CALLING: /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/python/unit.py /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/xml/unit_tc_dland.xml + +TEST: tc_dland_ONE_DEG + - pass - 17.215 sec +TEST: tc_dland_HALF_DEG + - pass - 65.738 sec + +CALLING: /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/python/unit.py /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/xml/unit_tc_pairs.xml + +TEST: tc_pairs_ALAL2010 + - pass - 2.412 sec +TEST: tc_pairs_CONSENSUS + - pass - 5.688 sec +TEST: tc_pairs_INTERP12_FILL + - pass - 0.923 sec +TEST: tc_pairs_INTERP12_REPLACE + - pass - 0.938 sec +TEST: tc_pairs_PROBRIRW + - pass - 2.656 sec +TEST: tc_pairs_BASIN_MAP + - pass - 2.661 sec +TEST: tc_pairs_LEAD_REQ + - pass - 1.023 sec +TEST: tc_pairs_WRITE_VALID + - pass - 0.741 sec +TEST: tc_pairs_WRITE_VALID_PROBRIRW + - pass - 2.105 sec +TEST: tc_pairs_DIAGNOSTICS + - pass - 0.932 sec +TEST: tc_pairs_DIAGNOSTICS_CONSENSUS + - pass - 5.124 
sec + +CALLING: /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/python/unit.py /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/xml/unit_tc_stat.xml + +TEST: tc_stat_ALAL2010 + - pass - 24.263 sec +TEST: tc_stat_FILTER_STRINGS + - pass - 2.475 sec +TEST: tc_stat_PROBRIRW + - pass - 40.831 sec +TEST: tc_stat_LEAD_REQ + - pass - 1.483 sec +TEST: tc_stat_FALSE_ALARMS + - pass - 2.438 sec +TEST: tc_stat_DIAGNOSTICS + - pass - 7.112 sec + +CALLING: /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/python/unit.py /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/xml/unit_plot_tc.xml + +TEST: plot_tc_TCMPR + - pass - 9.155 sec + +CALLING: /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/python/unit.py /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/xml/unit_tc_rmw.xml + +TEST: tc_rmw_PRESSURE_LEV_OUT + - pass - 39.751 sec +TEST: tc_rmw_GONZALO + - pass - 9.403 sec + +CALLING: /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/python/unit.py /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/xml/unit_rmw_analysis.xml + +TEST: rmw_analysis + - pass - 1.61 sec + +CALLING: /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/python/unit.py /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/xml/unit_tc_diag.xml + +TEST: tc_diag_IAN + - pass - 117.499 sec + +CALLING: /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/python/unit.py /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/xml/unit_tc_gen.xml + +TEST: tc_gen_2016 + - pass - 94.491 sec +TEST: tc_gen_prob + - pass - 1.05 sec +TEST: tc_gen_2021_shape + - pass - 9.322 sec + +CALLING: /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/python/unit.py /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/xml/unit_met_test_scripts.xml + +TEST: test_all_gen_vx_mask_1 + - pass - 1.263 sec +TEST: test_all_gen_vx_mask_2 + - pass - 1.054 sec +TEST: test_all_gen_ens_prod + - pass - 1.72 sec +TEST: test_all_pcp_combine_1 + - pass - 1.855 sec +TEST: test_all_pcp_combine_2 + - pass - 3.97 sec +TEST: test_all_pcp_combine_3 + - pass - 5.307 sec +TEST: test_all_pcp_combine_4 + - pass - 1.066 sec +TEST: test_all_pcp_combine_5 + - pass - 1.075 sec +TEST: test_all_pcp_combine_6 + - pass - 1.563 sec +TEST: test_all_mode_1 + - pass - 2.505 sec +TEST: test_all_mode_2 + - pass - 2.463 sec +TEST: test_all_mode_3 + - pass - 3.169 sec +TEST: test_all_grid_stat_1 + - pass - 2.189 sec +TEST: test_all_grid_stat_2 + - pass - 0.646 sec +TEST: test_all_grid_stat_3 + - pass - 1.26 sec +TEST: test_all_grid_stat_4 + - pass - 8.559 sec +TEST: test_all_pb2nc + - pass - 3.645 sec +TEST: test_all_plot_point_obs + - pass - 5.294 sec +TEST: test_all_ascii2nc_1 + - pass - 0.599 sec +TEST: test_all_ascii2nc_2 + - pass - 0.715 sec +TEST: test_all_madis2nc + - pass - 1.521 sec +TEST: test_all_point_stat + - pass - 72.135 sec +TEST: test_all_wavelet_stat_1 + - pass - 4.892 sec +TEST: test_all_wavelet_stat_2 + - pass - 2.751 sec +TEST: test_all_ensemble_stat + - pass - 
9.129 sec +TEST: test_all_stat_analysis + - pass - 15.05 sec +TEST: test_all_mode_analysis_1 + - pass - 0.713 sec +TEST: test_all_mode_analysis_2 + - pass - 0.561 sec +TEST: test_all_mode_analysis_3 + - pass - 0.574 sec +TEST: test_all_plot_data_plane_1 + - pass - 1.109 sec +TEST: test_all_plot_data_plane_2 + - pass - 0.662 sec + +CALLING: /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/python/unit.py /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/xml/unit_modis.xml + +TEST: modis_regrid_SURFACE_TEMPERATURE + - pass - 3.171 sec +TEST: modis_regrid_CLOUD_FRACTION + - pass - 2.371 sec + +CALLING: /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/python/unit.py /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/xml/unit_ref_config_lead_00.xml + +TEST: gen_vx_mask + - pass - 2.464 sec +TEST: pb2nc_ndas_lead_00 + - pass - 8.045 sec +TEST: point_stat_lead_00_upper_air_AFWAv3.4_Noahv2.7.1 + - pass - 48.708 sec +TEST: point_stat_lead_00_surface_AFWAv3.4_Noahv2.7.1 + - pass - 24.294 sec +TEST: point_stat_lead_00_winds_AFWAv3.4_Noahv2.7.1 + - pass - 56.158 sec +TEST: point_stat_lead_00_upper_air_AFWAv3.4_Noahv3.3 + - pass - 48.776 sec +TEST: point_stat_lead_00_surface_AFWAv3.4_Noahv3.3 + - pass - 24.526 sec +TEST: point_stat_lead_00_winds_AFWAv3.4_Noahv3.3 + - pass - 56.448 sec + +CALLING: /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/python/unit.py /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/xml/unit_ref_config_lead_12.xml + +TEST: gen_vx_mask + - pass - 2.225 sec +TEST: pb2nc_ndas_lead_12 + - pass - 8.146 sec +TEST: pcp_combine_ST2ml_3hr_09_12 + - pass - 0.909 sec +TEST: pcp_combine_wrf_3hr_09_12 + - pass - 1.143 sec +TEST: point_stat_lead_12_upper_air_AFWAv3.4_Noahv2.7.1 + - pass - 48.863 sec +TEST: point_stat_lead_12_surface_AFWAv3.4_Noahv2.7.1 + - pass - 25.401 sec +TEST: point_stat_lead_12_winds_AFWAv3.4_Noahv2.7.1 + - pass - 55.69 sec +TEST: point_stat_lead_12_upper_air_AFWAv3.4_Noahv3.3 + - pass - 48.919 sec +TEST: point_stat_lead_12_surface_AFWAv3.4_Noahv3.3 + - pass - 25.006 sec +TEST: point_stat_lead_12_winds_AFWAv3.4_Noahv3.3 + - pass - 55.808 sec +TEST: grid_stat_3hr_accum_time_12 + - pass - 0.883 sec + +CALLING: /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/python/unit.py /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/xml/unit_ref_config_lead_24.xml + +TEST: gen_vx_mask + - pass - 2.181 sec +TEST: pb2nc_ndas_lead_24 + - pass - 7.926 sec +TEST: pcp_combine_ST2ml_3hr_21_24 + - pass - 0.76 sec +TEST: pcp_combine_wrf_3hr_21_24 + - pass - 1.115 sec +TEST: point_stat_lead_24_upper_air_AFWAv3.4_Noahv2.7.1 + - pass - 47.982 sec +TEST: point_stat_lead_24_surface_AFWAv3.4_Noahv2.7.1 + - pass - 23.972 sec +TEST: point_stat_lead_24_winds_AFWAv3.4_Noahv2.7.1 + - pass - 54.219 sec +TEST: point_stat_lead_24_upper_air_AFWAv3.4_Noahv3.3 + - pass - 48.24 sec +TEST: point_stat_lead_24_surface_AFWAv3.4_Noahv3.3 + - pass - 24.114 sec +TEST: point_stat_lead_24_winds_AFWAv3.4_Noahv3.3 + - pass - 54.602 sec +TEST: grid_stat_3hr_accum_time_24 + - pass - 0.892 sec + +CALLING: /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/python/unit.py 
/d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/xml/unit_ref_config_lead_36.xml + +TEST: gen_vx_mask + - pass - 2.186 sec +TEST: pb2nc_ndas_lead_36 + - pass - 8.025 sec +TEST: pcp_combine_ST2ml_3hr_33_36 + - pass - 0.697 sec +TEST: pcp_combine_ST2ml_24hr_12_36 + - pass - 0.596 sec +TEST: pcp_combine_wrf_3hr_33_36 + - pass - 1.092 sec +TEST: pcp_combine_wrf_24hr_12_36 + - pass - 1.108 sec +TEST: point_stat_lead_36_upper_air_AFWAv3.4_Noahv2.7.1 + - pass - 48.391 sec +TEST: point_stat_lead_36_surface_AFWAv3.4_Noahv2.7.1 + - pass - 24.7 sec +TEST: point_stat_lead_36_winds_AFWAv3.4_Noahv2.7.1 + - pass - 54.79 sec +TEST: point_stat_lead_36_upper_air_AFWAv3.4_Noahv3.3 + - pass - 48.341 sec +TEST: point_stat_lead_36_surface_AFWAv3.4_Noahv3.3 + - pass - 24.723 sec +TEST: point_stat_lead_36_winds_AFWAv3.4_Noahv3.3 + - pass - 54.725 sec +TEST: grid_stat_3hr_accum_time_36 + - pass - 0.917 sec +TEST: grid_stat_24hr_accum_time_36 + - pass - 0.896 sec + +CALLING: /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/python/unit.py /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/xml/unit_ref_config_lead_48.xml + +TEST: gen_vx_mask + - pass - 2.193 sec +TEST: pb2nc_ndas_lead_48 + - pass - 8.014 sec +TEST: pcp_combine_ST2ml_3hr_45_48 + - pass - 0.72 sec +TEST: pcp_combine_wrf_3hr_45_48 + - pass - 1.101 sec +TEST: point_stat_lead_48_upper_air_AFWAv3.4_Noahv2.7.1 + - pass - 48.027 sec +TEST: point_stat_lead_48_surface_AFWAv3.4_Noahv2.7.1 + - pass - 24.714 sec +TEST: point_stat_lead_48_winds_AFWAv3.4_Noahv2.7.1 + - pass - 54.824 sec +TEST: point_stat_lead_48_upper_air_AFWAv3.4_Noahv3.3 + - pass - 48.048 sec +TEST: point_stat_lead_48_surface_AFWAv3.4_Noahv3.3 + - pass - 24.41 sec +TEST: point_stat_lead_48_winds_AFWAv3.4_Noahv3.3 + - pass - 55.39 sec +TEST: grid_stat_3hr_accum_time_48 + - pass - 0.893 sec + +CALLING: /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/python/unit.py /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/xml/unit_ref_config.xml + +TEST: stat_analysis_GO_Index + - pass - 1.468 sec +TEST: stat_analysis_GO_Index_out_stat + - pass - 1.179 sec +TEST: stat_analysis_SFC_SS_Index_out + - pass - 0.874 sec + +CALLING: /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/python/unit.py /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/xml/unit_mode_graphics.xml + +TEST: mode_graphics_PLOT_MULTIPLE + - pass - 63.828 sec + +CALLING: /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/python/unit.py /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/xml/unit_regrid.xml + +TEST: regrid_grid_stat_ST4_TO_HMT + - pass - 1.052 sec +TEST: regrid_grid_stat_HMT_TO_ST4 + - pass - 8.112 sec +TEST: regrid_grid_stat_BOTH_TO_DTC165 + - pass - 1.594 sec +TEST: regrid_grid_stat_BOTH_TO_NAM + - pass - 5.106 sec +TEST: regrid_grid_stat_BOTH_TO_HMT_D02 + - pass - 1.577 sec +TEST: regrid_data_plane_GFS_TO_HMT_NEAREST + - pass - 2.858 sec +TEST: regrid_data_plane_GFS_ROTLATLON_GRID_SPEC + - pass - 2.465 sec +TEST: regrid_data_plane_GFS_TO_HMT_BILIN + - pass - 2.782 sec +TEST: regrid_data_plane_GFS_TO_HMT_BUDGET + - pass - 2.539 sec +TEST: regrid_data_plane_GFS_TO_HMT_MIN_3 + - pass - 3.824 sec +TEST: 
regrid_data_plane_GFS_TO_HMT_MAX_3 + - pass - 3.887 sec +TEST: regrid_data_plane_GFS_TO_HMT_UW_MEAN_3 + - pass - 3.837 sec +TEST: regrid_data_plane_GFS_TO_HMT_UW_MEAN_9 + - pass - 19.083 sec +TEST: regrid_data_plane_GFS_TO_HMT_DW_MEAN_3 + - pass - 3.94 sec +TEST: regrid_data_plane_HRRR_MAXGAUSS + - pass - 4.686 sec +TEST: regrid_data_plane_GFS_TO_HMT_MEDIAN_3 + - pass - 3.921 sec +TEST: regrid_data_plane_GFS_TO_HMT_LS_FIT_3 + - pass - 3.982 sec +TEST: regrid_data_plane_GFS_TO_HMT_MAX_5_SQUARE + - pass - 1.92 sec +TEST: regrid_data_plane_GFS_TO_G212_CONVERT_CENSOR + - pass - 0.893 sec +TEST: regrid_data_plane_WRAP_LON + - pass - 1.438 sec +TEST: regrid_data_plane_NC_ROT_LAT_LON + - pass - 2.385 sec + +CALLING: /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/python/unit.py /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/xml/unit_gsi_tools.xml + +TEST: gsid2mpr_CONV + - pass - 2.161 sec +TEST: gsid2mpr_DUP + - pass - 1.27 sec +TEST: gsid2mpr_RAD + - pass - 2.469 sec +TEST: gsidens2orank_CONV_NO_MEAN + - pass - 5.607 sec +TEST: gsidens2orank_CONV_ENS_MEAN + - pass - 4.809 sec +TEST: gsidens2orank_RAD + - pass - 4.108 sec +TEST: gsidens2orank_RAD_CHANNEL + - pass - 1.14 sec +TEST: stat_analysis_MPR_TO_CNT + - pass - 2.631 sec +TEST: stat_analysis_ORANK_TO_RHIST + - pass - 23.803 sec +TEST: stat_analysis_ORANK_TO_SSVAR + - pass - 23.779 sec + +CALLING: /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/python/unit.py /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/xml/unit_aeronet.xml + +TEST: ascii2nc_AERONET_daily + - pass - 0.814 sec +TEST: ascii2nc_AERONET_v3_daily + - pass - 0.537 sec +TEST: ascii2nc_AERONET_v3_concat + - pass - 0.574 sec +TEST: ascii2nc_AERONET_vld_thresh + - pass - 0.545 sec +TEST: ascii2nc_AERONET_monthly + - pass - 0.698 sec +TEST: point_stat_GRIB2_f18_NGAC_AERONET_daily + - pass - 0.59 sec +TEST: point_stat_GRIB2_f18_NGAC_AERONET_monthly + - pass - 0.639 sec +TEST: point_stat_GRIB2_f21_NGAC_AERONET_daily + - pass - 0.595 sec +TEST: point_stat_GRIB2_f21_NGAC_AERONET_monthly + - pass - 0.639 sec + +CALLING: /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/python/unit.py /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/xml/unit_shift_data_plane.xml + +TEST: shift_data_plane_GRIB1 + - pass - 4.387 sec + +CALLING: /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/python/unit.py /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/xml/unit_mtd.xml + +TEST: mtd_basic + - pass - 45.285 sec +TEST: mtd_conv_time + - pass - 47.593 sec +TEST: mtd_single + - pass - 11.058 sec + +CALLING: /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/python/unit.py /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/xml/unit_climatology_1.0deg.xml + +TEST: climatology_POINT_STAT_GFS_1.0DEG + - pass - 128.342 sec +TEST: climatology_POINT_STAT_GFS_1.0DEG_CLIMO_PREV_MONTH + - pass - 127.347 sec +TEST: climatology_POINT_STAT_PROB_GFS_1.0DEG + - pass - 11.152 sec +TEST: climatology_GRID_STAT_PROB_GFS_1.0DEG + - pass - 8.402 sec +TEST: climatology_STAT_ANALYSIS_1.0DEG + - pass - 3.124 sec +TEST: climatology_SERIES_ANALYSIS_1.0DEG + - pass - 168.005 
sec +TEST: climatology_SERIES_ANALYSIS_1.0DEG_CONST_CLIMO + - pass - 49.084 sec +TEST: climatology_SERIES_ANALYSIS_1.0DEG_AGGR + - pass - 215.56 sec +TEST: climatology_SERIES_ANALYSIS_PROB_1.0DEG + - pass - 20.95 sec +TEST: climatology_SERIES_ANALYSIS_PROB_1.0DEG_AGGR + - pass - 24.843 sec +TEST: climatology_ENSEMBLE_STAT_1.0DEG + - pass - 31.429 sec +TEST: climatology_ENSEMBLE_STAT_1.0DEG_ONE_CDF_BIN + - pass - 11.051 sec + +CALLING: /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/python/unit.py /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/xml/unit_climatology_1.5deg.xml + +TEST: climatology_POINT_STAT_WMO_1.5DEG + - pass - 245.695 sec +TEST: climatology_STAT_ANALYSIS_WMO_1.5DEG_MPR_AGG_STAT + - pass - 0.741 sec +TEST: climatology_STAT_ANALYSIS_WMO_1.5DEG_VAL1L2_AGG_STAT + - pass - 0.607 sec +TEST: climatology_STAT_ANALYSIS_WMO_1.5DEG_FILTER + - pass - 0.758 sec +TEST: climatology_GRID_STAT_WMO_1.5DEG + - pass - 253.773 sec + +CALLING: /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/python/unit.py /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/xml/unit_climatology_2.5deg.xml + +TEST: climatology_POINT_STAT_GFS_2.5DEG + - pass - 207.139 sec +TEST: climatology_GRID_STAT_WRAP_YEAR_2.5DEG + - pass - 126.605 sec +TEST: climatology_GRID_STAT_SINGLE_MONTH_2.5DEG + - pass - 64.927 sec + +CALLING: /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/python/unit.py /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/xml/unit_climatology_mixed.xml + +TEST: climatology_GRID_STAT_FCST_NCEP_1.0DEG_OBS_WMO_1.5DEG + - pass - 72.612 sec + +CALLING: /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/python/unit.py /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/xml/unit_grib_tables.xml + +TEST: GRIB1_um_dcf + - pass - 2.76 sec +TEST: GRIB2_um_raw + - pass - 2.366 sec + +CALLING: /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/python/unit.py /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/xml/unit_grid_weight.xml + +TEST: grid_weight_GRID_STAT_NONE + - pass - 6.082 sec +TEST: grid_weight_GRID_STAT_COS_LAT + - pass - 6.075 sec +TEST: grid_weight_GRID_STAT_AREA + - pass - 6.081 sec +TEST: grid_weight_ENSEMBLE_STAT_NONE + - pass - 2.327 sec +TEST: grid_weight_ENSEMBLE_STAT_COS_LAT + - pass - 2.287 sec +TEST: grid_weight_ENSEMBLE_STAT_AREA + - pass - 2.291 sec + +CALLING: /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/python/unit.py /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/xml/unit_netcdf.xml + +TEST: ascii2nc_no_compression + - pass - 16.393 sec +TEST: ascii2nc_compression2_by_config + - pass - 16.728 sec +TEST: ascii2nc_compression3_by_env + - pass - 16.411 sec +TEST: ascii2nc_compression4_by_argument + - pass - 16.483 sec +TEST: 365_days + - pass - 1.913 sec +TEST: netcdf_1byte_time + - pass - 0.934 sec +TEST: netcdf_months_units + - pass - 0.653 sec +TEST: netcdf_months_units_from_day2 + - pass - 0.535 sec +TEST: netcdf_months_units_to_next_month + - pass - 0.537 sec +TEST: netcdf_years_units + - pass - 0.535 sec + +CALLING: 
/d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/python/unit.py /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/xml/unit_hira.xml + +WARNING: unable to read test_dir from /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/xml/unit_hira.xml +TEST: point_stat_NCMET_NAM_HMTGAGE_HIRA + - pass - 10.392 sec +TEST: point_stat_HIRA_EMPTY_PROB_CAT_THRESH + - pass - 9.472 sec +TEST: stat_analysis_CONFIG_HIRA + - pass - 4.317 sec + +CALLING: /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/python/unit.py /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/xml/unit_interp_shape.xml + +TEST: grid_stat_INTERP_SQUARE + - pass - 6.366 sec +TEST: grid_stat_INTERP_CIRCLE + - pass - 5.351 sec + +CALLING: /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/python/unit.py /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/xml/unit_lidar2nc.xml + +TEST: lidar2nc_CALIPSO + - pass - 1.126 sec + +CALLING: /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/python/unit.py /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/xml/unit_ioda2nc.xml + +TEST: ioda2nc_mask_sid_list + - pass - 1.612 sec +TEST: ioda2nc_var_all + - pass - 0.549 sec +TEST: ioda2nc_summary + - pass - 0.566 sec +TEST: ioda2nc_same_input + - pass - 0.565 sec +TEST: ioda2nc_int_datetime + - pass - 0.575 sec +TEST: ioda2nc_v2_string_sid + - pass - 0.576 sec + +CALLING: /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/python/unit.py /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/xml/unit_airnow.xml + +TEST: pb2nc_AIRNOW + - pass - 16.993 sec +TEST: point_stat_GRIB2_AIRNOW + - pass - 2.994 sec + +CALLING: /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/python/unit.py /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/xml/unit_python.xml + +TEST: python_numpy_grid_name + - pass - 1.728 sec +TEST: python_numpy_grid_string + - pass - 1.147 sec +TEST: python_numpy_grid_data_file + - pass - 1.163 sec +TEST: python_numpy_plot_data_plane + - pass - 1.221 sec +TEST: python_xarray_plot_data_plane + - pass - 1.193 sec +TEST: python_numpy_plot_data_plane_missing + - FAIL - 0.533 sec +export MET_PYTHON_EXE=${MET_TEST_MET_PYTHON_EXE} +/d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/../../share/met/../../bin/plot_data_plane \ + PYTHON_NUMPY \ + /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/../../test_output/python/letter_numpy_0_to_missing.ps \ + 'name = "/d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/../../share/met/python/examples/read_ascii_numpy.py /d1/projects/MET/MET_test_data/unit_test/python/letter.txt LETTER 0.0";' \ + -plot_range 0.0 255.0 \ + -title "Python enabled numpy plot_data_plane" \ + -v 1 +DEBUG 1: Start plot_data_plane by johnhg(6088) at 2024-10-07 17:45:28Z cmd: /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/../../share/met/../../bin/plot_data_plane PYTHON_NUMPY 
/d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/../../test_output/python/letter_numpy_0_to_missing.ps name = "/d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/../../share/met/python/examples/read_ascii_numpy.py /d1/projects/MET/MET_test_data/unit_test/python/letter.txt LETTER 0.0"; -plot_range 0.0 255.0 -title Python enabled numpy plot_data_plane -v 1 +DEBUG 1: Opening data file: PYTHON_NUMPY +sh: 1: /usr/local/python3/bin/python3: not found +ERROR : +ERROR : tmp_nc_dataplane() -> command "${MET_TEST_MET_PYTHON_EXE} /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/../../share/met/python/pyembed/write_tmp_dataplane.py /tmp/tmp_met_data_386958_0 /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/../../share/met/python/examples/read_ascii_numpy /d1/projects/MET/MET_test_data/unit_test/python/letter.txt LETTER 0.0" failed ... status = 32512 +ERROR : +unset MET_PYTHON_EXE + + +ERROR: /d1/personal/johnhg/MET/MET_development/MET-feature_2887_categorical_weights/internal/test_unit/python/unit.py unit_python.xml failed. + +*** UNIT TESTS FAILED *** + diff --git a/internal/test_unit/xml/unit_grid_weight.xml b/internal/test_unit/xml/unit_grid_weight.xml index 85005feec1..979ebad495 100644 --- a/internal/test_unit/xml/unit_grid_weight.xml +++ b/internal/test_unit/xml/unit_grid_weight.xml @@ -22,6 +22,7 @@ &MET_BIN;/grid_stat OUTPUT_PREFIX NO_WEIGHT + DESC NO_WEIGHT CLIMO_MEAN_FILE &DATA_DIR_CLIMO;/NCEP_1.0deg/cmean_1d.19790410 GRID_WEIGHT NONE @@ -32,7 +33,7 @@ -outdir &OUTPUT_DIR;/grid_weight -v 1 - &OUTPUT_DIR;/grid_weight/grid_stat_NO_WEIGHT_240000L_20120410_000000V.stat + &OUTPUT_DIR;/grid_weight/grid_stat_NO_WEIGHT_240000L_20120410_000000V.stat &OUTPUT_DIR;/grid_weight/grid_stat_NO_WEIGHT_240000L_20120410_000000V_pairs.nc @@ -41,6 +42,7 @@ &MET_BIN;/grid_stat OUTPUT_PREFIX COS_LAT_WEIGHT + DESC COS_LAT_WEIGHT CLIMO_MEAN_FILE &DATA_DIR_CLIMO;/NCEP_1.0deg/cmean_1d.19790410 GRID_WEIGHT COS_LAT @@ -51,7 +53,7 @@ -outdir &OUTPUT_DIR;/grid_weight -v 1 - &OUTPUT_DIR;/grid_weight/grid_stat_COS_LAT_WEIGHT_240000L_20120410_000000V.stat + &OUTPUT_DIR;/grid_weight/grid_stat_COS_LAT_WEIGHT_240000L_20120410_000000V.stat &OUTPUT_DIR;/grid_weight/grid_stat_COS_LAT_WEIGHT_240000L_20120410_000000V_pairs.nc @@ -60,6 +62,7 @@ &MET_BIN;/grid_stat OUTPUT_PREFIX AREA_WEIGHT + DESC AREA_WEIGHT CLIMO_MEAN_FILE &DATA_DIR_CLIMO;/NCEP_1.0deg/cmean_1d.19790410 GRID_WEIGHT AREA @@ -70,7 +73,7 @@ -outdir &OUTPUT_DIR;/grid_weight -v 1 - &OUTPUT_DIR;/grid_weight/grid_stat_AREA_WEIGHT_240000L_20120410_000000V.stat + &OUTPUT_DIR;/grid_weight/grid_stat_AREA_WEIGHT_240000L_20120410_000000V.stat &OUTPUT_DIR;/grid_weight/grid_stat_AREA_WEIGHT_240000L_20120410_000000V_pairs.nc @@ -78,9 +81,11 @@ &MET_BIN;/ensemble_stat - OUTPUT_PREFIX NO_WEIGHT - CLIMO_MEAN_FILE &DATA_DIR_CLIMO;/NCEP_1.0deg/cmean_1d.19790410 - GRID_WEIGHT NONE + OUTPUT_PREFIX NO_WEIGHT + DESC NO_WEIGHT + CLIMO_MEAN_FILE &DATA_DIR_CLIMO;/NCEP_1.0deg/cmean_1d.19790410 + CLIMO_STDEV_FILE &DATA_DIR_CLIMO;/NCEP_1.0deg/cstdv_1d.19790410 + GRID_WEIGHT NONE \ 6 \ @@ -104,7 +109,9 @@ &MET_BIN;/ensemble_stat OUTPUT_PREFIX COS_LAT_WEIGHT + DESC COS_LAT_WEIGHT CLIMO_MEAN_FILE &DATA_DIR_CLIMO;/NCEP_1.0deg/cmean_1d.19790410 + CLIMO_STDEV_FILE &DATA_DIR_CLIMO;/NCEP_1.0deg/cstdv_1d.19790410 GRID_WEIGHT COS_LAT \ @@ -129,7 +136,9 @@ &MET_BIN;/ensemble_stat OUTPUT_PREFIX AREA_WEIGHT + DESC AREA_WEIGHT 
CLIMO_MEAN_FILE &DATA_DIR_CLIMO;/NCEP_1.0deg/cmean_1d.19790410 + CLIMO_STDEV_FILE &DATA_DIR_CLIMO;/NCEP_1.0deg/cstdv_1d.19790410 GRID_WEIGHT AREA \ diff --git a/src/libcode/vx_stat_out/stat_columns.cc b/src/libcode/vx_stat_out/stat_columns.cc index d971fe2cf7..ac8530f6a5 100644 --- a/src/libcode/vx_stat_out/stat_columns.cc +++ b/src/libcode/vx_stat_out/stat_columns.cc @@ -101,19 +101,19 @@ ConcatString append_climo_bin(const ConcatString &mask_name, void write_header_row(const char * const * cols, int n_cols, int hdr_flag, AsciiTable &at, int r, int c) { - int i; // Write the header column names if requested if(hdr_flag) { - for(i=0; in_obs; i++) { + for(int i=0; in_obs; i++) { // Set the observation valid time shc.set_obs_valid_beg(pd_ptr->vld_ta[i]); @@ -1700,7 +1699,6 @@ void write_isc_row(StatHdrColumns &shc, const ISCInfo &isc_info, STATOutputType out_type, AsciiTable &stat_at, int &stat_row, AsciiTable &txt_at, int &txt_row) { - int i; // ISC line type shc.set_line_type(stat_isc_str); @@ -1714,7 +1712,7 @@ void write_isc_row(StatHdrColumns &shc, const ISCInfo &isc_info, // Write a line for each scale plus one for the thresholded binary // field and one for the father wavelet - for(i=-1; i<=isc_info.n_scale; i++) { + for(int i=-1; i<=isc_info.n_scale; i++) { // Write the header columns write_header_cols(shc, stat_at, stat_row); @@ -1902,7 +1900,6 @@ void write_orank_row(StatHdrColumns &shc, const PairDataEnsemble *pd_ptr, STATOutputType out_type, AsciiTable &stat_at, int &stat_row, AsciiTable &txt_at, int &txt_row) { - int i; // Observation Rank line type shc.set_line_type(stat_orank_str); @@ -1914,7 +1911,7 @@ void write_orank_row(StatHdrColumns &shc, const PairDataEnsemble *pd_ptr, shc.set_alpha(bad_data_double); // Write a line for each ensemble pair - for(i=0; in_obs; i++) { + for(int i=0; in_obs; i++) { // Set the observation valid time shc.set_obs_valid_beg(pd_ptr->vld_ta[i]); @@ -1947,7 +1944,6 @@ void write_ssvar_row(StatHdrColumns &shc, const PairDataEnsemble *pd_ptr, double alpha, STATOutputType out_type, AsciiTable &stat_at, int &stat_row, AsciiTable &txt_at, int &txt_row) { - int i; // SSVAR line type shc.set_line_type(stat_ssvar_str); @@ -1961,7 +1957,7 @@ void write_ssvar_row(StatHdrColumns &shc, const PairDataEnsemble *pd_ptr, shc.set_alpha(alpha); // Write a line for each ssvar bin - for(i=0; issvar_bins[0].n_bin; i++) { + for(int i=0; issvar_bins[0].n_bin; i++) { // Write the header columns write_header_cols(shc, stat_at, stat_row); @@ -2088,7 +2084,7 @@ void write_fho_cols(const CTSInfo &cts_info, // O_RATE // at.set_entry(r, c+0, // Total Count - cts_info.cts.n()); + cts_info.cts.n_pairs()); at.set_entry(r, c+1, // Forecast Rate = FY/N cts_info.cts.f_rate()); @@ -2114,7 +2110,7 @@ void write_ctc_cols(const CTSInfo &cts_info, // FN_OY, FN_ON, EC_VALUE // at.set_entry(r, c+0, // Total Count - cts_info.cts.n()); + cts_info.cts.n_pairs()); at.set_entry(r, c+1, // FY_OY cts_info.cts.fy_oy()); @@ -2167,7 +2163,7 @@ void write_cts_cols(const CTSInfo &cts_info, int i, // EC_VALUE // at.set_entry(r, c+0, // Total count - cts_info.cts.n()); + cts_info.cts.n_pairs()); at.set_entry(r, c+1, // Base Rate (oy_tp) cts_info.baser.v); @@ -2805,15 +2801,14 @@ void write_cnt_cols(const CNTInfo &cnt_info, int i, void write_mctc_cols(const MCTSInfo &mcts_info, AsciiTable &at, int r, int c) { - int i, j, col; // // Multi-Category Contingency Table Counts // Dump out the MCTC line: // TOTAL, N_CAT, Fi_Oj, EC_VALUE // - at.set_entry(r, c+0, // Total Count - mcts_info.cts.total()); + 
at.set_entry(r, c+0, // Total number of pairs + mcts_info.cts.n_pairs()); at.set_entry(r, c+1, // Number of categories mcts_info.cts.nrows()); @@ -2821,8 +2816,9 @@ void write_mctc_cols(const MCTSInfo &mcts_info, // // Loop through the contingency table rows and columns // - for(i=0, col=c+2; irhist_na.n_elements(); i++) { + int col = c+2; + for(int i=0; irhist_na.n_elements(); i++) { at.set_entry(r, col, // RANK_i nint(pd_ptr->rhist_na[i])); @@ -4486,7 +4482,6 @@ void write_rhist_cols(const PairDataEnsemble *pd_ptr, void write_phist_cols(const PairDataEnsemble *pd_ptr, AsciiTable &at, int r, int c) { - int i, col; // // Probability Integral Transform Histogram @@ -4505,7 +4500,8 @@ void write_phist_cols(const PairDataEnsemble *pd_ptr, // // Write BIN_i count for each bin // - for(i=0, col=c+3; iphist_na.n_elements(); i++) { + int col = c+3; + for(int i=0; iphist_na.n_elements(); i++) { at.set_entry(r, col, // BIN_i nint(pd_ptr->phist_na[i])); @@ -4519,7 +4515,6 @@ void write_phist_cols(const PairDataEnsemble *pd_ptr, void write_orank_cols(const PairDataEnsemble *pd_ptr, int i, AsciiTable &at, int r, int c) { - int j, col; // // Ensemble Observation Rank Matched Pairs @@ -4573,7 +4568,8 @@ void write_orank_cols(const PairDataEnsemble *pd_ptr, int i, // // Write ENS_j for each ensemble member // - for(j=0, col=c+12; jn_ens; j++) { + int col = c+12; + for(int j=0; jn_ens; j++) { at.set_entry(r, col, // ENS_j pd_ptr->e_na[j][i]); @@ -4767,7 +4763,6 @@ void write_ssvar_cols(const PairDataEnsemble *pd_ptr, int i, void write_relp_cols(const PairDataEnsemble *pd_ptr, AsciiTable &at, int r, int c) { - int i, col; // // Relative Position @@ -4783,7 +4778,8 @@ void write_relp_cols(const PairDataEnsemble *pd_ptr, // // Write RELP_i count for each bin // - for(i=0, col=c+2; irelp_na.n_elements(); i++) { + int col = c+2; + for(int i=0; irelp_na.n_elements(); i++) { at.set_entry(r, col, // RELP_i pd_ptr->relp_na[i]); diff --git a/src/libcode/vx_statistics/compute_ci.cc b/src/libcode/vx_statistics/compute_ci.cc index 8296cc5d76..3633d61acb 100644 --- a/src/libcode/vx_statistics/compute_ci.cc +++ b/src/libcode/vx_statistics/compute_ci.cc @@ -61,14 +61,14 @@ void compute_normal_ci(double v, double alpha, double se, // //////////////////////////////////////////////////////////////////////// -void compute_proportion_ci(double p, int n, double alpha, double vif, +void compute_proportion_ci(double p, int n_pairs, double alpha, double vif, double &p_cl, double &p_cu) { // // Compute the confidence interval using the Wilson method for all // sizes of n, since it provides a better approximation // - compute_wilson_ci(p, n, alpha, vif, p_cl, p_cu); + compute_wilson_ci(p, n_pairs, alpha, vif, p_cl, p_cu); return; } @@ -81,7 +81,7 @@ void compute_proportion_ci(double p, int n, double alpha, double vif, // //////////////////////////////////////////////////////////////////////// -void compute_wald_ci(double p, int n, double alpha, double vif, +void compute_wald_ci(double p, int n_pairs, double alpha, double vif, double &p_cl, double &p_cu) { double v, cv_normal_l, cv_normal_u; @@ -100,7 +100,7 @@ void compute_wald_ci(double p, int n, double alpha, double vif, // // Compute the upper and lower bounds of the confidence interval // - v = vif*p*(1.0-p)/n; + v = vif*p*(1.0-p)/n_pairs; if(v < 0.0) { p_cl = bad_data_double; @@ -122,10 +122,10 @@ void compute_wald_ci(double p, int n, double alpha, double vif, // //////////////////////////////////////////////////////////////////////// -void compute_wilson_ci(double p, int 
n_int, double alpha, double vif, +void compute_wilson_ci(double p, int n_pairs, double alpha, double vif, double &p_cl, double &p_cu) { double v, cv_normal_l, cv_normal_u; - long long n = n_int; + long long n = n_pairs; if(is_bad_data(p)) { p_cl = p_cu = bad_data_double; diff --git a/src/libcode/vx_statistics/compute_ci.h b/src/libcode/vx_statistics/compute_ci.h index ddaf68d36a..5617ced8f3 100644 --- a/src/libcode/vx_statistics/compute_ci.h +++ b/src/libcode/vx_statistics/compute_ci.h @@ -28,13 +28,13 @@ static const int wald_sample_threshold = 100; extern void compute_normal_ci(double x, double alpha, double se, double &cl, double &cu); -extern void compute_proportion_ci(double p, int n, double alpha, +extern void compute_proportion_ci(double p, int n_pairs, double alpha, double vif, double &p_cl, double &p_cu); -extern void compute_wald_ci(double p, int n, double alpha, +extern void compute_wald_ci(double p, int n_pairs, double alpha, double vif, double &p_cl, double &p_cu); -extern void compute_wilson_ci(double p, int n, double alpha, +extern void compute_wilson_ci(double p, int n_pairs, double alpha, double vif, double &p_cl, double &p_cu); extern void compute_woolf_ci(double odds, double alpha, diff --git a/src/libcode/vx_statistics/compute_stats.cc b/src/libcode/vx_statistics/compute_stats.cc index 094587b732..3ebd4b9058 100644 --- a/src/libcode/vx_statistics/compute_stats.cc +++ b/src/libcode/vx_statistics/compute_stats.cc @@ -576,7 +576,7 @@ void compute_ctsinfo(const PairDataPoint &pd, const NumArray &i_na, // ClimoPntInfo cpi(pd.fcmn_na[j], pd.fcsd_na[j], pd.ocmn_na[j], pd.ocsd_na[j]); - cts_info.add(pd.f_na[j], pd.o_na[j], &cpi); + cts_info.add(pd.f_na[j], pd.o_na[j], pd.wgt_na[j], &cpi); } // end for i @@ -675,7 +675,7 @@ void compute_mctsinfo(const PairDataPoint &pd, const NumArray &i_na, // ClimoPntInfo cpi(pd.fcmn_na[j], pd.fcsd_na[j], pd.ocmn_na[j], pd.ocsd_na[j]); - mcts_info.add(pd.f_na[j], pd.o_na[j], &cpi); + mcts_info.add(pd.f_na[j], pd.o_na[j], pd.wgt_na[j], &cpi); } // end for i @@ -811,12 +811,12 @@ void compute_pctinfo(const PairDataPoint &pd, bool pstd_flag, // Check the observation thresholds and increment accordingly // if(pct_info.othresh.check(pd.o_na[i], &cpi)) { - pct_info.pct.inc_event(pd.f_na[i]); - if(cmn_flag) pct_info.climo_pct.inc_event(climo_prob[i]); + pct_info.pct.inc_event(pd.f_na[i], pd.wgt_na[i]); + if(cmn_flag) pct_info.climo_pct.inc_event(climo_prob[i], pd.wgt_na[i]); } else { - pct_info.pct.inc_nonevent(pd.f_na[i]); - if(cmn_flag) pct_info.climo_pct.inc_nonevent(climo_prob[i]); + pct_info.pct.inc_nonevent(pd.f_na[i], pd.wgt_na[i]); + if(cmn_flag) pct_info.climo_pct.inc_nonevent(climo_prob[i], pd.wgt_na[i]); } } // end for i diff --git a/src/libcode/vx_statistics/contable.cc b/src/libcode/vx_statistics/contable.cc index a5b2ab49a1..b0e2e2d5e1 100644 --- a/src/libcode/vx_statistics/contable.cc +++ b/src/libcode/vx_statistics/contable.cc @@ -74,6 +74,9 @@ ContingencyTable & ContingencyTable::operator+=(const ContingencyTable & t) { exit(1); } + // Increment the number of pairs + Npairs += t.Npairs; + // Increment table entries for(int i=0; i E; + // This is really a two-dimensional array (Nrows, Ncols) + std::vector E; - int Nrows; - int Ncols; + int Nrows; + int Ncols; - double ECvalue; + int Npairs; + double ECvalue; - ConcatString Name; + ConcatString Name; public: @@ -67,6 +69,7 @@ class ContingencyTable { virtual void set_size(int); virtual void set_size(int NR, int NC); + void set_n_pairs(int); void set_ec_value(double); void 
set_name(const char *); @@ -74,6 +77,7 @@ class ContingencyTable { int nrows() const; int ncols() const; + int n_pairs() const; double ec_value() const; ConcatString name() const; @@ -110,6 +114,7 @@ class ContingencyTable { inline int ContingencyTable::nrows() const { return Nrows; } inline int ContingencyTable::ncols() const { return Ncols; } +inline int ContingencyTable::n_pairs() const { return Npairs; } inline double ContingencyTable::ec_value() const { return ECvalue; } inline ConcatString ContingencyTable::name() const { return Name; } @@ -159,18 +164,16 @@ class Nx2ContingencyTable : public ContingencyTable { void inc_nonevent (double value, double weight=1.0); // Get table entries - double event_count_by_thresh(double) const; - double nonevent_count_by_thresh(double) const; + double event_total_by_thresh(double) const; + double nonevent_total_by_thresh(double) const; - double event_count_by_row(int row) const; - double nonevent_count_by_row(int row) const; + double event_total_by_row(int row) const; + double nonevent_total_by_row(int row) const; // Set counts void set_event(int row, double); void set_nonevent(int row, double); - double n() const; - // Column totals double event_col_total() const; double nonevent_col_total() const; @@ -202,8 +205,8 @@ class Nx2ContingencyTable : public ContingencyTable { //////////////////////////////////////////////////////////////////////// -inline double Nx2ContingencyTable::event_count_by_row (int row) const { return entry(row, nx2_event_column); } -inline double Nx2ContingencyTable::nonevent_count_by_row (int row) const { return entry(row, nx2_nonevent_column); } +inline double Nx2ContingencyTable::event_total_by_row (int row) const { return entry(row, nx2_event_column); } +inline double Nx2ContingencyTable::nonevent_total_by_row (int row) const { return entry(row, nx2_nonevent_column); } inline double Nx2ContingencyTable::event_col_total () const { return col_total(nx2_event_column); } inline double Nx2ContingencyTable::nonevent_col_total () const { return col_total(nx2_nonevent_column); } @@ -253,8 +256,6 @@ class TTContingencyTable : public ContingencyTable { double fn() const; double fy() const; - double n() const; - // FHO rates where: // f_rate = FY/N // h_rate = fy_oy/N diff --git a/src/libcode/vx_statistics/contable_nx2.cc b/src/libcode/vx_statistics/contable_nx2.cc index 15d6a18b67..b41ec7a798 100644 --- a/src/libcode/vx_statistics/contable_nx2.cc +++ b/src/libcode/vx_statistics/contable_nx2.cc @@ -90,12 +90,6 @@ void Nx2ContingencyTable::assign(const Nx2ContingencyTable & t) { //////////////////////////////////////////////////////////////////////// -double Nx2ContingencyTable::n() const { - return total(); -} - -//////////////////////////////////////////////////////////////////////// - void Nx2ContingencyTable::set_size(int N) { ContingencyTable::set_size(N, 2); return; @@ -205,11 +199,11 @@ void Nx2ContingencyTable::inc_nonevent(double t, double weight) { //////////////////////////////////////////////////////////////////////// -double Nx2ContingencyTable::event_count_by_thresh(double t) const { +double Nx2ContingencyTable::event_total_by_thresh(double t) const { int r = value_to_row(t); if(r < 0) { - mlog << Error << "\nNx2ContingencyTable::event_count_by_thresh(double) -> " + mlog << Error << "\nNx2ContingencyTable::event_total_by_thresh(double) -> " << "bad value ... 
" << t << "\n\n"; exit(1); } @@ -219,11 +213,11 @@ double Nx2ContingencyTable::event_count_by_thresh(double t) const { //////////////////////////////////////////////////////////////////////// -double Nx2ContingencyTable::nonevent_count_by_thresh(double t) const { +double Nx2ContingencyTable::nonevent_total_by_thresh(double t) const { int r = value_to_row(t); if(r < 0) { - mlog << Error << "\nNx2ContingencyTable::nonevent_count_by_thresh(double) -> " + mlog << Error << "\nNx2ContingencyTable::nonevent_total_by_thresh(double) -> " << "bad value ... " << t << "\n\n"; exit(1); } @@ -241,6 +235,8 @@ void Nx2ContingencyTable::set_event(int row, double value) { exit(1); } + // Number of pairs defined by set_n_pairs(int) + set_entry(row, nx2_event_column, value); return; @@ -256,6 +252,8 @@ void Nx2ContingencyTable::set_nonevent(int row, double value) { exit(1); } + // Number of pairs defined by set_n_pairs(int) + set_entry(row, nx2_nonevent_column, value); return; @@ -264,7 +262,7 @@ void Nx2ContingencyTable::set_nonevent(int row, double value) { //////////////////////////////////////////////////////////////////////// double Nx2ContingencyTable::baser() const { - return compute_proportion(event_col_total(), n()); + return compute_proportion(event_col_total(), total()); } //////////////////////////////////////////////////////////////////////// @@ -273,7 +271,7 @@ double Nx2ContingencyTable::baser_ci(double alpha, double &cl, double &cu) const { double v = baser(); - compute_proportion_ci(v, n(), alpha, 1.0, cl, cu); + compute_proportion_ci(v, Npairs, alpha, 1.0, cl, cu); return v; } @@ -285,27 +283,27 @@ double Nx2ContingencyTable::brier_score() const { if(E.empty()) return bad_data_double; double sum = 0.0; - double count; + double row_total; double yi; double t; // Terms for event for(int j=0; j 1 so that degf > 0 in the call to gsl_cdf_tdist_Pinv() @@ -337,8 +335,8 @@ double Nx2ContingencyTable::brier_ci_halfwidth(double alpha) const { for(int j=0; j 0) { + if(cts_info[m].cts.n_pairs() == 0) continue; + // Write out FHO + if(conf_info.vx_opt[i].output_flag[i_fho] != STATOutputType::None) { write_fho_row(shc, cts_info[m], conf_info.vx_opt[i].output_flag[i_fho], stat_at, i_stat_row, @@ -964,9 +965,7 @@ void process_scores() { } // Write out CTC - if(conf_info.vx_opt[i].output_flag[i_ctc] != STATOutputType::None && - cts_info[m].cts.n() > 0) { - + if(conf_info.vx_opt[i].output_flag[i_ctc] != STATOutputType::None) { write_ctc_row(shc, cts_info[m], conf_info.vx_opt[i].output_flag[i_ctc], stat_at, i_stat_row, @@ -974,9 +973,7 @@ void process_scores() { } // Write out CTS - if(conf_info.vx_opt[i].output_flag[i_cts] != STATOutputType::None && - cts_info[m].cts.n() > 0) { - + if(conf_info.vx_opt[i].output_flag[i_cts] != STATOutputType::None) { write_cts_row(shc, cts_info[m], conf_info.vx_opt[i].output_flag[i_cts], stat_at, i_stat_row, @@ -984,9 +981,7 @@ void process_scores() { } // Write out ECLV - if(conf_info.vx_opt[i].output_flag[i_eclv] != STATOutputType::None && - cts_info[m].cts.n() > 0) { - + if(conf_info.vx_opt[i].output_flag[i_eclv] != STATOutputType::None) { write_eclv_row(shc, cts_info[m], conf_info.vx_opt[i].eclv_points, conf_info.vx_opt[i].output_flag[i_eclv], stat_at, i_stat_row, @@ -1007,10 +1002,10 @@ void process_scores() { // Compute MCTS do_mcts(mcts_info, i, &pd); - // Write out MCTC - if(conf_info.vx_opt[i].output_flag[i_mctc] != STATOutputType::None && - mcts_info.cts.total() > 0) { + if(mcts_info.cts.n_pairs() == 0) continue; + // Write out MCTC + 
if(conf_info.vx_opt[i].output_flag[i_mctc] != STATOutputType::None) { write_mctc_row(shc, mcts_info, conf_info.vx_opt[i].output_flag[i_mctc], stat_at, i_stat_row, @@ -1018,9 +1013,7 @@ void process_scores() { } // Write out MCTS - if(conf_info.vx_opt[i].output_flag[i_mcts] != STATOutputType::None && - mcts_info.cts.total() > 0) { - + if(conf_info.vx_opt[i].output_flag[i_mcts] != STATOutputType::None) { write_mcts_row(shc, mcts_info, conf_info.vx_opt[i].output_flag[i_mcts], stat_at, i_stat_row, @@ -1713,7 +1706,7 @@ void process_scores() { for(n=0; n 0 - if(nbrcts_info[n].cts_info.cts.n() > 0) { + if(nbrcts_info[n].cts_info.cts.n_pairs() > 0) { // Write out NBRCTC if(conf_info.vx_opt[i].output_flag[i_nbrctc] != STATOutputType::None) { @@ -2481,7 +2474,7 @@ void do_pct(const GridStatVxOpt &vx_opt, const PairDataPoint *pd_ptr) { } // Compute the probabilistic counts and statistics - compute_pctinfo(pd, ( STATOutputType::None!=vx_opt.output_flag[i_pstd]), pct_info[j]); + compute_pctinfo(pd, (STATOutputType::None!=vx_opt.output_flag[i_pstd]), pct_info[j]); // Check for no matched pairs to process if(pd.n_obs == 0) continue; diff --git a/src/tools/core/grid_stat/grid_stat_conf_info.cc b/src/tools/core/grid_stat/grid_stat_conf_info.cc index 19c5a48e83..d334804850 100644 --- a/src/tools/core/grid_stat/grid_stat_conf_info.cc +++ b/src/tools/core/grid_stat/grid_stat_conf_info.cc @@ -228,6 +228,19 @@ void GridStatConfInfo::process_config(GrdFileType ftype, // Summarize output flags across all verification tasks process_flags(); + // FHO output is not compatible with grid weights + if(output_flag[i_fho] != STATOutputType::None && + grid_weight_flag != GridWeightType::None) { + + mlog << Warning << "\nGridStatConfInfo::process_config() -> " + << "Disabling FHO output that is not compatible with grid weighting. 
" + << "Set \"grid_weight_flag = NONE\" to write FHO output.\n\n"; + + // Disable FHO output + for(i=0; i 0) { + if(mcts_info.cts.n_pairs() == 0) continue; + // Write out MCTC + if(conf_info.vx_opt[i_vx].output_flag[i_mctc] != STATOutputType::None) { write_mctc_row(shc, mcts_info, conf_info.vx_opt[i_vx].output_flag[i_mctc], stat_at, i_stat_row, @@ -1212,9 +1212,7 @@ void process_scores() { } // Write out MCTS - if(conf_info.vx_opt[i_vx].output_flag[i_mcts] != STATOutputType::None && - mcts_info.cts.total() > 0) { - + if(conf_info.vx_opt[i_vx].output_flag[i_mcts] != STATOutputType::None) { write_mcts_row(shc, mcts_info, conf_info.vx_opt[i_vx].output_flag[i_mcts], stat_at, i_stat_row, diff --git a/src/tools/core/series_analysis/series_analysis.cc b/src/tools/core/series_analysis/series_analysis.cc index a817768c3a..fa2d4a8ef7 100644 --- a/src/tools/core/series_analysis/series_analysis.cc +++ b/src/tools/core/series_analysis/series_analysis.cc @@ -1499,8 +1499,12 @@ void read_aggr_mctc(int n, const MCTSInfo &mcts_info, // Get the n-th value double v = aggr_data[var_name].buf()[n]; + // Store the number of pairs + if(c == "TOTAL" && !is_bad_data(v)) { + aggr_mcts.cts.set_n_pairs(nint(v)); + } // Check the number of categories - if(c == "N_CAT" && !is_bad_data(v) && + else if(c == "N_CAT" && !is_bad_data(v) && aggr_mcts.cts.nrows() != nint(v)) { mlog << Error << "\nread_aggr_mctc() -> " << "the number of MCTC categories do not match (" @@ -1618,8 +1622,12 @@ void read_aggr_pct(int n, const PCTInfo &pct_info, // Get the n-th value double v = aggr_data[var_name].buf()[n]; + // Store the number of pairs + if(c == "TOTAL" && !is_bad_data(v)) { + aggr_pct.pct.set_n_pairs(nint(v)); + } // Check the number of thresholds - if(c == "N_THRESH" && !is_bad_data(v) && + else if(c == "N_THRESH" && !is_bad_data(v) && (aggr_pct.pct.nrows()+1) != nint(v)) { mlog << Error << "\nread_aggr_pct() -> " << "the number of PCT thresholds do not match (" diff --git a/src/tools/core/stat_analysis/aggr_stat_line.cc b/src/tools/core/stat_analysis/aggr_stat_line.cc index 6c0a7add52..3c6dcd3f22 100644 --- a/src/tools/core/stat_analysis/aggr_stat_line.cc +++ b/src/tools/core/stat_analysis/aggr_stat_line.cc @@ -589,12 +589,15 @@ void aggr_summary_lines(LineDataFile &f, STATAnalysisJob &job, int &n_in, int &n_out) { STATLine line; AggrSummaryInfo aggr; - ConcatString key, cs; - StringArray sa, req_stat, req_lty, req_col; + ConcatString cs; + StringArray sa; + StringArray req_stat; + StringArray req_lty; + StringArray req_col; STATLineType lty; NumArray empty_na; - int i, n_add; - double v, w; + double v; + double w; // // Objects for derived statistics @@ -607,7 +610,7 @@ void aggr_summary_lines(LineDataFile &f, STATAnalysisJob &job, // // Build list of requested line types and column names // - for(i=0; i::iterator it; // @@ -860,7 +862,7 @@ void aggr_ctc_lines(LineDataFile &f, STATAnalysisJob &job, // // Build the map key for the current line // - key = job.get_case_info(line); + ConcatString key(job.get_case_info(line)); // // Add a new map entry, if necessary @@ -888,14 +890,7 @@ void aggr_ctc_lines(LineDataFile &f, STATAnalysisJob &job, // Increment counts in the existing map entry // else { - m[key].cts_info.cts.set_fy_oy(m[key].cts_info.cts.fy_oy() + - cur.cts.fy_oy()); - m[key].cts_info.cts.set_fy_on(m[key].cts_info.cts.fy_on() + - cur.cts.fy_on()); - m[key].cts_info.cts.set_fn_oy(m[key].cts_info.cts.fn_oy() + - cur.cts.fn_oy()); - m[key].cts_info.cts.set_fn_on(m[key].cts_info.cts.fn_on() + - cur.cts.fn_on()); + 
m[key].cts_info.cts += cur.cts; } // @@ -971,7 +966,8 @@ void aggr_ctc_lines(LineDataFile &f, STATAnalysisJob &job, // // Sort the valid times // - n = it->second.valid_ts.rank_array(n_ties); + int n_ties; + int n = it->second.valid_ts.rank_array(n_ties); if(n_ties > 0 || n != it->second.valid_ts.n()) { mlog << Error << "\naggr_ctc_lines() -> " @@ -1019,9 +1015,7 @@ void aggr_mctc_lines(LineDataFile &f, STATAnalysisJob &job, STATLine line; AggrMCTCInfo aggr; MCTSInfo cur; - ConcatString key; unixtime ut; - int i, k, n, n_ties; map::iterator it; // @@ -1056,7 +1050,7 @@ void aggr_mctc_lines(LineDataFile &f, STATAnalysisJob &job, // // Build the map key for the current line // - key = job.get_case_info(line); + ConcatString key(job.get_case_info(line)); // // Add a new map entry, if necessary @@ -1093,8 +1087,8 @@ void aggr_mctc_lines(LineDataFile &f, STATAnalysisJob &job, // // Increment the counts // - for(i=0; isecond.valid_ts.rank_array(n_ties); + int n_ties; + int n = it->second.valid_ts.rank_array(n_ties); if(n_ties > 0 || n != it->second.valid_ts.n()) { mlog << Error << "\naggr_mctc_lines() -> " @@ -1199,9 +1194,7 @@ void aggr_pct_lines(LineDataFile &f, STATAnalysisJob &job, STATLine line; AggrPCTInfo aggr; PCTInfo cur; - ConcatString key; unixtime ut; - int i, n, oy, on, n_ties; map::iterator it; // @@ -1236,7 +1229,7 @@ void aggr_pct_lines(LineDataFile &f, STATAnalysisJob &job, // // Build the map key for the current line // - key = job.get_case_info(line); + ConcatString key(job.get_case_info(line)); // // Add a new map entry, if necessary @@ -1256,45 +1249,8 @@ void aggr_pct_lines(LineDataFile &f, STATAnalysisJob &job, // Increment counts in the existing map entry // else { - - // - // The size of the contingency table must remain the same - // - if(m[key].pct_info.pct.nrows() != cur.pct.nrows()) { - mlog << Error << "\naggr_pct_lines() -> " - << "when aggregating PCT lines the number of " - << "thresholds must remain the same for all lines, " - << m[key].pct_info.pct.nrows() << " != " - << cur.pct.nrows() << "\n\n"; - throw 1; - } - - // - // Increment the counts - // - for(i=0; i " - << "when aggregating PCT lines the threshold " - << "values must remain the same for all lines, " - << m[key].pct_info.pct.threshold(i) << " != " - << cur.pct.threshold(i) << "\n\n"; - throw 1; - } - - oy = m[key].pct_info.pct.event_count_by_row(i); - on = m[key].pct_info.pct.nonevent_count_by_row(i); - - m[key].pct_info.pct.set_entry(i, nx2_event_column, - oy + cur.pct.event_count_by_row(i)); - m[key].pct_info.pct.set_entry(i, nx2_nonevent_column, - on + cur.pct.nonevent_count_by_row(i)); - } // end for i - } // end else + m[key].pct_info.pct += cur.pct; + } // // Keep track of scores for each time step for VIF @@ -1362,7 +1318,8 @@ void aggr_pct_lines(LineDataFile &f, STATAnalysisJob &job, // // Sort the valid times // - n = it->second.valid_ts.rank_array(n_ties); + int n_ties; + int n = it->second.valid_ts.rank_array(n_ties); if(n_ties > 0 || n != it->second.valid_ts.n()) { mlog << Error << "\naggr_pct_lines() -> " @@ -1399,9 +1356,7 @@ void aggr_psum_lines(LineDataFile &f, STATAnalysisJob &job, VL1L2Info cur_vl1l2; NBRCNTInfo cur_nbrcnt; CNTInfo cur_cnt; - ConcatString key; unixtime ut; - int n, n_ties; map::iterator it; // @@ -1467,7 +1422,7 @@ void aggr_psum_lines(LineDataFile &f, STATAnalysisJob &job, // // Build the map key for the current line // - key = job.get_case_info(line); + ConcatString key(job.get_case_info(line)); // // Add a new map entry, if necessary @@ -1563,7 +1518,8 @@ void 
aggr_psum_lines(LineDataFile &f, STATAnalysisJob &job, // // Sort the valid times // - n = it->second.valid_ts.rank_array(n_ties); + int n_ties; + int n = it->second.valid_ts.rank_array(n_ties); if(n_ties > 0 || n != it->second.valid_ts.n()) { mlog << Error << "\naggr_psum_lines() -> " @@ -1599,7 +1555,6 @@ void aggr_grad_lines(LineDataFile &f, STATAnalysisJob &job, STATLine line; AggrGRADInfo aggr; GRADInfo cur; - ConcatString key; map::iterator it; // @@ -1630,7 +1585,7 @@ void aggr_grad_lines(LineDataFile &f, STATAnalysisJob &job, // // Build the map key for the current line // - key = job.get_case_info(line); + ConcatString key(job.get_case_info(line)); // // Add a new map entry, if necessary @@ -1689,8 +1644,10 @@ void aggr_wind_lines(LineDataFile &f, STATAnalysisJob &job, STATLine line; AggrWindInfo aggr; VL1L2Info cur; - ConcatString key; - double uf, vf, uo, vo; + double uf; + double vf; + double uo; + double vo; // // Process the STAT lines @@ -1745,7 +1702,7 @@ void aggr_wind_lines(LineDataFile &f, STATAnalysisJob &job, // // Build the map key for the current line // - key = job.get_case_info(line); + ConcatString key(job.get_case_info(line)); // // Add a new map entry, if necessary @@ -1795,14 +1752,7 @@ void aggr_mpr_wind_lines(LineDataFile &f, STATAnalysisJob &job, AggrWindInfo aggr; VL1L2Info v_info; MPRData cur; - ConcatString hdr, key; - double uf, uo, ufcmn, ufcsd, uocmn, uocsd; - double vf, vo, vfcmn, vfcsd, vocmn, vocsd; - double fcst_wind, obs_wind; - double fcmn_wind, fcsd_wind; - double ocmn_wind, ocsd_wind; - bool is_ugrd; - int i; + ConcatString hdr; map::iterator it; // @@ -1819,19 +1769,19 @@ void aggr_mpr_wind_lines(LineDataFile &f, STATAnalysisJob &job, job.dump_stat_line(line); parse_mpr_line(line, cur); - is_ugrd = (cur.fcst_var == ugrd_abbr_str); - uf = (is_ugrd ? cur.fcst : bad_data_double); - uo = (is_ugrd ? cur.obs : bad_data_double); - ufcmn = (is_ugrd ? cur.fcst_climo_mean : bad_data_double); - ufcsd = (is_ugrd ? cur.fcst_climo_stdev : bad_data_double); - uocmn = (is_ugrd ? cur.obs_climo_mean : bad_data_double); - uocsd = (is_ugrd ? cur.obs_climo_stdev : bad_data_double); - vf = (is_ugrd ? bad_data_double : cur.fcst); - vo = (is_ugrd ? bad_data_double : cur.obs); - vfcmn = (is_ugrd ? bad_data_double : cur.fcst_climo_mean); - vfcsd = (is_ugrd ? bad_data_double : cur.fcst_climo_stdev); - vocmn = (is_ugrd ? bad_data_double : cur.obs_climo_mean); - vocsd = (is_ugrd ? bad_data_double : cur.obs_climo_stdev); + bool is_ugrd = (cur.fcst_var == ugrd_abbr_str); + double uf = (is_ugrd ? cur.fcst : bad_data_double); + double uo = (is_ugrd ? cur.obs : bad_data_double); + double ufcmn = (is_ugrd ? cur.fcst_climo_mean : bad_data_double); + double ufcsd = (is_ugrd ? cur.fcst_climo_stdev : bad_data_double); + double uocmn = (is_ugrd ? cur.obs_climo_mean : bad_data_double); + double uocsd = (is_ugrd ? cur.obs_climo_stdev : bad_data_double); + double vf = (is_ugrd ? bad_data_double : cur.fcst); + double vo = (is_ugrd ? bad_data_double : cur.obs); + double vfcmn = (is_ugrd ? bad_data_double : cur.fcst_climo_mean); + double vfcsd = (is_ugrd ? bad_data_double : cur.fcst_climo_stdev); + double vocmn = (is_ugrd ? bad_data_double : cur.obs_climo_mean); + double vocsd = (is_ugrd ? 
bad_data_double : cur.obs_climo_stdev); // // Build header string for matching UGRD and VGRD lines @@ -1860,7 +1810,7 @@ void aggr_mpr_wind_lines(LineDataFile &f, STATAnalysisJob &job, // // Build the map key for the current line // - key = job.get_case_info(line); + ConcatString key(job.get_case_info(line)); // // Add a new map entry, if necessary @@ -1907,6 +1857,7 @@ void aggr_mpr_wind_lines(LineDataFile &f, STATAnalysisJob &job, // // Add data for existing header entry // + int i; if(m[key].hdr_sa.has(hdr, i)) { // @@ -1989,7 +1940,7 @@ void aggr_mpr_wind_lines(LineDataFile &f, STATAnalysisJob &job, // // Loop over the pairs for the current map entry // - for(i=0; isecond.hdr_sa.n(); i++) { + for(int i=0; isecond.hdr_sa.n(); i++) { // // Check for missing UGRD data @@ -2020,18 +1971,18 @@ void aggr_mpr_wind_lines(LineDataFile &f, STATAnalysisJob &job, job.out_obs_wind_thresh.get_type() != thresh_na) { // Compute wind speeds - fcst_wind = convert_u_v_to_wind(it->second.pd_u.f_na[i], - it->second.pd_v.f_na[i]); - obs_wind = convert_u_v_to_wind(it->second.pd_u.o_na[i], - it->second.pd_v.o_na[i]); - fcmn_wind = convert_u_v_to_wind(it->second.pd_u.fcmn_na[i], - it->second.pd_v.fcmn_na[i]); - fcsd_wind = convert_u_v_to_wind(it->second.pd_u.fcsd_na[i], - it->second.pd_v.fcsd_na[i]); - ocmn_wind = convert_u_v_to_wind(it->second.pd_u.ocmn_na[i], - it->second.pd_v.ocmn_na[i]); - ocsd_wind = convert_u_v_to_wind(it->second.pd_u.ocsd_na[i], - it->second.pd_v.ocsd_na[i]); + double fcst_wind = convert_u_v_to_wind(it->second.pd_u.f_na[i], + it->second.pd_v.f_na[i]); + double obs_wind = convert_u_v_to_wind(it->second.pd_u.o_na[i], + it->second.pd_v.o_na[i]); + double fcmn_wind = convert_u_v_to_wind(it->second.pd_u.fcmn_na[i], + it->second.pd_v.fcmn_na[i]); + double fcsd_wind = convert_u_v_to_wind(it->second.pd_u.fcsd_na[i], + it->second.pd_v.fcsd_na[i]); + double ocmn_wind = convert_u_v_to_wind(it->second.pd_u.ocmn_na[i], + it->second.pd_v.ocmn_na[i]); + double ocsd_wind = convert_u_v_to_wind(it->second.pd_u.ocsd_na[i], + it->second.pd_v.ocsd_na[i]); // Store climo data ClimoPntInfo cpi(fcmn_wind, fcsd_wind, ocmn_wind, ocsd_wind); @@ -2094,8 +2045,12 @@ void aggr_mpr_wind_lines(LineDataFile &f, STATAnalysisJob &job, // ClimoPntInfo cpi; aggr.hdr_sa.add(it->second.hdr_sa[i]); + double uf; + double vf; convert_u_v_to_unit(it->second.pd_u.f_na[i], it->second.pd_v.f_na[i], uf, vf); + double uo; + double vo; convert_u_v_to_unit(it->second.pd_u.o_na[i], it->second.pd_v.o_na[i], uo, vo); aggr.pd_u.add_grid_pair(uf, uo, cpi, default_grid_weight); @@ -2119,7 +2074,6 @@ void aggr_mpr_lines(LineDataFile &f, STATAnalysisJob &job, STATLine line; AggrMPRInfo aggr; MPRData cur; - ConcatString key; // // Process the STAT lines @@ -2157,7 +2111,7 @@ void aggr_mpr_lines(LineDataFile &f, STATAnalysisJob &job, // // Build the map key for the current line // - key = job.get_case_info(line); + ConcatString key(job.get_case_info(line)); // // Add a new map entry, if necessary @@ -2248,9 +2202,8 @@ void aggr_isc_lines(LineDataFile &ldf, STATAnalysisJob &job, STATLine line; AggrISCInfo aggr; ISCInfo cur; - ConcatString key; - int i, k, iscale; - double total, w, den, baser_fbias_sum; + int iscale; + double den; map::iterator it; // @@ -2289,7 +2242,7 @@ void aggr_isc_lines(LineDataFile &ldf, STATAnalysisJob &job, // // Build the map key for the current line // - key = job.get_case_info(line); + ConcatString key(job.get_case_info(line)); // // Add a new map entry, if necessary @@ -2398,20 +2351,20 @@ void 
aggr_isc_lines(LineDataFile &ldf, STATAnalysisJob &job, // Get the sum of the totals, compute the weight, and sum the // weighted scores // - for(i=0; isecond.isc_info.n_scale+2; i++) { + for(int i=0; isecond.isc_info.n_scale+2; i++) { // Total number of points for this scale - total = it->second.total_na[i].sum(); + double total = it->second.total_na[i].sum(); // Initialize - baser_fbias_sum = 0.0; + double baser_fbias_sum = 0.0; // Loop through all scores for this scale - for(k=0; ksecond.total_na[i].n(); k++) { + for(int k=0; ksecond.total_na[i].n(); k++) { // Compute the weight for each score to be aggregated // based on the number of points it represents - w = it->second.total_na[i][k]/total; + double w = it->second.total_na[i][k]/total; // Sum scores for the binary fields if(i == 0) { @@ -2507,8 +2460,7 @@ void aggr_ecnt_lines(LineDataFile &f, STATAnalysisJob &job, STATLine line; AggrENSInfo aggr; ECNTData cur; - ConcatString key; - double crps_emp, crps_emp_fair, spread_md, crpscl_emp, crps_gaus, crpscl_gaus, v; + double v; map::iterator it; // @@ -2540,7 +2492,7 @@ void aggr_ecnt_lines(LineDataFile &f, STATAnalysisJob &job, // // Build the map key for the current line // - key = job.get_case_info(line); + ConcatString key(job.get_case_info(line)); // // Add a new map entry, if necessary @@ -2626,12 +2578,10 @@ void aggr_ecnt_lines(LineDataFile &f, STATAnalysisJob &job, v = it->second.mse_oerr_na.wmean(it->second.ens_pd.wgt_na); it->second.ens_pd.rmse_oerr = (is_bad_data(v) ? bad_data_double : sqrt(v)); - crps_emp = it->second.ens_pd.crps_emp_na.wmean(it->second.ens_pd.wgt_na); - crps_emp_fair = it->second.ens_pd.crps_emp_fair_na.wmean(it->second.ens_pd.wgt_na); - spread_md = it->second.ens_pd.spread_md_na.wmean(it->second.ens_pd.wgt_na); - crpscl_emp = it->second.ens_pd.crpscl_emp_na.wmean(it->second.ens_pd.wgt_na); - crps_gaus = it->second.ens_pd.crps_gaus_na.wmean(it->second.ens_pd.wgt_na); - crpscl_gaus = it->second.ens_pd.crpscl_gaus_na.wmean(it->second.ens_pd.wgt_na); + double crps_emp = it->second.ens_pd.crps_emp_na.wmean(it->second.ens_pd.wgt_na); + double crpscl_emp = it->second.ens_pd.crpscl_emp_na.wmean(it->second.ens_pd.wgt_na); + double crps_gaus = it->second.ens_pd.crps_gaus_na.wmean(it->second.ens_pd.wgt_na); + double crpscl_gaus = it->second.ens_pd.crpscl_gaus_na.wmean(it->second.ens_pd.wgt_na); // Compute aggregated empirical CRPSS it->second.ens_pd.crpss_emp = @@ -2656,7 +2606,6 @@ void aggr_rps_lines(LineDataFile &f, STATAnalysisJob &job, STATLine line; AggrRPSInfo aggr; RPSInfo cur; - ConcatString key; map::iterator it; // @@ -2691,7 +2640,7 @@ void aggr_rps_lines(LineDataFile &f, STATAnalysisJob &job, // // Build the map key for the current line // - key = job.get_case_info(line); + ConcatString key(job.get_case_info(line)); // // Add a new map entry, if necessary @@ -2750,8 +2699,6 @@ void aggr_rhist_lines(LineDataFile &f, STATAnalysisJob &job, STATLine line; AggrENSInfo aggr; RHISTData cur; - ConcatString key; - int i; map::iterator it; // @@ -2783,14 +2730,14 @@ void aggr_rhist_lines(LineDataFile &f, STATAnalysisJob &job, // // Build the map key for the current line // - key = job.get_case_info(line); + ConcatString key(job.get_case_info(line)); // // Add a new map entry, if necessary // if(m.count(key) == 0) { aggr.clear(); - for(i=0; i::iterator it; // @@ -2873,7 +2818,7 @@ void aggr_phist_lines(LineDataFile &f, STATAnalysisJob &job, // // Build the map key for the current line // - key = job.get_case_info(line); + ConcatString key(job.get_case_info(line)); 
// // Add a new map entry, if necessary @@ -2906,7 +2851,7 @@ void aggr_phist_lines(LineDataFile &f, STATAnalysisJob &job, // // Aggregate the probability integral transform histogram counts // - for(i=0; i::iterator it; // @@ -2964,7 +2907,7 @@ void aggr_relp_lines(LineDataFile &f, STATAnalysisJob &job, // // Build the map key for the current line // - key = job.get_case_info(line); + ConcatString key(job.get_case_info(line)); // // Add a new map entry, if necessary @@ -2996,7 +2939,7 @@ void aggr_relp_lines(LineDataFile &f, STATAnalysisJob &job, // // Aggregate the RELP histogram counts // - for(i=0; i::iterator it; // @@ -3056,7 +2996,7 @@ void aggr_orank_lines(LineDataFile &f, STATAnalysisJob &job, // // Build the map key for the current line // - key = job.get_case_info(line); + ConcatString key(job.get_case_info(line)); // // Skip missing data @@ -3073,10 +3013,10 @@ void aggr_orank_lines(LineDataFile &f, STATAnalysisJob &job, aggr.ens_pd.obs_error_flag = !is_bad_data(cur.ens_mean_oerr); aggr.ens_pd.set_ens_size(cur.n_ens); aggr.ens_pd.extend(cur.total); - for(i=0; i thresh(n); - for(i=0; i&m) { << it->second.Info.cts.fn_on() << " correct negatives.\n"; // Increment the counts for the existing key - RIRWMap[it->first].Info.cts.set_fy_oy( - RIRWMap[it->first].Info.cts.fy_oy() + - it->second.Info.cts.fy_oy()); - RIRWMap[it->first].Info.cts.set_fy_on( - RIRWMap[it->first].Info.cts.fy_on() + - it->second.Info.cts.fy_on()); - RIRWMap[it->first].Info.cts.set_fn_oy( - RIRWMap[it->first].Info.cts.fn_oy() + - it->second.Info.cts.fn_oy()); - RIRWMap[it->first].Info.cts.set_fn_on( - RIRWMap[it->first].Info.cts.fn_on() + - it->second.Info.cts.fn_on()); + RIRWMap[it->first].Info.cts += it->second.Info.cts; RIRWMap[it->first].Hdr.add_uniq(it->second.Hdr); RIRWMap[it->first].AModel.add_uniq(it->second.AModel);
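
The contable.h, contable.cc, stat_columns.cc, and compute_stats.cc changes in this patch share one theme: table cells now accumulate weighted sums (each matched pair contributes its grid weight), while a separate integer pair count (Npairs, exposed through set_n_pairs() and n_pairs()) preserves the raw TOTAL, and aggregation goes through operator+= rather than per-cell setters. A minimal sketch of that pattern, using hypothetical class and member names rather than MET's actual ContingencyTable API:

   #include <cstdio>

   // Minimal 2x2 categorical table: the four cells hold weighted sums,
   // while n_pairs counts raw matched pairs (hypothetical class, loosely
   // modeled on the pattern in this patch, not MET's ContingencyTable).
   class Table2x2 {
    public:
      double fy_oy = 0.0, fy_on = 0.0, fn_oy = 0.0, fn_on = 0.0;
      int    n_pairs = 0;

      // Add one forecast/observation pair with its grid weight.
      void add(bool fcst_event, bool obs_event, double wgt) {
         if      ( fcst_event &&  obs_event) fy_oy += wgt;
         else if ( fcst_event && !obs_event) fy_on += wgt;
         else if (!fcst_event &&  obs_event) fn_oy += wgt;
         else                                fn_on += wgt;
         n_pairs++;   // unweighted count of pairs for the TOTAL column
      }

      // Aggregate another table: sum weighted cells and pair counts.
      Table2x2 & operator+=(const Table2x2 &t) {
         fy_oy   += t.fy_oy;   fy_on += t.fy_on;
         fn_oy   += t.fn_oy;   fn_on += t.fn_on;
         n_pairs += t.n_pairs;
         return *this;
      }

      double total() const { return fy_oy + fy_on + fn_oy + fn_on; }
   };

   int main() {
      Table2x2 a, b;
      a.add(true,  true,  0.7);   // hit with cos-lat weight 0.7
      a.add(false, true,  1.0);   // miss with weight 1.0
      b.add(true,  false, 0.9);   // false alarm in a second table
      a += b;                     // aggregate, as aggr_ctc_lines() now does
      std::printf("n_pairs = %d, weighted total = %.2f\n", a.n_pairs, a.total());
      return 0;
   }

Keeping the unweighted pair count separate from the weighted cells is what lets the TOTAL column keep reporting the number of matched pairs even when COS_LAT or AREA grid weights scale the cell totals.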
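
compute_ci.cc and compute_ci.h rename the sample-size argument of the proportion CI functions to n_pairs, and compute_proportion_ci() continues to delegate to the Wilson method for all sample sizes because it approximates better than the Wald interval. For reference, a standalone sketch of the textbook Wilson score interval; MET's compute_wilson_ci() additionally applies a variance inflation factor and derives the critical value from alpha, neither of which is reproduced here:

   #include <cmath>
   #include <cstdio>

   // Wilson score interval for a proportion p estimated from n pairs.
   // z is the standard normal critical value (e.g., 1.96 for a 95% CI).
   void wilson_ci(double p, long long n, double z, double &p_cl, double &p_cu) {
      const double z2     = z * z;
      const double denom  = 1.0 + z2 / n;
      const double center = (p + z2 / (2.0 * n)) / denom;
      const double half   = (z / denom) *
                            std::sqrt(p * (1.0 - p) / n + z2 / (4.0 * n * n));
      p_cl = center - half;
      p_cu = center + half;
   }

   int main() {
      double cl, cu;
      wilson_ci(0.3, 50, 1.96, cl, cu);   // e.g., base rate 0.3 from 50 pairs
      std::printf("95%% CI: [%.3f, %.3f]\n", cl, cu);
      return 0;
   }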
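
contable.h and contable_nx2.cc also switch the Nx2 probabilistic table to weighted totals: inc_event() and inc_nonevent() take an optional weight, the per-row accessors are renamed from *_count_* to *_total_*, and the Brier score is computed from those weighted row totals. A rough sketch of a weighted Nx2 table and the corresponding Brier computation; the bin lookup and class layout here are simplified stand-ins, not MET's implementation:

   #include <cmath>
   #include <cstdio>
   #include <vector>

   // Simplified Nx2 probability table: one row per forecast-probability
   // bin, with weighted event and nonevent totals (illustrative only).
   struct ProbTable {
      std::vector<double> p_mid;      // probability value assigned to each bin
      std::vector<double> event;      // weighted total of observed events
      std::vector<double> nonevent;   // weighted total of observed nonevents

      explicit ProbTable(const std::vector<double> &mids)
         : p_mid(mids), event(mids.size(), 0.0), nonevent(mids.size(), 0.0) {}

      // Find the bin whose probability value is closest to p.
      int bin_for(double p) const {
         int best = 0;
         for (int i = 1; i < (int) p_mid.size(); i++)
            if (std::fabs(p - p_mid[i]) < std::fabs(p - p_mid[best])) best = i;
         return best;
      }

      void inc_event   (double p, double w = 1.0) { event   [bin_for(p)] += w; }
      void inc_nonevent(double p, double w = 1.0) { nonevent[bin_for(p)] += w; }

      // Brier score from the weighted totals:
      //   BS = ( sum_i e_i*(p_i - 1)^2 + ne_i*(p_i - 0)^2 ) / W
      double brier_score() const {
         double sum = 0.0, w_total = 0.0;
         for (int i = 0; i < (int) p_mid.size(); i++) {
            sum     += event[i]    * (p_mid[i] - 1.0) * (p_mid[i] - 1.0)
                     + nonevent[i] *  p_mid[i] * p_mid[i];
            w_total += event[i] + nonevent[i];
         }
         return (w_total > 0.0 ? sum / w_total : -9999.0);   // -9999 as bad data
      }
   };

   int main() {
      ProbTable pct({0.1, 0.5, 0.9});
      pct.inc_event(0.9, 1.0);      // confident forecast, event observed
      pct.inc_nonevent(0.1, 0.7);   // low forecast, no event, weight 0.7
      std::printf("Brier score = %.4f\n", pct.brier_score());
      return 0;
   }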
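
grid_stat_conf_info.cc adds a guard that warns and disables FHO output whenever a grid weighting scheme is requested, telling the user to set grid_weight_flag = NONE to get FHO back. A schematic version of that kind of configuration check; the enum and flag names below are placeholders, not the actual GridStatConfInfo members:

   #include <cstdio>

   enum class GridWeightType { None, CosLat, Area };
   enum class OutputFlag     { None, Stat, Both };

   // Hypothetical config-validation helper: if FHO output is requested
   // together with grid weighting, warn and turn FHO off, mirroring the
   // new check in GridStatConfInfo::process_config().
   void check_fho_vs_weights(OutputFlag &fho_flag, GridWeightType weight_flag) {
      if (fho_flag != OutputFlag::None && weight_flag != GridWeightType::None) {
         std::fprintf(stderr,
            "WARNING: disabling FHO output, which is not compatible with "
            "grid weighting; set grid_weight_flag = NONE to write FHO.\n");
         fho_flag = OutputFlag::None;
      }
   }

   int main() {
      OutputFlag fho = OutputFlag::Both;
      check_fho_vs_weights(fho, GridWeightType::CosLat);
      return (fho == OutputFlag::None ? 0 : 1);
   }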
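
aggr_mpr_wind_lines() pairs UGRD and VGRD MPR lines and then derives wind speed with convert_u_v_to_wind() and unit vectors with convert_u_v_to_unit() before applying the wind-speed thresholds. Assuming those helpers implement the usual vector relationships (an inference, not taken from this patch), the underlying math is simply:

   #include <cmath>
   #include <cstdio>

   // Wind speed from U/V components (presumed behavior of convert_u_v_to_wind()).
   double u_v_to_wind(double u, double v) { return std::hypot(u, v); }

   // Normalize U/V to a unit vector (presumed behavior of convert_u_v_to_unit());
   // the zero-wind case is handled here by returning zeros, which may differ
   // from MET's handling of undefined directions.
   void u_v_to_unit(double u, double v, double &u_unit, double &v_unit) {
      double spd = std::hypot(u, v);
      if (spd > 0.0) { u_unit = u / spd; v_unit = v / spd; }
      else           { u_unit = 0.0;     v_unit = 0.0;     }
   }

   int main() {
      double uu, vu;
      u_v_to_unit(3.0, 4.0, uu, vu);
      std::printf("speed = %.1f, unit = (%.2f, %.2f)\n", u_v_to_wind(3.0, 4.0), uu, vu);
      return 0;
   }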
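
aggr_ecnt_lines() computes the aggregated CRPS values as weighted means of the per-line statistics (NumArray::wmean() over the pair weights) and then rebuilds the empirical and Gaussian CRPSS from the aggregated CRPS and its climatological counterpart. A compact sketch of that pattern; the weighted-mean helper and the skill-score formula below follow the standard definitions and are not copied from MET:

   #include <cmath>
   #include <cstdio>
   #include <vector>

   // Weighted mean of a series of values; MET's NumArray::wmean() also
   // screens out bad-data values, which is skipped here.
   double wmean(const std::vector<double> &v, const std::vector<double> &w) {
      double num = 0.0, den = 0.0;
      for (size_t i = 0; i < v.size() && i < w.size(); i++) {
         num += w[i] * v[i];
         den += w[i];
      }
      return (den > 0.0 ? num / den : NAN);
   }

   // Skill score against a climatological reference: CRPSS = 1 - CRPS/CRPS_cl
   double crpss(double crps, double crpscl) {
      return (crpscl > 0.0 ? 1.0 - crps / crpscl : NAN);
   }

   int main() {
      std::vector<double> crps_emp   = {0.42, 0.38, 0.51};
      std::vector<double> crpscl_emp = {0.60, 0.55, 0.62};
      std::vector<double> wgt        = {1.0, 1.0, 2.0};
      double crps = wmean(crps_emp, wgt);
      double cl   = wmean(crpscl_emp, wgt);
      std::printf("aggregated CRPS = %.3f, CRPSS = %.3f\n", crps, crpss(crps, cl));
      return 0;
   }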