paper submission

gmoss13 committed Dec 2, 2023
1 parent 8541f0c commit a37f82d
Showing 10 changed files with 809 additions and 139 deletions.
661 changes: 661 additions & 0 deletions LICENSE

Large diffs are not rendered by default.

18 changes: 9 additions & 9 deletions README.md
@@ -1,10 +1,10 @@
-# Simulation-Based Inference of Surface Accumulation and Basal Melt Rates of Antarctic Ice Shelves from Isochronal Layers
+# Simulation-Based Inference of Surface Accumulation and Basal Melt Rates of an Antarctic Ice Shelf from Isochronal Layers

-This repository contains research code for Simulation-based inference of surface accumulation and basal melt rates of Antarctic Ice Shelves from Isochronal Layers [preprint link]. It contains scripts for the forward model, along with the inference workflow. For code used to preprocess ice shelf data, and generate synthetic ice shelves using [firedrake](https://www.firedrakeproject.org) and [icepack](https://icepack.github.io) - see [preprocessing_ice_data](https://github.com/mackelab/preprocessing_ice_data/).
+This repository contains research code for "Simulation-Based Inference of Surface Accumulation and Basal Melt Rates of an Antarctic Ice Shelf from Isochronal Layers". It contains scripts for the forward model, along with the inference workflow. For code used to preprocess ice shelf data, and generate synthetic ice shelves using [firedrake](https://www.firedrakeproject.org) and [icepack](https://icepack.github.io) - see [preprocessing_ice_data](https://github.com/mackelab/preprocessing_ice_data/).
Maintenance is ongoing! Expect improvements in terms of extended functionality and improved usability in the future.

Some small data and output files are already included in this repository for quicker start-up. Some parts of the workflow require bigger files. These will clearly be marked in the **workflow** section, along with links to where these files can be downloaded.
-The picked IRH data for Ekström Ice Shelf can be found [here](https://doi.org/doi:10.5281/zenodo.10231043).
+The picked IRH data for Ekström Ice Shelf can be found [here](https://nc-geophysik.guz.uni-tuebingen.de/index.php/s/wH5zqPaBGZAPRyD).

## Installation
Activate a virtual environment, e.g. using conda. Install dependencies:
@@ -22,7 +22,7 @@ The full workflow (from running the simulator to evaluating posterior predictions)
2. Allocate some simulations as calibration simulations by moving them under `out/$shelf$/exp/calib_sims`
- `sbi_ice/runs/calibrate_sims.py` to find anomalies and layer bounds on `calib_sims` (make sure the correct folder is used in `calibrate_sims.py`). If the noise model needs fitting, also set `noise_dist_overwrite = True`.
- `sbi_ice/runs/calibrate_sims.py` to find anomalies in `layer_sims` (check correct folder is used in `calibrate_sims.py` and that `layer_bounds_overwrite`, `noise_dist_overwrite` are set to `False`). This can be done in parallel across many jobs using `sbi_ice/runs/submit_calibrate_sims.sh`
-3. Add noise to all layers and select the best fitting layer for each simulation in each job, using `sbi_ice/runs/select_layers.py` (can be done in parallel using `sbi_ice/runs/submit_select_layers_local.sh`). If simulations are done across many jobs, concatenate all outputs of step 3 using `notebooks/workflow/combine_pickles.ipynb`
+3. Add noise to all layers and select the best fitting layer for each simulation in each job, using `sbi_ice/runs/select_layers.py` (can be done in parallel using `sbi_ice/runs/select_all_layers_local.sh` or `sbi_ice/runs/select_all_layers_slurm.sh`). If simulations are done across many jobs, concatenate all outputs of step 3 using `notebooks/workflow/combine_pickles.ipynb`
4. Train a Neural Posterior Estimation (NPE) density estimator using `sbi_ice/runs/train_NPE.py`. This can be parallelized in hydra by running `python train_NPE.py -m`.
5. Perform posterior predictive simulations using `sbi_ice/runs/post_predictive.py`. This can be parallelized in hydra by running `python post_predictive.py -m`.
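Conceptually, steps 1-5 amount to simulation-based inference: draw parameters from a prior, simulate layers, and infer a posterior over parameters given an observed layer. Below is a minimal, self-contained illustration of that loop using a toy one-parameter simulator and a rejection-sampling stand-in for NPE. None of this is the repository's code; all names and the forward model are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for the forward model: a scalar parameter theta
# (e.g. a melt rate) maps to a noisy "layer" profile x.
def simulate(theta, n_points=20):
    grid = np.linspace(0.0, 1.0, n_points)
    return theta * grid + rng.normal(0.0, 0.05, n_points)

# "Observed" layer generated with a known ground-truth parameter.
theta_true = 1.5
x_obs = simulate(theta_true)

# Conceptual stand-in for posterior inference: keep the prior draws whose
# simulated layer is closest to the observation (rejection sampling).
# The repo instead trains a neural density estimator (NPE) on (theta, x) pairs.
prior_draws = rng.uniform(0.0, 3.0, 5000)
sims = np.stack([simulate(t) for t in prior_draws])
mse = np.mean((sims - x_obs) ** 2, axis=1)
posterior_samples = prior_draws[mse < np.quantile(mse, 0.02)]
```

The accepted samples concentrate around `theta_true`; NPE replaces the hard accept/reject rule with a learned conditional density, which is what makes the approach scale to the high-dimensional accumulation and melt-rate fields used here.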

@@ -47,16 +47,16 @@ First, we look at the calibration simulations. We find (and ignore) any anomalous
exp
gt_version
anomalies_overwrite = True
-layer_bounds_overwrite = True
-noise_dist_overwrite = True/False
-sim_or_calib = "layer_sims"/"calib_sims"
+layer_bounds_overwrite = True for "calib_sims" / False for "layer_sims"
+noise_dist_overwrite = True for "calib_sims" / False for "layer_sims"
+sim_or_calib = "calib_sims"/"layer_sims"
```

We then find any anomalous results in `layer_sims`. Keep everything the same, and make sure `layer_bounds_overwrite = False` and `noise_dist_overwrite = False`. This can be sped up given access to multiple CPU cores by running `sbatch submit_calibrate_sims.sh` on SLURM or `source submit_calibrate_sims.sh` locally.
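The two `calibrate_sims.py` passes differ only in their flag settings; a minimal sketch summarizing them (the dictionary layout is illustrative only, not the repository's config format):

```python
# Overwrite flags for the two calibrate_sims.py passes, as described above.
# This dictionary is an illustration, not the repository's actual config.
PASSES = {
    "calib_sims": {"layer_bounds_overwrite": True, "noise_dist_overwrite": True},
    "layer_sims": {"layer_bounds_overwrite": False, "noise_dist_overwrite": False},
}

def flags_for(sim_or_calib):
    # The calibration pass fits layer bounds and the noise model;
    # the layer_sims pass reuses the fitted values.
    return PASSES[sim_or_calib]
```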

### 3. Noise model and best fitting layer selection

-Our inference workflow requires only one layer out of each simulation to be the observed value, and so we select the best fitting layer (by MSE). This is done for all simulations in a single job using the script `sbi_ice/runs/select_layers.py`. This can be parallelized across all the simulation jobs using `sbi_ice/runs/submit_select_layers_local.sh` or `sbi_ice/runs/submit_select_layers_slurm.sh`.
+Our inference workflow requires only one layer out of each simulation to be the observed value, and so we select the best fitting layer (by MSE). This is done for all simulations in a single job using the script `sbi_ice/runs/select_layers.py`. This can be parallelized across all the simulation jobs using `sbi_ice/runs/select_all_layers_local.sh` or `sbi_ice/runs/select_all_layers_slurm.sh`.
The rest of the workflow takes in one processed layer file for all simulations - the per-job files can be combined and saved into one file using `notebooks/workflow/combine_pickles.ipynb`
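Selecting the best fitting layer by MSE reduces each simulation's stack of candidate layers to a single one. A minimal numpy sketch of that selection step (array shapes and variable names are assumed for illustration, not the repository's data format):

```python
import numpy as np

rng = np.random.default_rng(1)

n_layers, n_points = 25, 50
# One observed IRH elevation profile along the flowline (toy data).
observed = np.sin(np.linspace(0.0, np.pi, n_points))
# A stack of simulated candidate layers (toy data); in the real workflow
# noise is added to the simulated layers before comparison.
candidates = rng.normal(0.0, 1.0, (n_layers, n_points))
candidates[7] = observed + rng.normal(0.0, 0.01, n_points)  # a near-match

# Best fitting layer = smallest mean squared error against the observation.
mse = np.mean((candidates - observed) ** 2, axis=1)
best = int(np.argmin(mse))
```

Per-job results of this selection are what `combine_pickles.ipynb` then concatenates into a single file.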

To reproduce the results reported in the paper, you can [download the posteriors and processed simulation data](https://doi.org/doi:10.5281/zenodo.10245153), and save the respective files in `out`.
@@ -70,4 +70,4 @@ The trained posteriors for our experiments can be found in [link to data repository]

### 5. Evaluating Posterior Predictive Distributions

Given a trained posterior density estimator, we can sample from this posterior and simulate using these samples with the script `sbi_ice/runs/post_predictive.py`. Configurations are found under `configs/training`. The script is run with `hydra` using `python post_predictive.py -m` to submit several jobs for different layer numbers and/or different random seeds.
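What posterior predictive evaluation does, in essence, is push posterior samples back through the simulator and check the observation against the resulting predictive band. A toy numpy sketch of that idea (a stand-in with an invented forward model, not the repository's script):

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy forward model standing in for the ice-shelf layer simulator.
def simulate(theta, n_points=20):
    grid = np.linspace(0.0, 1.0, n_points)
    return theta * grid + rng.normal(0.0, 0.05, n_points)

x_obs = simulate(1.5)

# Pretend these were drawn from a trained posterior concentrated near 1.5.
posterior_samples = rng.normal(1.5, 0.1, 200)

# Posterior predictive: simulate with each posterior draw...
predictive = np.stack([simulate(t) for t in posterior_samples])

# ...and check that the observation lies inside the central 95% band.
lo, hi = np.percentile(predictive, [2.5, 97.5], axis=0)
coverage = float(np.mean((x_obs >= lo) & (x_obs <= hi)))
```

If the posterior is well calibrated, most of the observed profile should fall inside the predictive band; systematic misses indicate model or noise-model misfit.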
27 changes: 13 additions & 14 deletions notebooks/paper_figures/Lagrangian_MB.ipynb
@@ -36,10 +36,9 @@
"metadata": {},
"outputs": [],
"source": [
-"print(Path(data_dir,\"Ekstrom\",\"IRH_data\",\"LayerData_flowline_all.mat\"))\n",
-"mat = scipy.io.loadmat(Path(data_dir,\"Ekstrom\",\"IRH_data\",\"LayerData_flowline_all.mat\"))\n",
-"shelf_mask = scipy.io.loadmat(Path(data_dir,\"Ekstrom\",\"IRH_data\",\"LayerData_flowline_mask.mat\"))['save_in'].astype(bool).flatten()\n",
-"shelf_mask = mat[\"save_in\"][0].astype(bool)\n"
+"print(Path(data_dir,\"Ekstrom\",\"IRH_data\",\"Ekstroem_flowline_GPR_IRH_data.mat\"))\n",
+"mat = scipy.io.loadmat(Path(data_dir,\"Ekstrom\",\"IRH_data\",\"Ekstroem_flowline_GPR_IRH_data.mat\"))\n",
+"shelf_mask = scipy.io.loadmat(Path(data_dir,\"Ekstrom\",\"IRH_data\",\"LayerData_flowline_mask.mat\"))['save_in'].astype(bool).flatten()\n"
]
},
{
@@ -55,19 +55,19 @@
"metadata": {},
"outputs": [],
"source": [
-"distance = mat[\"Distance\"].flatten()\n",
+"distance = mat[\"distance\"].flatten()\n",
"gl_pos = distance[shelf_mask][0]\n",
"\n",
"xmin,xmax = 100,125\n",
"xmask = (distance>xmin) & (distance<xmax)\n",
"\n",
"fig,ax = plt.subplots(1,1,figsize=(5,5))\n",
-"for i in range(0,mat[\"layers_elevation\"].shape[0]):\n",
-"    ax.plot(distance[xmask],mat[\"layers_elevation\"][i,xmask],color = color_opts[\"colors\"][\"observation\"])\n",
-"base = mat[\"SurfaceElevation\"].flatten() - mat[\"BedmachineThickness\"].flatten()\n",
+"for i in range(0,mat[\"elev_layers\"].shape[0]):\n",
+"    ax.plot(distance[xmask],mat[\"elev_layers\"][i,xmask],color = color_opts[\"colors\"][\"observation\"])\n",
+"base = mat[\"elev_sur_bedmachine\"].flatten() - mat[\"thickness_bedmachine\"].flatten()\n",
"\n",
"#Also plot surface elevation\n",
-"ax.plot(distance[xmask],mat[\"SurfaceElevation\"].flatten()[xmask],color=\"k\",linewidth=2)"
+"ax.plot(distance[xmask],mat[\"elev_sur_bedmachine\"].flatten()[xmask],color=\"k\",linewidth=2)"
]
},
{
@@ -182,15 +181,15 @@
"ax3 = axs[\"c\"]\n",
"geom_mask = geom.x<25e3\n",
"\n",
-"ax1.plot(distance[xmask],mat[\"SurfaceElevation\"].flatten()[xmask],color=\"k\",linewidth=2)\n",
-"for i in range(0,mat[\"layers_elevation\"].shape[0]):\n",
-"    ax1.plot(distance[xmask],mat[\"layers_elevation\"][i,xmask],color = color_opts[\"colors\"][\"observation\"])\n",
-"ax1.fill_between(distance[xmask],-100.0*np.ones_like(distance[xmask]),mat[\"SurfaceElevation\"].flatten()[xmask],color=\"k\",alpha=0.075,linewidth=0.0)\n",
+"ax1.plot(distance[xmask],mat[\"elev_sur_bedmachine\"].flatten()[xmask],color=\"k\",linewidth=2)\n",
+"for i in range(0,mat[\"elev_layers\"].shape[0]):\n",
+"    ax1.plot(distance[xmask],mat[\"elev_layers\"][i,xmask],color = color_opts[\"colors\"][\"observation\"])\n",
+"ax1.fill_between(distance[xmask],-100.0*np.ones_like(distance[xmask]),mat[\"elev_sur_bedmachine\"].flatten()[xmask],color=\"k\",alpha=0.075,linewidth=0.0)\n",
"ax1.spines[\"bottom\"].set_bounds(distance[xmask][0]-0.001,distance[xmask][-1])\n",
"ax1.set_xlabel(\"Distance from GL [km]\")\n",
"ax1.set_ylabel(\"Elevation [m.a.s.l.]\")\n",
"ax1.annotate(r\"$v_{x}$\",xy=(115,41),xytext=(110,40),arrowprops=dict(arrowstyle=\"->\"))\n",
-"ax1.vlines(116.9,-90,(mat[\"SurfaceElevation\"].flatten()[xmask])[np.argmin(np.abs(distance[xmask]-116.9))],linestyle=\"--\",color=\"k\")\n",
+"ax1.vlines(116.9,-90,(mat[\"elev_sur_bedmachine\"].flatten()[xmask])[np.argmin(np.abs(distance[xmask]-116.9))],linestyle=\"--\",color=\"k\")\n",
"ax1.set_ylim(-90,45)\n",
"\n",
"\n",
