Suggestion to update fetchOrbit.py and topsStack README.md #721

Open · wants to merge 2 commits into `main`
**contrib/stack/topsStack/README.md** (97 additions & 45 deletions)
The detailed algorithm for stack processing of TOPS data can be found here:

-----------------------------------

To use the Sentinel-1 stack processor, make sure to add the path of your `contrib/stack/topsStack` folder to your `$PATH` environment variable.

The scripts provide support for Sentinel-1 TOPS stack processing. Currently supported workflows include a coregistered stack of SLCs, interferograms, offsets, and coherence.

`stackSentinel.py` generates all configuration and run files required to be executed on a stack of Sentinel-1 TOPS data. When `stackSentinel.py` is executed for a given workflow (`-W` option), a **configs** and a **run_files** folder are generated. No processing is performed at this stage. The run_files folder contains different run\_#\_description files, which are to be executed as shell scripts in run-number order. Each of these run scripts calls specific configure files contained in the **configs** folder, which call ISCE in a modular fashion. The configure and run files will change depending on the selected workflow. To make run_# files executable, change the file permission accordingly (e.g., `chmod +x run_01_unpack_topo_reference`).
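
Ordering the run files and setting their permissions can also be scripted. The sketch below is illustrative only (the helpers `ordered_run_files` and `make_executable` are hypothetical, not part of ISCE); it sorts run files by their numeric index and applies the equivalent of `chmod +x`:

```python
import os
import re
import stat

def ordered_run_files(run_dir):
    """Return run_* file names sorted by their numeric index (run_01_..., run_10_...)."""
    pattern = re.compile(r'^run_(\d+)_')
    indexed = []
    for name in os.listdir(run_dir):
        m = pattern.match(name)
        if m:
            indexed.append((int(m.group(1)), name))
    return [name for _, name in sorted(indexed)]

def make_executable(path):
    """Equivalent of `chmod +x`: add the execute bit for user, group and other."""
    mode = os.stat(path).st_mode
    os.chmod(path, mode | stat.S_IXUSR | stat.S_IXGRP | stat.S_IXOTH)
```

Parsing the numeric index is more robust than plain lexicographic sorting if the indices are ever not zero-padded.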

```cfg
stackSentinel.py -h   # To get an overview of all the configurable parameters
```
Required parameters of stackSentinel.py include:

```cfg
-s SLC_DIRNAME #A folder with downloaded Sentinel-1 SLCs
-o ORBIT_DIRNAME #A folder containing the Sentinel-1 orbits. Missing orbit files will be downloaded automatically
-a AUX_DIRNAME #A folder containing the Sentinel-1 Auxiliary files
-d DEM_FILENAME #A DEM (Digital Elevation Model) referenced to wgs84
```

In all workflows, coregistration (-C option) can be done using only geometry (se

#### AUX_CAL file download ####

The following calibration auxiliary (AUX_CAL) file is used for **antenna pattern correction** to compensate for the range phase offset of SAFE products with **IPF version 002.36** (mainly for images acquired before March 2015). If all your SAFE products are from another IPF version, then no AUX files are needed. Check the [ESA document](https://earth.esa.int/documents/247904/1653440/Sentinel-1-IPF_EAP_Phase_correction) for details.

The AUX_CAL file is available on the [Sentinel-1 Mission Performance Center](https://sar-mpc.eu/ipf-adf/aux_cal/?sentinel1__mission=S1A&validity_start=2014&validity_start=2014-09&adf__active=True). We recommend downloading it using a web browser or the `wget` command below, and storing it somewhere (_i.e._ `~/aux/aux_cal`) so that you can use it all the time, for `stackSentinel.py -a` or the `auxiliary data directory` in `topsApp.py`.

```
stackSentinel.py -s ../SLC/ -d ../DEM/demLat_N18_N20_Lon_W100_W097.dem.wgs84 -a
```

By running the command above, the configs and run_files folders are created. The user needs to execute each run file in order; the order is specified by the index number of the run file name. For the example above, the run_files folder includes the following files:

- run_01_unpack_topo_reference
- run_02_unpack_secondary_slc
- run_03_average_baseline
- run_04_extract_burst_overlaps
- run_05_overlap_geo2rdr
- run_06_overlap_resample
- run_07_pairs_misreg
- run_08_timeseries_misreg
- run_09_fullBurst_geo2rdr
- run_10_fullBurst_resample
- run_11_extract_stack_valid_region
- run_12_merge_reference_secondary_slc
- run_13_grid_baseline
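
Since each run file must finish before the next one starts, a driver loop like the hypothetical `run_stack` below can execute them in sequence. This is only a sketch, not a tool shipped with the stack processor:

```python
import subprocess

def run_stack(run_files):
    """Execute each run file in order, stopping at the first failure.

    The run_* files are shell scripts; within one file the commands are
    independent and could be distributed across cores, but the files
    themselves must be run strictly in sequence.
    """
    for script in run_files:
        print(f'Running {script} ...')
        result = subprocess.run(['bash', script], capture_output=True, text=True)
        if result.returncode != 0:
            raise RuntimeError(f'{script} failed:\n{result.stderr}')
```

Stopping at the first failure matters because every run file assumes the outputs of the previous ones.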

The generated run files are self-descriptive. Below is a short explanation of what each run file does:

**run_01_unpack_topo_reference:**

Includes a command that refers to the config file of the stack reference, which includes the configuration for running `topo` for the stack reference. Note that in the pair-wise processing strategy, one should run `topo` (mapping from range-Doppler to geo coordinates) for all pairs. However, with `stackSentinel.py`, `topo` needs to be run only once, for the reference of the stack. This stage also unpacks the Sentinel-1 TOPS reference SLC. Reference geometry files are saved under `geom_reference/`; reference burst SLCs are saved under `reference/`.

**run_02_unpack_secondary_slc:**

Unpacks secondary Sentinel-1 TOPS SLCs using ISCE readers. For older SLCs, which need the antenna elevation pattern correction, the file is extracted and written to disk. For newer versions of SLCs, which don't need the elevation antenna pattern correction, only a GDAL virtual ".vrt" file (and an ISCE xml file) is generated. The ".vrt" file points to the Sentinel SLC file and reads it whenever required during the processing. If a user wants to write the ".vrt" SLC file to disk, it can be done easily using `gdal_translate` (e.g. `gdal_translate -of ENVI File.vrt File.slc`). Secondary burst SLCs are saved under `secondarys/`.

**run_03_average_baseline:**

Computes average baselines for the stack, saved under `baselines/`. These baselines are not used for processing anywhere; they are only an approximation and can be used for plotting purposes. A more precise baseline grid is estimated later in `run_13_grid_baseline`, only for the `-W slc` workflow.

**run_04_extract_burst_overlaps:**

Burst overlaps are extracted for estimating azimuth misregistration using the [NESD technique](https://ieeexplore.ieee.org/document/7637021). If the coregistration method is chosen to be "geometry", then this run file won't exist and the overlaps are not extracted. Saved under `reference/overlap/` and `geom_reference/overlap/`.

**run_05_overlap_geo2rdr:**

Runs geo2rdr to estimate geometrical offsets between the secondary burst overlaps (`secondarys/`) and the stack reference (`reference/`) burst overlaps. Saved under `coreg_secondarys/YYYYMMDD/overlap/`.

**run_06_overlap_resample:**

The secondary burst overlaps are then resampled to the stack reference burst overlaps. Saved under `coreg_secondarys/YYYYMMDD/overlap/`.

**run_07_pairs_misreg:**

Using the coregistered stack burst overlaps generated in the previous step, differential overlap interferograms are generated and used for estimating azimuth misregistration with the Enhanced Spectral Diversity (ESD) technique. Saved under `misreg/azimuth/pairs/` and `misreg/range/pairs/`.
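
For intuition, ESD relates the double-difference phase at a burst overlap to an azimuth time shift through the Doppler centroid frequency separation of the two overlapping views. The sketch below shows only this standard relation; the actual weighting, coherence masking and frequency computation in the stack processor are more involved, and all names here are hypothetical:

```python
import numpy as np

def esd_azimuth_shift(dd_phase, delta_fdc, az_time_interval):
    """Convert an ESD double-difference phase (rad) to an azimuth shift in pixels.

    dd_phase:         averaged double-difference phase over the burst overlaps
    delta_fdc:        Doppler centroid frequency difference (Hz) between the
                      forward- and backward-looking views of the overlap
    az_time_interval: azimuth line time spacing (s)
    """
    dt = dd_phase / (2.0 * np.pi * delta_fdc)  # azimuth time misregistration (s)
    return dt / az_time_interval               # misregistration in azimuth pixels
```

Because `delta_fdc` is large (kHz level) for TOPS burst overlaps, very small azimuth shifts produce measurable phase, which is what makes ESD so sensitive.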

**run_08_timeseries_misreg:**

A time-series of azimuth and range misregistration is estimated with respect to the stack reference. The time-series is a least-squares estimation from the pair misregistrations of the previous step. Saved under `misreg/azimuth/dates/` and `misreg/range/dates/`.
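
A minimal sketch of such a least-squares inversion is shown below (hypothetical helper, assuming the reference date is fixed to zero misregistration; the actual implementation is not shown in this diff):

```python
import numpy as np

def invert_pairs(dates, pairs, pair_misreg):
    """Least-squares inversion of pair misregistrations into a per-date series.

    dates:       sorted acquisition dates, dates[0] being the stack reference
    pairs:       list of (date1, date2) tuples
    pair_misreg: measured misregistration for each pair (misreg(d2) - misreg(d1))
    """
    idx = {d: i for i, d in enumerate(dates)}
    A = np.zeros((len(pairs), len(dates) - 1))  # reference column dropped (fixed to 0)
    for k, (d1, d2) in enumerate(pairs):
        if idx[d1] > 0:
            A[k, idx[d1] - 1] = -1.0
        if idx[d2] > 0:
            A[k, idx[d2] - 1] = 1.0
    ts, *_ = np.linalg.lstsq(A, np.asarray(pair_misreg, dtype=float), rcond=None)
    return np.concatenate(([0.0], ts))  # prepend the reference date (zero)
```

With more pairs than dates the system is overdetermined, and the least-squares fit averages out noise in the individual pair estimates.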

**run_09_fullBurst_geo2rdr:**

Using the orbits and DEM, the geometrical offsets between all secondary SLCs and the stack reference are computed. Saved under `coreg_secondarys/`.

**run_10_fullBurst_resample:**

The geometrical offsets, together with the misregistration time-series (from the previous step), are used for precise coregistration of each burst SLC by resampling to the stack reference burst SLCs. Saved under `coreg_secondarys/`.

**run_11_extract_stack_valid_region:**

The valid region between burst SLCs at the overlap area of the bursts slightly changes for different acquisitions. Therefore, we need to keep track of these overlaps, which will be used during the merging of bursts. Without this knowledge, lines of invalid data may appear in the merged products at the burst overlaps.

**run_12_merge_reference_secondary_slc:**

Merges all bursts for the reference and coregistered SLCs and applies multilooking to form full-scene SLCs (saved under `merged/SLC/` if --virtual_merge is True). The geometry files, including longitude, latitude, shadow and layover mask, and line-of-sight files, are also merged, under `merged/geom_reference/`.
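
Multilooking amounts to averaging non-overlapping windows of azimuth-by-range looks. A minimal numpy sketch (not the ISCE implementation, which streams the data rather than holding it in memory):

```python
import numpy as np

def multilook(arr, az_looks, rg_looks):
    """Average non-overlapping az_looks x rg_looks windows; edge remainders are trimmed."""
    rows = (arr.shape[0] // az_looks) * az_looks
    cols = (arr.shape[1] // rg_looks) * rg_looks
    blocks = arr[:rows, :cols].reshape(rows // az_looks, az_looks,
                                       cols // rg_looks, rg_looks)
    return blocks.mean(axis=(1, 3))
```

The same block-averaging works for complex SLC data, where averaging before taking amplitude or phase is what reduces speckle.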

**run_13_grid_baseline:**

A coarse grid of baselines between each secondary SLC and the stack reference is generated. This is not used in any computation. Saved under `merged/baselines/`.

#### 4.2 Example workflow: Coregistered stack of SLC with modified parameters ####

In this example, a stack of interferograms is requested for which up to 2 nearest-neighbor connections are included.

```
stackSentinel.py -s ../SLC/ -d ../../MexicoCity/demLat_N18_N20_Lon_W100_W097.dem.wgs84 -b '19 20 -99.5 -98.5' -a ../../AuxDir/ -o ../../Orbits -c 2
```

In the following example, all possible interferograms are generated, and the coregistration approach is set to use geometry instead of the default NESD.

```
stackSentinel.py -s ../SLC/ -d ../../MexicoCity/demLat_N18_N20_Lon_W100_W097.dem.wgs84 -b '19 20 -99.5 -98.5' -a ../../AuxDir/ -o ../../Orbits -C geometry -c all
```

When executing all the run files, a coregistered stack of SLCs is produced, and the burst interferograms are generated and then merged. Merged interferograms are multilooked, filtered and unwrapped. Geocoding is not applied. If users need to geocode any product, they can use the `geocodeGdal.py` script.

Compared to the "Coregistered stack of SLC" workflow (`-W slc`), **run_13_grid_baseline** does not exist in the `-W interferogram` workflow.

Instead, additional run files are created in the `-W interferogram` workflow:

- run_13_generate_burst_igram
- run_14_merge_burst_igram
- run_15_filter_coherence
- run_16_unwrap

Below is a short explanation of what each of these run files does:

**run_13_generate_burst_igram:**

Takes the stack of coregistered burst SLCs (`reference/` and `coreg_secondarys/`) and generates burst interferograms. These burst-level interferograms are saved under `interferograms/`.

**run_14_merge_burst_igram:**

Merges the burst interferograms and applies multilooking to form a full-scene interferogram for each pair. Saved under `merged/interferograms/fine.int`.

**run_15_filter_coherence:**

Uses the full-scene SLCs in `merged/SLC/` to generate the complex coherence, and applies filtering to the full-scene interferograms and the coherence files. These files are saved as `fine.cor`, `filt_fine.int` and `filt_fine.cor` under `merged/interferograms/`.
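
The complex coherence is the normalized cross-correlation of the two coregistered SLCs estimated over the multilook window; its magnitude is what ends up in the coherence file. A schematic numpy version (hypothetical helper, not the ISCE implementation):

```python
import numpy as np

def complex_coherence(slc1, slc2, az_looks=3, rg_looks=9):
    """Sample complex coherence gamma = <s1 s2*> / sqrt(<|s1|^2><|s2|^2>).

    The expectations <.> are estimated by averaging az_looks x rg_looks windows;
    np.abs of the result is the coherence, np.angle the multilooked phase.
    """
    def look(a):
        rows = (a.shape[0] // az_looks) * az_looks
        cols = (a.shape[1] // rg_looks) * rg_looks
        return (a[:rows, :cols]
                .reshape(rows // az_looks, az_looks, cols // rg_looks, rg_looks)
                .mean(axis=(1, 3)))
    num = look(slc1 * np.conj(slc2))
    den = np.sqrt(look(np.abs(slc1) ** 2) * look(np.abs(slc2) ** 2))
    return num / np.maximum(den, 1e-30)  # guard against zero-amplitude windows
```

By construction the magnitude lies in [0, 1]: identical inputs give coherence 1, while decorrelated inputs drive it toward 0.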

**run_16_unwrap:**

Applies unwrapping to the multilooked and filtered interferograms (`merged/interferograms/filt_fine.int`), generating the unwrapped files `merged/interferograms/filt_fine.unw`.


#### 4.4 Example workflow: Stack of correlation ####

Generate the run and configure files needed to generate a stack of coherence.
Expand Down Expand Up @@ -214,6 +258,10 @@ If ionospheric phase estimation is enabled in stackSentinel.py, it will generate
- run_ns+6_computeIon
- run_ns+7_filtIon
- run_ns+8_invertIon
- run_ns+9_filtIonShift
- run_ns+10_invertIonShift
- run_ns+11_burstRampIon
- run_ns+12_mergeBurstRampIon

Note about **'areas masked out in ionospheric phase estimation'** in ion_param.txt: separated islands or areas usually lead to phase unwrapping errors and therefore significantly affect ionospheric phase estimation, so it is better to mask them out. Check ion/date1_date2/ion_cal/raw_no_projection.ion for areas to be masked out. However, we don't have this file before processing the data. To quickly get this file, we can process a stack of just two acquisitions. NOTE that the reference of this two-acquisition stack should be the same as that of the full stack we want to process.

Run the commands sequentially.

Results from ionospheric phase estimation.

- `reference` and `coreg_secondarys`: now also contain subband burst SLCs
- `ion`: original ionospheric phase estimation results
  - `date1_date2/ion_cal/azshift.ion`: azimuth ionospheric shift
  - `date1_date2/ion_cal/filt.ion`: filtered ionospheric phase
  - `date1_date2/ion_cal/raw_no_projection.ion`: original ionospheric phase
  - `date1_date2/lower/merged/fine_look.unw`: unwrapped lower band interferogram
  - `date1_date2/upper/merged/fine_look.unw`: unwrapped upper band interferogram
- `ion_azshift_dates`: azimuth ionospheric shift for each acquisition
- `ion_burst_ramp_dates`: azimuth burst ramps caused by the ionosphere for each acquisition
- `ion_burst_ramp_merged_dates`: merged azimuth burst ramps caused by the ionosphere for each acquisition
- `ion_dates`: ionospheric phase for each acquisition

If ionospheric phase estimation is processed swath by swath because of different swath starting ranges, there will be swath processing directories, including

- ion/date1_date2/ion_cal_IW*
- ion/date1_date2/lower/merged_IW*
- ion/date1_date2/upper/merged_IW*
- `ion/date1_date2/ion_cal_IW*`
- `ion/date1_date2/lower/merged_IW*`
- `ion/date1_date2/upper/merged_IW*`

After processing, we can plot the ionospheric phase estimation results using `plotIonPairs.py` and `plotIonDates.py`. For example:

**contrib/stack/topsStack/fetchOrbit.py** (28 additions & 4 deletions)
The script's changes, hunk by hunk:

New imports (`sys` and `glob` are added):

```python
import requests
import re
import os
import sys
import glob
import argparse
import datetime
from html.parser import HTMLParser
```

In `cmdLineParse()`, the `-i/--input` option is no longer required, and a new `-d/--indir` option accepts a directory of SAFE packages:

```python
    parser = argparse.ArgumentParser(description='Fetch orbits corresponding to given SAFE package')
    parser.add_argument('-i', '--input', dest='input', type=str, default=None,
                        help='Path to SAFE package of interest')
    parser.add_argument('-d', '--indir', dest='indir', type=str, default=None,
                        help='Directory to SAFE package(s) of interest')
    parser.add_argument('-o', '--output', dest='outdir', type=str, default='.',
                        help='Path to output directory')
```

The body of the old `if __name__ == '__main__':` block becomes `run_main()`, which can be pointed at a single input file:

```python
def run_main(inps, input_file=None):
    '''
    Run the fetch for a single SAFE package.
    '''
    if input_file:
        inps.input = input_file

    print('Fetching for: ', inps.input)
    fileTS, satName, fileTSStart = FileToTimeStamp(inps.input)
    print('Reference time: ', fileTS)
    print('Satellite name: ', satName)
```

Inside the download loop, a status message is now printed when an orbit match is found:

```python
                if match is not None:
                    success = True
                    print('fetch success, orbit type: ', oType)
            except:
                pass
```

`run_main()` ends with the existing failure reporting, unchanged:

```python
            print('Failed to download URL: ', match)
    else:
        print('Failed to find {1} orbits for tref {0}'.format(fileTS, satName))
```

Finally, the new entry point validates the options and loops over all `*.zip` files when a directory is given:

```python
if __name__ == '__main__':
    '''
    Main driver.
    '''

    inps = cmdLineParse()
    if (inps.input is None) and (inps.indir is None):
        sys.exit('Both the input file and the input folder are missing!')

    if inps.indir:
        input_files = glob.glob(os.path.join(inps.indir, '*.zip'))
        for infile in input_files:
            run_main(inps, input_file=infile)
    else:
        run_main(inps)
```
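
For reference, matching a SAFE package to an orbit file hinges on extracting the mission and acquisition start time from the SAFE name, which is what `FileToTimeStamp` (elided from the diff above) provides. A simplified stand-in might look like this (hypothetical helper, not the actual function):

```python
import os
import re
import datetime

def safe_to_timestamp(fname):
    """Extract mission and acquisition start time from a Sentinel-1 SAFE name.

    SAFE names embed timestamps as YYYYMMDDTHHMMSS, e.g.
    S1A_IW_SLC__1SDV_20200103T120000_20200103T120027_...zip; the first
    timestamp is the acquisition start, which orbit validity must cover.
    """
    base = os.path.basename(fname)
    mission = base[:3]  # 'S1A' or 'S1B'
    m = re.search(r'(\d{8}T\d{6})', base)
    start = datetime.datetime.strptime(m.group(1), '%Y%m%dT%H%M%S')
    return mission, start
```

An orbit file is then a match if its validity interval contains this start time and its mission prefix agrees.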