test s3 ingest through ftp-ingest directory #91
I have added a more user-friendly interface to the metadata db (more detail @ USF-IMARS/imars-etl#34). This might make step (4) above a bit easier.
We're getting closer to finishing step 2. I'm still having
I would try this myself, but I don't have permission to change data.py, line 253, to
Oh hey, I think that is the longer filename pattern (called …). Also, I have fixed the permissions on imars-etl so you should be able to edit now.
Oh also note that you can try parts of that command out on userproc:
oh yeah and imars-etl has a
As discussed earlier today, the next step to automating s3 processing is to make sure `.SEN3` files load properly via the ftp ingest DAG. Specifically, in that file the `BashOperator` `ingest_s3a_ol_1_efr` loads all files in `/srv/imars-objects/ftp-ingest/fl_sen3/` that match `S3A_OL_1_EFR___*.SEN3` using the `imars_etl.load` command. `find` and `xargs` are used to list all files & split them up, and `mv` is used to move each file to trash after it is loaded.
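That list-load-trash pattern can be sketched roughly like this. This is an illustrative stand-in, not the exact `BashOperator` command: the temp directories substitute for `/srv/imars-objects/ftp-ingest/fl_sen3/` and the trash directory, and `echo` substitutes for the real `imars-etl load` call.

```shell
# Stand-in directories for the ftp-ingest source and the trash destination.
SRC=$(mktemp -d)
TRASH=$(mktemp -d)
touch "$SRC/S3A_OL_1_EFR____20180622T162525.SEN3"
touch "$SRC/unrelated_file.txt"   # does NOT match the glob; should be left alone

# find lists the matching files; xargs splits them into one command per file.
# The echo stands in for the real load command; mv trashes the file afterward.
find "$SRC" -name 'S3A_OL_1_EFR___*.SEN3' -print0 \
  | xargs -0 -I{} sh -c 'echo "imars-etl load $2" && mv "$2" "$1"' _ "$TRASH" {}

ls "$TRASH"   # the matching .SEN3 granule has been moved to trash
```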
Here are the steps to test this:

1. Put a matching `.SEN3` file in `/srv/imars-objects/ftp-ingest/fl_sen3`. This file should cover some of Florida (30n, 24s, -80e, -84w).
2. Run the `ftp_ingest_na` DAG on imars-airflow-test.
3. Check the metadata db with `mysql --host imars-sql-hydra --port 3306 --database imars_product_metadata -u imars -p` and run `SELECT * FROM file WHERE product_id=36 AND date_time='$FILE_DT';`, where `$FILE_DT` is as described below.
4. Extract the file with `imars_etl.extract`: `imars-etl extract -v "product_id=36 AND date_time='$FILE_DT'"`, where `$FILE_DT` is as described below.
5. Compare the original and extracted files with `diff`, `sha1sum`, or your preferred method.
`$FILE_DT` should be the datetime of the granule you are testing, in a format like `'2018-06-22 16:25:25'` for `/srv/imars-objects/gom/s3a_ol_1_efr/S3A_OL_1_EFR____20180622T162525.SEN3`.
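The conversion from the filename's timestamp to that datetime format is mechanical; this bash snippet is just an illustration of the mapping, not a tool the repo provides:

```shell
# Convert the filename timestamp (20180622T162525) into the
# metadata-db datetime format ('2018-06-22 16:25:25').
FNAME="S3A_OL_1_EFR____20180622T162525.SEN3"
TS="${FNAME#S3A_OL_1_EFR____}"   # strip prefix -> 20180622T162525.SEN3
TS="${TS%.SEN3}"                 # strip suffix -> 20180622T162525
FILE_DT="${TS:0:4}-${TS:4:2}-${TS:6:2} ${TS:9:2}:${TS:11:2}:${TS:13:2}"
echo "$FILE_DT"   # -> 2018-06-22 16:25:25
```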
Once this is verified working we can move on to ensuring the s3 processing is correct.
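For step (5), a checksum comparison might look like the following; the two files here are stand-ins created on the spot rather than a real granule and its extracted copy:

```shell
# Compare an "original" file against its "extracted" copy by sha1 checksum.
# mktemp files stand in for the real ftp-ingested and imars-etl-extracted data.
ORIG=$(mktemp); printf 'granule data\n' > "$ORIG"
COPY=$(mktemp); printf 'granule data\n' > "$COPY"

sum_orig=$(sha1sum "$ORIG" | cut -d' ' -f1)
sum_copy=$(sha1sum "$COPY" | cut -d' ' -f1)

# identical checksums mean the load/extract round-trip preserved the bytes
if [ "$sum_orig" = "$sum_copy" ]; then
  echo "files match"
else
  echo "files differ"
fi
```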