fixes default arg overwriting config value in load of Pipeline
rudolfix committed Sep 9, 2024
1 parent 864f642 commit 5314949
Showing 3 changed files with 2 additions and 5 deletions.
3 changes: 1 addition & 2 deletions dlt/helpers/airflow_helper.py
@@ -81,8 +81,7 @@ def __init__(
         The `data_folder` is available in certain Airflow deployments. In case of Composer, it is a location on the gcs bucket. `use_data_folder` is disabled and should be
         enabled only when needed. The operations on bucket are non-atomic and way slower than on local storage and should be avoided.
-        `abort_task_if_any_job_failed` will abort the other dlt loading jobs and fail the Airflow task in any of the jobs failed. This may put your warehouse in
-        inconsistent state. See https://dlthub.com/docs/running-in-production/running#handle-exceptions-failed-jobs-and-retry-the-pipeline.
+        `abort_task_if_any_job_failed` will abort the other dlt loading jobs and fail the Airflow task in any of the jobs failed. See https://dlthub.com/docs/running-in-production/running#handle-exceptions-failed-jobs-and-retry-the-pipeline.
         The load info and trace info can be optionally saved to the destination. See https://dlthub.com/docs/running-in-production/running#inspect-and-save-the-load-info-and-trace
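For context on the docstring change above, here is a minimal sketch of how the flag is typically passed when building an Airflow DAG with the helper (a hedged illustration; only `use_data_folder` and `abort_task_if_any_job_failed` appear in this diff, everything else is assumed typical usage):

    from dlt.helpers.airflow_helper import PipelineTasksGroup

    # sketch of assumed typical usage of the helper documented above
    tasks = PipelineTasksGroup(
        "my_pipeline_tasks",                # task group / pipeline name (illustrative)
        use_data_folder=False,              # keep staging on fast local storage
        abort_task_if_any_job_failed=True,  # fail the Airflow task if any load job fails
    )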
2 changes: 1 addition & 1 deletion dlt/pipeline/pipeline.py
@@ -550,7 +550,7 @@ def load(
         credentials: Any = None,
         *,
         workers: int = 20,
-        raise_on_failed_jobs: bool = False,
+        raise_on_failed_jobs: bool = ConfigValue,
     ) -> LoadInfo:
         """Loads the packages prepared by `normalize` method into the `dataset_name` at `destination`, optionally using provided `credentials`"""
         # set destination and default dataset if provided (this is the reason we have state sync here)
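The one-line change above is the substance of the commit: a concrete default such as `False` looks exactly like a value the caller passed, so it silently overrode any configured `raise_on_failed_jobs`. With the `ConfigValue` sentinel as the default, dlt's config injection can tell that the argument was omitted and fill it from configuration. A minimal, generic sketch of that sentinel pattern (illustrative only, not dlt's actual injection code):

    from typing import Any, Optional

    _CONFIG_VALUE: Any = object()  # sentinel meaning "not passed, resolve from config"

    def load(raise_on_failed_jobs: bool = _CONFIG_VALUE, config: Optional[dict] = None) -> bool:
        # resolve from configuration only when the caller did not pass the argument
        if raise_on_failed_jobs is _CONFIG_VALUE:
            raise_on_failed_jobs = (config or {}).get("raise_on_failed_jobs", False)
        return raise_on_failed_jobs

    assert load(config={"raise_on_failed_jobs": True}) is True          # config now takes effect
    assert load(False, config={"raise_on_failed_jobs": True}) is False  # explicit argument still wins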
2 changes: 0 additions & 2 deletions tests/pipeline/test_pipeline.py
@@ -832,7 +832,6 @@ def test_run_with_table_name_exceeding_path_length() -> None:
 
 def test_raise_on_failed_job() -> None:
     os.environ["FAIL_PROB"] = "1.0"
-    os.environ["RAISE_ON_FAILED_JOBS"] = "true" # TODO: why is this necessary?
     pipeline_name = "pipe_" + uniq_id()
     p = dlt.pipeline(pipeline_name=pipeline_name, destination="dummy")
     with pytest.raises(PipelineStepFailed) as py_ex:
@@ -951,7 +950,6 @@ def fail_extract():
     assert py_ex.value.step == "extract"
 
     os.environ["COMPLETED_PROB"] = "0.0"
-    os.environ["RAISE_ON_FAILED_JOBS"] = "true" # TODO: why is this necessary?
     os.environ["FAIL_PROB"] = "1.0"
     with pytest.raises(PipelineStepFailed) as py_ex:
         for attempt in Retrying(
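With the default now resolved from configuration, the tests above no longer need to force the flag through the `RAISE_ON_FAILED_JOBS` environment variable. A minimal sketch of exercising the same failure path against the `dummy` destination (not the repository's test; it assumes the configured default makes failed load jobs raise):

    import os
    import dlt
    import pytest
    from dlt.common.utils import uniq_id
    from dlt.pipeline.exceptions import PipelineStepFailed

    os.environ["FAIL_PROB"] = "1.0"  # dummy destination knob: make every load job fail
    p = dlt.pipeline(pipeline_name="pipe_" + uniq_id(), destination="dummy")
    with pytest.raises(PipelineStepFailed):
        p.run([{"id": 1}], table_name="items")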
