Bug 1891815 - Increase global task timeout to two hours
We have bitrisescript tasks that take longer than an hour, and it looks
like we currently have no way to set a per-script timeout (see the
comment about json-e in scriptworker.yml).

So the quick fix here is to bump the global timeout. I believe it's
extremely rare (if not impossible) for us to hit this timeout anyway,
because Kubernetes has its own timeout (`terminationGracePeriodSeconds`)
that is much shorter than this one. The only reason I even knew this
timeout existed is that I had set `terminationGracePeriodSeconds` to
`7200` for bitrisescript. All other scripts have that value set to
`3600` or less, which means they should get force-killed by k8s long
before we ever reach the scriptworker timeout.
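For context, the Kubernetes timeout mentioned above lives in the pod spec. The following is only an illustrative sketch: `terminationGracePeriodSeconds` is the real Kubernetes field, but the pod name, image, and overall manifest are assumptions, not taken from this repository.

```yaml
# Hypothetical pod spec fragment. Only terminationGracePeriodSeconds is
# taken from the commit message; everything else is illustrative.
apiVersion: v1
kind: Pod
metadata:
  name: bitrisescript-worker          # hypothetical name
spec:
  terminationGracePeriodSeconds: 7200 # k8s force-kills the pod after this
  containers:
    - name: scriptworker
      image: bitrisescript:latest     # hypothetical image
```

With this set to 3600 or less for other scripts, Kubernetes would terminate the pod before scriptworker's own 7200-second task timeout could ever fire.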
ahal committed Apr 18, 2024
1 parent a921301 commit d43ac3c
Showing 2 changed files with 2 additions and 2 deletions.
2 changes: 1 addition & 1 deletion docker.d/init.sh
@@ -113,7 +113,7 @@ if [ "$ENV" == "prod" ]; then
 fi
 export TASK_CONFIG=$CONFIG_DIR/worker.json
 export TASK_LOGS_DIR=$ARTIFACTS_DIR/public/logs
-export TASK_MAX_TIMEOUT=3600
+export TASK_MAX_TIMEOUT=7200
 export TASK_SCRIPT=$APP_DIR/bin/${PROJECT_NAME}script
 export TEMPLATE_DIR=$APP_DIR/docker.d
 export VERBOSE=true
2 changes: 1 addition & 1 deletion docker.d/scriptworker.yml
@@ -10,7 +10,7 @@ credentials:
 # artifact_upload_timeout: { "$eval": "ARTIFACT_UPLOAD_TIMEOUT" }
 # task_max_timeout: { "$eval": "TASK_MAX_TIMEOUT" }
 artifact_upload_timeout: 1200
-task_max_timeout: 3600
+task_max_timeout: 7200
 task_script:
   - { "$eval": "TASK_SCRIPT" }
   - { "$eval": "TASK_CONFIG" }
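The commented-out lines in scriptworker.yml hint at why a per-script timeout is currently awkward: the values would come from json-e `$eval` expressions resolved against the environment. As a rough illustration of what `$eval` substitution does (a minimal hand-rolled sketch, not the real json-e library), consider:

```python
# Minimal sketch of json-e-style "$eval" substitution: replace any
# {"$eval": NAME} node with the value bound to NAME in a context dict.
# This is illustrative only and not the actual json-e implementation.

def render(template, context):
    """Recursively resolve {"$eval": NAME} nodes against `context`."""
    if isinstance(template, dict):
        if set(template) == {"$eval"}:
            return context[template["$eval"]]
        return {key: render(value, context) for key, value in template.items()}
    if isinstance(template, list):
        return [render(item, context) for item in template]
    return template

config = {
    "artifact_upload_timeout": {"$eval": "ARTIFACT_UPLOAD_TIMEOUT"},
    "task_max_timeout": {"$eval": "TASK_MAX_TIMEOUT"},
}
context = {"ARTIFACT_UPLOAD_TIMEOUT": 1200, "TASK_MAX_TIMEOUT": 7200}
print(render(config, context))
# → {'artifact_upload_timeout': 1200, 'task_max_timeout': 7200}
```

If those `$eval` lines were live, each deployment could pass its own `TASK_MAX_TIMEOUT` through the environment instead of sharing one hard-coded global value.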
