
Fix error for PT backend when pytorch.distributed is not available #6838
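
The fix itself is not shown on this run page. As a rough illustration only, the usual pattern for this class of error is to guard torch.distributed usage behind availability checks; the helper below is a hypothetical sketch under that assumption, not the actual change in #6838.

# Hypothetical sketch: guard torch.distributed usage so the PT backend
# still imports and runs when the distributed package is unavailable.
import torch

try:
    import torch.distributed as dist
except ImportError:  # some builds ship without the distributed package
    dist = None


def distributed_is_ready() -> bool:
    """Return True only if torch.distributed exists, is built in, and is initialized."""
    return (
        dist is not None
        and dist.is_available()    # compiled with distributed support
        and dist.is_initialized()  # a process group has been set up
    )


def get_world_size() -> int:
    # Fall back to single-process behavior when distributed is unavailable.
    if distributed_is_ready():
        return dist.get_world_size()
    return 1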

Triggered via pull request: April 7, 2024 06:50
Status: Cancelled
Total duration: 56s
Artifacts

build_cc.yml
on: pull_request
Matrix: Build C++
Pass building C++ (1s)

Annotations (11 errors)
Build C++ (cuda120, cuda): Canceling since a higher priority waiting request for 'Build C++-refs/pull/3652/merge' exists
Build C++ (cuda120, cuda): The operation was canceled.
Build C++ (clang, clang): Canceling since a higher priority waiting request for 'Build C++-refs/pull/3652/merge' exists
Build C++ (clang, clang): The operation was canceled.
Build C++ (cuda, cuda): Canceling since a higher priority waiting request for 'Build C++-refs/pull/3652/merge' exists
Build C++ (cuda, cuda): The operation was canceled.
Build C++ (rocm, rocm): Canceling since a higher priority waiting request for 'Build C++-refs/pull/3652/merge' exists
Build C++ (rocm, rocm): The operation was canceled.
Build C++ (cpu, cpu): Canceling since a higher priority waiting request for 'Build C++-refs/pull/3652/merge' exists
Build C++ (cpu, cpu): The operation was canceled.
Pass building C++: Process completed with exit code 1.