fix(pt): set device for PT C++ (deepmodeling#4261)
Fix deepmodeling#4171.

<!-- This is an auto-generated comment: release notes by coderabbit.ai
-->
## Summary by CodeRabbit

- **New Features**
  - Improved GPU initialization to ensure the correct device is utilized.
  - Enhanced error handling for clearer context on exceptions.

- **Bug Fixes**
  - Updated error handling in multiple methods to catch and rethrow specific exceptions (see the sketch after this summary).
  - Added logic to handle communication-related tensors during computation.
<!-- end of auto-generated comment: release notes by coderabbit.ai -->
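
The "catch and rethrow" item above refers to error-handling changes that are not visible in the three-line diff below. A minimal illustrative sketch of that pattern, assuming libtorch's `c10::Error` as the caught exception type; the helper name and message format here are assumptions, not code from this commit:

```cpp
// Illustrative sketch only: wrap a call so that libtorch exceptions are
// rethrown with extra context about where they occurred. The helper name,
// exception choice, and message format are assumptions, not this commit's code.
#include <stdexcept>
#include <string>
#include <torch/torch.h>

template <typename Func>
auto with_error_context(const std::string& context, Func&& f) {
  try {
    return f();
  } catch (const c10::Error& e) {
    // Preserve the original message but prefix it with the calling context.
    throw std::runtime_error(context + ": " + e.what());
  }
}
```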

---------

Signed-off-by: Jinzhe Zeng <[email protected]>
njzjz authored Oct 28, 2024
Parent: 39cddd4 · Commit: 04e1159
Showing 1 changed file with 3 additions and 0 deletions.
`source/api_cc/src/DeepPotPT.cc` (3 additions, 0 deletions):

```diff
@@ -80,6 +80,9 @@ void DeepPotPT::init(const std::string& model,
     device = torch::Device(torch::kCPU);
     std::cout << "load model from: " << model << " to cpu " << std::endl;
   } else {
+#if GOOGLE_CUDA || TENSORFLOW_USE_ROCM
+    DPErrcheck(DPSetDevice(gpu_id));
+#endif  // GOOGLE_CUDA || TENSORFLOW_USE_ROCM
     std::cout << "load model from: " << model << " to gpu " << gpu_id
               << std::endl;
   }
```
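
For context, a minimal sketch of what the added guard accomplishes, assuming a CUDA build. `DPSetDevice` is DeePMD-kit's own wrapper and is not reproduced here; the `cudaSetDevice` call below only illustrates the underlying effect of selecting the target GPU before the TorchScript model is loaded:

```cpp
// Minimal sketch, assuming a CUDA build: select the target GPU before loading
// the TorchScript model so allocations made during loading land on that card.
// cudaSetDevice illustrates what a DPSetDevice-style wrapper does; the actual
// DeePMD-kit implementation may differ.
#include <cuda_runtime.h>
#include <torch/script.h>

#include <iostream>
#include <stdexcept>
#include <string>

torch::jit::Module load_on_gpu(const std::string& model_path, int gpu_id) {
  cudaError_t err = cudaSetDevice(gpu_id);  // make gpu_id the active device
  if (err != cudaSuccess) {
    throw std::runtime_error(cudaGetErrorString(err));
  }
  torch::Device device(torch::kCUDA, gpu_id);
  std::cout << "load model from: " << model_path << " to gpu " << gpu_id
            << std::endl;
  return torch::jit::load(model_path, device);
}
```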
