
Perf: load data systems on rank 0 #4478

Open
wants to merge 9 commits into base: devel
25 changes: 11 additions & 14 deletions deepmd/pt/utils/dataloader.py
@@ -87,26 +87,23 @@
         with h5py.File(systems) as file:
             systems = [os.path.join(systems, item) for item in file.keys()]

-        self.systems: list[DeepmdDataSetForLoader] = []
-        if len(systems) >= 100:
-            log.info(f"Constructing DataLoaders from {len(systems)} systems")
-
         def construct_dataset(system):
             return DeepmdDataSetForLoader(
                 system=system,
                 type_map=type_map,
             )

-        with Pool(
-            os.cpu_count()
-            // (
-                int(os.environ["LOCAL_WORLD_SIZE"])
-                if dist.is_available() and dist.is_initialized()
-                else 1
-            )
-        ) as pool:
-            self.systems = pool.map(construct_dataset, systems)
+        self.systems: list[DeepmdDataSetForLoader] = []
+        global_rank = dist.get_rank() if dist.is_initialized() else 0
+        if global_rank == 0:
+            log.info(f"Constructing DataLoaders from {len(systems)} systems")
+            with Pool(os.cpu_count()) as pool:
+                self.systems = pool.map(construct_dataset, systems)
+        else:
+            self.systems = [None] * len(systems)  # type: ignore

[Codecov / codecov/patch warning on deepmd/pt/utils/dataloader.py#L103: added line was not covered by tests]
+        if dist.is_initialized():
+            dist.broadcast_object_list(self.systems)
+            assert self.systems[-1] is not None

[Codecov / codecov/patch warning on deepmd/pt/utils/dataloader.py#L105-L106: added lines were not covered by tests]
self.sampler_list: list[DistributedSampler] = []
self.index = []
self.total_batch = 0
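The pattern this PR introduces — only rank 0 pays the dataset-construction cost, and every other rank receives the finished objects over the process group — can be sketched in isolation as below. This is a hedged illustration, not the PR's code: `construct` and its dict payload are hypothetical stand-ins for `DeepmdDataSetForLoader` (any picklable object works), and the function mirrors the diff's control flow.

```python
import os
from multiprocessing import Pool

import torch.distributed as dist


def construct(system):
    # Hypothetical stand-in for DeepmdDataSetForLoader; any picklable object works.
    return {"system": system}


def load_on_rank0(systems):
    """Construct one object per system on rank 0, then broadcast to all ranks."""
    global_rank = dist.get_rank() if dist.is_initialized() else 0
    if global_rank == 0:
        # Only rank 0 does the expensive construction, parallelized over
        # local worker processes.
        with Pool(os.cpu_count()) as pool:
            objs = pool.map(construct, systems)
    else:
        # Other ranks allocate same-length placeholders; broadcast_object_list
        # overwrites them in place with rank 0's pickled objects.
        objs = [None] * len(systems)
    if dist.is_initialized():
        dist.broadcast_object_list(objs, src=0)
        # Sanity check that the broadcast actually filled the placeholders.
        assert objs[-1] is not None
    return objs
```

Note that `torch.distributed.broadcast_object_list` pickles on the source rank and unpickles on the receivers, so the constructed objects must be picklable; outside an initialized process group the function degrades gracefully to plain rank-0 behavior.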