
Memory Issues on Large Networks #51

Open
yizhang-zoey opened this issue Dec 23, 2024 · 0 comments
yizhang-zoey commented Dec 23, 2024

I am applying your algorithm to a large network (a series of snapshots with 15092 nodes and 458198 events).

Compared to the wild_mice_network data (with 437 nodes and 5751692 events), my network has far fewer events but many more nodes.
When I ran it in Colab, execution took several hours and eventually failed with insufficient RAM.

I would like to know the time/space complexity of your algorithm.
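For reference, here is a back-of-the-envelope memory estimate I did. It assumes, hypothetically, that the algorithm materializes a dense node-by-node float64 matrix at some point; I have not verified that against your code:

```python
# Rough memory estimate for one dense N x N float64 matrix.
# The assumption that such a matrix is built is mine, not from the repo.
n_nodes = 15092
bytes_per_float = 8
dense_matrix_bytes = n_nodes ** 2 * bytes_per_float
print(f"{dense_matrix_bytes / 1e9:.2f} GB per dense matrix")  # → 1.82 GB
```

If several such matrices (or temporaries) exist at once, that alone could exhaust a standard Colab instance.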

Is there any possible way to simplify it?
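One workaround I am considering is to uniformly subsample the events before running the algorithm. This is a crude approximation, not part of your method, and `subsample_events` is my own helper:

```python
import random

def subsample_events(events, fraction=0.1, seed=0):
    """Keep roughly `fraction` of the events, chosen uniformly at random.

    A crude way to shrink the input; the result only approximates what
    the full event stream would give.
    """
    rng = random.Random(seed)
    return [e for e in events if rng.random() < fraction]

# Synthetic stand-in for my event list: (timestamp, source, target) tuples.
events = [(t, t % 100, (t * 7) % 100) for t in range(458198)]
small = subsample_events(events, fraction=0.1)
print(len(small))  # roughly 10% of the 458198 original events
```

Would subsampling like this badly bias the results, or is there a principled reduction you would recommend instead?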

Thanks.
