I am applying your algorithm to a large network (a series of snapshots with 15,092 nodes and 458,198 events).
Compared to the wild_mice_network data (437 nodes and 5,751,692 events), the number of events is not large, but the number of nodes is much larger.
When I ran it in Colab, it took several hours and eventually failed with insufficient RAM.
I would like to know the time/space complexity of your algorithm.
Is there any way to simplify it?
Thanks.
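In case it helps while waiting for an answer: one way to estimate the scaling empirically is to run the algorithm on subnetworks of increasing size, time each run, and fit the exponent on a log-log scale. This is a minimal sketch; `run_algorithm` here is a hypothetical stand-in (it just simulates quadratic work), not the actual method from this repository.

```python
import time
import numpy as np

def run_algorithm(n_nodes):
    # Hypothetical stand-in for the real algorithm: any callable that
    # processes a network with `n_nodes` nodes. This dummy does O(n^2)
    # work so the fit below has something measurable.
    total = 0
    for i in range(n_nodes):
        for j in range(n_nodes):
            total += i * j
    return total

# Time the algorithm on increasing subnetwork sizes.
sizes = [100, 200, 400, 800]
times = []
for n in sizes:
    start = time.perf_counter()
    run_algorithm(n)
    times.append(time.perf_counter() - start)

# Fit log(time) = k * log(n) + c; the slope k estimates the exponent,
# e.g. k close to 2 suggests O(n^2) scaling in the number of nodes.
k, c = np.polyfit(np.log(sizes), np.log(times), 1)
print(f"estimated exponent: {k:.2f}")
```

Doubling the subnetwork size a few times is usually enough to tell linear from quadratic scaling before committing hours of Colab time to the full 15,092-node network.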