I am visualizing a file tree structure with pack. This works great when files are spread throughout the file tree, but the packing algorithm becomes really slow if there are a lot of files in a single folder.
For reference, when I try to pack 40k items, it takes about 10 seconds.
Is there a way to make packing items faster when there are a lot of items in the same folder?
I have summed and sorted the hierarchy.
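Roughly, the setup looks like this (a sketch rather than my exact code; the `size` field and the layout dimensions are placeholders):

```js
import { hierarchy, pack } from "d3-hierarchy";

// Assumed tree shape: { name, children: [...] } for folders,
// { name, size } for files. "size" stands in for the real accessor.
const root = hierarchy(fileTree)
  .sum(d => d.size || 0)                  // leaf value = file size
  .sort((a, b) => b.value - a.value);     // larger circles first

// 1000×1000 is an arbitrary choice for this sketch.
pack().size([1000, 1000]).padding(2)(root);

// root.descendants() now carries x, y, r for every node.
```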
The packing algorithm is described in this essay: https://observablehq.com/@d3/d3-packenclose, and the part that packs all "siblings" (circles that must be packed together at the same level) is implemented here.
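For what it's worth, that sibling-packing step is exposed as d3.packSiblings, so you can experiment with it in isolation (the radii below are just an example):

```js
import { packSiblings } from "d3-hierarchy";

// packSiblings mutates each circle, assigning x and y so the
// circles are tightly packed without overlap.
const circles = Array.from({ length: 5 }, (_, i) => ({ r: 10 + i * 5 }));
packSiblings(circles);
console.log(circles); // [{ r: 10, x: …, y: … }, …]
```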
Barring new ideas to make the algorithm or the implementation faster, here are a few oblique suggestions:
If the file structure is always the same, you could run the algorithm once, and save its results in a file.
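For example, something along these lines could run once at build time (a sketch, assuming Node and the same { name, children, size } shape as above; the file names are placeholders):

```js
// precompute-layout.js — run once, e.g. `node precompute-layout.js`
import { readFileSync, writeFileSync } from "fs";
import { hierarchy, pack } from "d3-hierarchy";

const fileTree = JSON.parse(readFileSync("file-tree.json", "utf8"));

const root = hierarchy(fileTree)
  .sum(d => d.size || 0)
  .sort((a, b) => b.value - a.value);

pack().size([1000, 1000]).padding(2)(root);

// Keep only what the visualization needs: one flat record per node.
const layout = root.descendants().map(d => ({
  name: d.data.name,
  depth: d.depth,
  x: d.x,
  y: d.y,
  r: d.r
}));

writeFileSync("layout.json", JSON.stringify(layout));
```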
If the structure is always different, maybe the pain comes from the fact that the browser hangs during computation? In that case you could pass it to a web worker. You will still need to wait 10s before you have the result, but the browser could show something else during that time (maybe an approximation computed from the top 10% of the circles…).
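A minimal sketch of the worker approach, assuming a bundler or browser that supports module workers (draw() stands in for whatever rendering code you already have):

```js
// pack-worker.js
import { hierarchy, pack } from "d3-hierarchy";

self.onmessage = ({ data: fileTree }) => {
  const root = hierarchy(fileTree)
    .sum(d => d.size || 0)
    .sort((a, b) => b.value - a.value);
  pack().size([1000, 1000]).padding(2)(root);
  // Post back a flat array of just the fields the renderer needs;
  // it is much smaller than the full hierarchy.
  self.postMessage(root.descendants().map(d => ({
    name: d.data.name, x: d.x, y: d.y, r: d.r
  })));
};

// main.js
const worker = new Worker(new URL("./pack-worker.js", import.meta.url), { type: "module" });
worker.onmessage = ({ data: layout }) => draw(layout); // your rendering code
worker.postMessage(fileTree);                          // UI stays responsive meanwhile
```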
Since this is an iterative algorithm, it might be possible to tweak the implementation to yield after every new circle has been added to the pack. So instead of staring for 10s at a blank screen, you would spend the same 10s seeing the structure emerge.
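I haven't written that modification, but the consuming side could look roughly like this, assuming a hypothetical packSiblingsIncremental generator (not part of d3 today) that yields each circle as soon as it is placed:

```js
// Hypothetical API: a modified copy of d3-hierarchy's sibling packing
// that yields each circle right after it is placed on the front chain.
import { packSiblingsIncremental } from "./pack-incremental.js"; // does not exist in d3

function packProgressively(circles, draw) {
  const it = packSiblingsIncremental(circles);
  const placed = [];
  (function tick() {
    // Place a small batch per frame so the page keeps painting.
    for (let i = 0; i < 200; ++i) {
      const { value, done } = it.next();
      if (done) return draw(placed); // final frame
      placed.push(value);
    }
    draw(placed);                    // partial result so far
    requestAnimationFrame(tick);
  })();
}
```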
The algorithm appears to be O(N^2) in time, and about 4 times slower when the nodes are sorted by descending radius (the largest in the center) than when the smaller nodes are in the middle.
On my computer I see 1.6s for 40k nodes in the first case and 0.4s in the second; I need 135k sorted nodes to get to 10s, and at that point it's the visualization itself that starts to slow things down.
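For anyone who wants to reproduce this, timing d3.packSiblings directly on synthetic radii is enough (a sketch, not my exact benchmark; numbers will vary by machine):

```js
import { packSiblings } from "d3-hierarchy";

function timePack(n, compare) {
  // Random radii, optionally pre-sorted, matching the two cases above.
  const circles = Array.from({ length: n }, () => ({ r: 1 + Math.random() * 9 }));
  if (compare) circles.sort(compare);
  const start = performance.now();
  packSiblings(circles);
  return performance.now() - start;
}

console.log("descending radii:", timePack(40000, (a, b) => b.r - a.r), "ms");
console.log("ascending radii: ", timePack(40000, (a, b) => a.r - b.r), "ms");
```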