Memory leak for large files #605
Comments
What packages should I install to load this?
After some first tests, I see a significant increase in (remaining) memory usage after the first load, but it does not increase further after that.
Sorry, yeah, the package that this data originates from is not public. I can try to see if the same problem occurs with “random” data. I wonder whether this is a Linux vs. Windows issue due to differences in how mmapping is handled? The machine that this occurs on has 30 GB RAM, which should be more than enough to find non-fragmented memory blocks for this. Another peculiarity of the machine this happened on is that it doesn’t have any swap.
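One thing that might be worth trying on the Linux machine, assuming glibc's malloc is what holds on to the freed pages, is to ask the allocator to return them to the OS explicitly. This is a generic Linux diagnostic, not something specific to JLD2:

```julia
using JLD2

# Sketch: load, drop the reference, force a full GC, then ask glibc's
# allocator to hand freed pages back to the OS. malloc_trim is
# glibc-only, so guard the call.
foo = load_object("training_data.jld2")
foo = nothing
GC.gc(true)
if Sys.islinux()
    # int malloc_trim(size_t pad) — returns 1 if memory was released
    ccall(:malloc_trim, Cint, (Csize_t,), 0)
end
```

If memory usage drops after the `malloc_trim` call, the pages were freed by Julia but merely retained by the allocator, rather than leaked.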
I also tested on Linux.
Hi @lassepe, I saw that your file contained a single large dictionary. If you were to split the dict into smaller datasets …
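For illustration, a minimal sketch of what splitting the dict into per-key datasets could look like with the standard JLD2 API (the keys, values, and file name here are made up, not taken from the actual file):

```julia
using JLD2

# Hypothetical data standing in for the real dictionary.
data = Dict("weights" => rand(10_000), "labels" => rand(10_000))

# Write each entry as its own dataset instead of one monolithic dict.
jldopen("training_data_split.jld2", "w") do f
    for (k, v) in data
        f[k] = v
    end
end

# Datasets can then be read (and garbage-collected) independently:
weights = jldopen("training_data_split.jld2", "r") do f
    f["weights"]
end
```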
The following file created with JLD2 0.4.53 causes memory leaks on my system (Ubuntu 22.04, Julia 1.11.0-rc3):
https://drive.google.com/file/d/1_mjdRDD-DhrEsLoVy31sDis5sGpRo-mW/view?usp=sharing
Specifically, if I load the contained file as

```julia
foo = load_object("training_data.jld2")
```

and then do

```julia
foo = nothing; GC.gc(true)
```

the memory is never freed again. Hence, after a few consecutive loads, my Julia session goes OOM.
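For reference, a loop along these lines should reproduce the observation, assuming the file from the link above. `Sys.maxrss()` is a high-water mark, so it would plateau after the first iteration if the freed memory were actually being reused:

```julia
using JLD2

# Load and drop the object repeatedly. If freed memory were reused,
# the resident-set high-water mark would plateau after the first
# iteration; on the affected setup it keeps growing instead.
for i in 1:5
    foo = load_object("training_data.jld2")
    foo = nothing
    GC.gc(true)
    @info "after load $i" maxrss_GiB = Sys.maxrss() / 2^30
end
```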