Jupyter extension starts multiple idle R kernels, clogging up memory #9409
Replies: 2 comments 2 replies
-
@pehkawn Could you pull the contents of the Jupyter tab of the output window, ideally after running a few R notebook files and then closing them? (The output can be pretty big, so saving it to a file and attaching that might help.) In general, kernels should shut down in VS Code the same way they would with Jupyter notebook, and the output logging might show if something is failing during this shutdown process. Also, I wasn't quite sure: are these kernels being spawned from working with just one file, or are you opening and closing multiple R notebook files in your workflow?
-
Closing in favor of tracking as a bug: #9700
-
I've been running R using the Jupyter notebook extension for some time, and for the most part it works great. However, I have an issue with how it chews up my memory. In my system monitor, I can see multiple idle R kernels loaded in memory. They use 0% of the CPU, indicating they are just sitting there doing nothing. During extensive use, the number of R kernels keeps growing until they take up all my available memory. Running garbage collection (`gc()`) or clearing variables (`rm(list = ls())`) does nothing to free up memory, and the only way to reclaim it is to terminate the processes. I cannot imagine this is working as intended, but I am uncertain whether this is a bug or a matter of changing a setting. Any input on this would be appreciated.

Version info:
VS Code: 1.65.2
Jupyter extension: v2022.2.1030672458
R: 4.1.3
Anaconda: 4.11.0
OS: Ubuntu 20.04.4 LTS
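As a stopgap until the shutdown behavior is fixed, the leftover R kernel processes can be located and terminated from a shell. A minimal sketch, assuming the kernels were launched via IRkernel (so their command line contains `IRkernel`); the `pkill` line is left commented out so nothing is killed before the matches are reviewed:

```shell
# List lingering R kernel processes with their PIDs and full command lines.
# Each idle kernel is a separate R session holding its own memory.
pgrep -af "IRkernel" || echo "no idle R kernels found"

# After confirming the matches are stray kernels, terminate them:
# pkill -f "IRkernel"
```

Note that `pkill -f` matches the entire command line, so it would also kill a kernel that is still attached to an open notebook; checking the `pgrep` output first is safer than killing blindly.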