Hi - I am running json2csv on some large JSON files (some as large as 2.5 GB) and am getting a Windows low-memory popup like this:
"
Close programs to prevent information loss
Your computer is low on memory. Save your files and close these programs:
python.exe
Windows will only close enough programs to restore needed memory.
"
json2csv stops at that point and no CSV is created. Is there a way to have Python keep parsing despite the memory pressure?
Note that I keep all other programs closed while running json2csv.
Thanks,
SJB
Hi, as you can tell I don't have much time for open source projects these days.
I'd started a fix for this issue in https://github.com/evidens/json2csv/tree/13-direct-transcription and I've been told it works. I didn't get around to writing tests for it, which is why I didn't merge it into master.
Sorry if this reaches you too late for your project. Good luck.
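For anyone else hitting this: the general way around the memory ceiling is to parse the JSON incrementally and write each CSV row as soon as its record is read, instead of loading the whole document first. Below is a minimal sketch of that idea using the ijson library; it is not the code in the 13-direct-transcription branch, and the file names and field list are made up for illustration.

```python
import csv
import ijson  # pip install ijson


def stream_json_to_csv(json_path, csv_path, fields):
    """Convert a top-level JSON array to CSV without loading it all into memory."""
    with open(json_path, "rb") as infile, open(csv_path, "w", newline="") as outfile:
        writer = csv.DictWriter(outfile, fieldnames=fields, extrasaction="ignore")
        writer.writeheader()
        # ijson.items(..., "item") yields the elements of a top-level array
        # one at a time, so memory use stays roughly constant regardless of
        # how large the input file is.
        for record in ijson.items(infile, "item"):
            writer.writerow({f: record.get(f, "") for f in fields})


# Hypothetical usage:
# stream_json_to_csv("big_input.json", "output.csv", ["id", "name", "value"])
```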
Your fix seems to have worked perfectly. Instead of maxing out my memory, the program stayed at around 7 GB the entire time. I was able to parse JSON files as big as 2.2 GB with fairly deep nesting.