So the solution to this issue would be to stagger reading and writing the data from the table during export: limit the number of rows read per query, write that batch to a file resource, then read and write the next batch, and repeat until the table is done. This is a bigger refactoring, and I'm unsure whether our Filesystem package properly supports this.
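A minimal sketch of what such a staggered export loop could look like, assuming the standard #__finder_terms table and Joomla's database API; the batch size, the target file path, and the use of plain PHP stream functions (rather than the Filesystem package, whose append support is exactly the open question) are illustrative assumptions, not the actual implementation:

```php
<?php
// Sketch only: batch size, output path and CSV formatting are assumptions.
use Joomla\CMS\Factory;

$db        = Factory::getDbo();
$batchSize = 10000;
$offset    = 0;

// Plain stream handle used here as a stand-in for whatever the
// Filesystem package would provide for appending chunks to a file.
$handle = fopen(JPATH_ROOT . '/tmp/finder_terms_export.csv', 'wb');

do {
    $query = $db->getQuery(true)
        ->select('*')
        ->from($db->quoteName('#__finder_terms'));

    // Read only one batch of rows into memory at a time.
    $db->setQuery($query, $offset, $batchSize);
    $rows = $db->loadAssocList();

    foreach ($rows as $row) {
        fputcsv($handle, $row);
    }

    $offset += $batchSize;
} while (count($rows) === $batchSize);

fclose($handle);
```

Because each iteration only holds one batch of rows, peak memory usage stays roughly constant regardless of how many rows the table contains.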
Steps to reproduce the issue
Default PHP 7.4 configuration from XAMPP, 389597 rows in #_finder_terms (127.9 MB on disk).

Actual result
System information (as much as possible)
Additional comments
I know that I can reconfigure the server, but I think we should find a solution that requires less memory.