My dataset is only around 25 GB in memory, but after a few hours of training the process is already using more than 100 GB, and usage keeps growing slowly but steadily. I'm using a custom dataset created with tf.data.Dataset.from_generator. Do you know where the problem could be?
I'm training on a single GPU (RTX 3090) with custom CUDA code, a batch size of 4, and no labels. The rest is more or less the original code.
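One way to narrow this down is to check whether the leak comes from the generator itself rather than from TensorFlow or the training loop. Below is a minimal, hedged sketch using only the standard-library tracemalloc module; `sample_generator` is a hypothetical stand-in for the custom generator passed to tf.data.Dataset.from_generator, not the actual code from this issue.

```python
import itertools
import tracemalloc

def sample_generator():
    # Hypothetical stand-in for the custom generator fed to
    # tf.data.Dataset.from_generator: yields a "batch" of 4 floats, no labels.
    for i in itertools.count():
        yield [float(i)] * 4

# Measure traced heap usage after two equal-sized consumption phases.
# If the delta between phases keeps growing across runs, the generator
# (or objects it captures, e.g. a growing cache or list) is retaining
# memory; if it stays flat, look elsewhere (e.g. dataset caching,
# per-step graph retracing, or the input pipeline's prefetch buffers).
tracemalloc.start()
gen = sample_generator()
for _ in range(10_000):
    next(gen)
after_first, _ = tracemalloc.get_traced_memory()
for _ in range(10_000):
    next(gen)
after_second, _ = tracemalloc.get_traced_memory()
tracemalloc.stop()

growth = after_second - after_first
print(growth)  # near zero here, since nothing is retained between yields
```

If the real generator shows steady growth under this kind of probe, common culprits are accumulating Python lists, un-closed file handles, or NumPy arrays kept alive by closures; tracemalloc.take_snapshot() with statistics("lineno") can then point at the allocating line.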