Describe the issue
Hello everyone,
I have a question about the best practice for rendering a large dataset. In my case I have the objects I want to create labels for, plus background objects whose appearance changes for every image: either the image plugged into the Principled BSDF is swapped, or the hue of simple objects is changed.
From my understanding I can't use keyframes, because I change the images/materials of the background objects between frames. Basically it is a plane with changing COCO images (or cc textures) as background. Please correct me if I'm wrong!
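To make it concrete, this is roughly what happens for every generated image (simplified sketch; `bg_plane` is the background plane loaded earlier and `bg_image_paths` a list of COCO image paths, both placeholders for my actual setup):

```python
import bpy
import random

def randomize_background(bg_plane, bg_image_paths):
    # Load a random background image and plug it into the Base Color input of the
    # plane's Principled BSDF, replacing the previous texture.
    image = bpy.data.images.load(filepath=random.choice(bg_image_paths))
    for mat in bg_plane.get_materials():
        mat.set_principled_shader_value("Base Color", image)
```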
After rendering more than 5000 images into one folder I hit a big bottleneck in reading/writing the COCO json file after every newly rendered image: roughly 2 seconds of rendering followed by 10+ seconds of reading/writing the json on a single core. Reading/writing with multiple cores would definitely be really nice (like the BOP writer now, #994).
So, is there a way to avoid this, and what is the best way to create big datasets time-efficiently when you can't use keyframes?
Maybe store the outputs of multiple "render" and "render_segmap" calls and write several images at once with "write_coco_annotations"? (See the sketch below for what I mean.)
Somewhere else in the issues here I read that it is not recommended to render a lot of images with one script. Does this only apply to rendering many keyframes with a single "render" call, or also to multiple "render" calls in one python file?
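This is roughly what I had in mind, an untested sketch along the lines of the coco_annotations example (`randomize_background()` is the helper from above; `bg_plane`, `bg_image_paths`, `sample_camera_pose()` and the chunk size are placeholders for my actual setup):

```python
import blenderproc as bproc

bproc.init()
# ... scene setup: background plane, target objects, lights, etc. ...

CHUNK_SIZE = 100
output_dir = "output/coco_data"

colors, segmaps, attr_maps = [], [], []
for i in range(CHUNK_SIZE):
    bproc.utility.reset_keyframes()                  # start again from frame 0
    randomize_background(bg_plane, bg_image_paths)   # swap texture, so no keyframes possible
    bproc.camera.add_camera_pose(sample_camera_pose())

    data = bproc.renderer.render()
    seg_data = bproc.renderer.render_segmap(map_by=["instance", "class", "name"])

    colors.extend(data["colors"])
    segmaps.extend(seg_data["instance_segmaps"])
    attr_maps.extend(seg_data["instance_attribute_maps"])

# Write the whole chunk at once, so the growing COCO json is only read/written
# once per chunk instead of once per image.
bproc.writer.write_coco_annotations(
    output_dir,
    instance_segmaps=segmaps,
    instance_attribute_maps=attr_maps,
    colors=colors,
    append_to_existing_output=True,
)
```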
Best regards and thanks in advance!
Minimal code example
No response
Files required to run the code
No response
Expected behavior
Writing the COCO json file on multiple cores.
BlenderProc version
2.6.2
I don't think the problem is only reading/writing the json file. I ran a python script overnight and observed the following:
When I started the script and generated images into a new folder (new json), I could generate roughly 30-35 images per minute.
After the script had been running for 15 hours, I could only generate roughly 4 images per minute, even into a new folder (new json).
I observed this with two python scripts that I ran in parallel overnight.
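For now I am considering driving the generation from a small outer script that restarts BlenderProc for every chunk, so a single Blender process never runs long enough to slow down this much (just a sketch; generate.py, its arguments and the chunk sizes are placeholders for my actual generation script):

```python
import subprocess

CHUNK_SIZE = 250
NUM_CHUNKS = 40  # 10,000 images in total

for chunk in range(NUM_CHUNKS):
    # Each call starts a fresh Blender process; the generation script appends its
    # chunk to the existing COCO json via append_to_existing_output=True.
    subprocess.run(
        ["blenderproc", "run", "generate.py",
         "--num-images", str(CHUNK_SIZE),
         "--output-dir", "output/coco_data"],
        check=True,
    )
```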