
Blender crashes after running out of memory #11

Open

Mathboy19 opened this issue Sep 28, 2015 · 2 comments

@Mathboy19

When trying to "Visualize in Blender", Blender eats up all of my RAM and swap, and then, when both are at 100%, Blender crashes and returns me to the main GUI. (I have 4 GB of RAM and 4 GB of swap.)

Console output:

Color management: using fallback mode for management
connect failed: No such file or directory
Malloc returns null: len=420642816 in dm_dupLoopArray tmp, total 844526948
Malloc returns null: len=157741056 in dm_dupPolyArray tmp, total 844526948
Writing: /tmp/blender.crash.txt
Segmentation fault (core dumped)
Color management: using fallback mode for management
connect failed: No such file or directory
Malloc returns null: len=420642816 in dm_dupLoopArray tmp, total 845117648
Killed
Color management: using fallback mode for management
connect failed: No such file or directory
Killed

As you can see, I tried running "Visualize" three times, and all three attempts failed. Note that:

Color management: using fallback mode for management
connect failed: No such file or directory

seems to be a Blender thing, as it also happens when I run Blender outside of GUI.py.

Oh, and the in-GUI console returns the normal "Done, waiting for command" after Blender crashes.

I would think it should be possible to limit the amount of memory that Blender uses, although I'm no expert. Or is there maybe a way to reduce the size of the map, and therefore the memory usage?

Thanks in advance.

P.S.:
The first time I set up GUI.py I had to change the variable path in jsontools.py from

path="/home/jonathan/procedural_city_generation/procedural_city_generation"

to

path="/home/mathboy19/procedural_city_generation/procedural_city_generation"

and it would be nice to have that automated in some way.
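One way this could be automated (just a sketch; it assumes the code lives in jsontools.py inside the procedural_city_generation package directory, and only the variable name path comes from the actual file):

# Sketch: derive the package path at runtime instead of hardcoding it.
# Assumes this code sits in jsontools.py inside the
# procedural_city_generation package directory; only the variable name
# "path" is taken from the real file.
import os

# e.g. /home/<user>/procedural_city_generation/procedural_city_generation
path = os.path.dirname(os.path.abspath(__file__))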

@josauder
Owner

Thanks a ton for writing this up!

I fixed the path problem in jsontools and was able to reproduce the blender segfault when running it with a very large map.

You can change the map size with the border attribute by clicking on "options" next to roadmap.
If this button does not exist (a problem I found when testing on Windows yesterday), you can run

python UI.py roadmap --configure border [15,15]

This should halve the side length of the map (and therefore at least quarter the amount of memory used).

For a list of parameters (and a short description of each), see "_params.py" in every submodule. Each of those parameters should be configurable with the command above, in the form:

python UI.py <modulename> --configure <parameter name> <value>
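To make the mechanism concrete, here is a rough sketch of how such a --configure call could parse and persist a value; the JSON storage, paths, and function name are illustrative assumptions, not the project's actual UI.py implementation:

# Sketch of a --configure handler; the JSON storage, paths, and names are
# illustrative assumptions, not the project's actual UI.py code.
import ast
import json
import os
import sys

def configure(module_name, param_name, raw_value, config_dir="configs"):
    """Persist one parameter override for a submodule as JSON."""
    # Turn strings like "15" or "[15,15]" into Python objects; keep
    # anything unparsable as a plain string.
    try:
        value = ast.literal_eval(raw_value)
    except (ValueError, SyntaxError):
        value = raw_value

    os.makedirs(config_dir, exist_ok=True)
    config_path = os.path.join(config_dir, module_name + ".json")
    params = {}
    if os.path.exists(config_path):
        with open(config_path) as f:
            params = json.load(f)
    params[param_name] = value
    with open(config_path, "w") as f:
        json.dump(params, f, indent=2)

if __name__ == "__main__":
    # e.g. python configure_sketch.py roadmap border [15,15]
    configure(sys.argv[1], sys.argv[2], sys.argv[3])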

I'm working on a way to make this more intuitive or at least document it properly somewhere.

If you run into more issues, please keep posting them here. Thanks again! :)

@Mathboy19
Author

Awesome, got it working for now, thanks!

I wonder if there is any way to limit Blender's memory usage? It may not be possible, but it would be very helpful.
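(For reference, a minimal sketch of one approach on Linux: if Blender is launched from Python via subprocess, its address space can be capped with resource.setrlimit before the process starts, so an oversized map fails fast instead of exhausting RAM and swap. The 3 GB cap and the Blender command line below are placeholder assumptions, not values used by this project:

# Sketch: cap Blender's address space on Linux so a huge map fails fast
# instead of exhausting RAM and swap. The 3 GB limit and the Blender
# command line are placeholder assumptions, not part of this project.
import resource
import subprocess

LIMIT_BYTES = 3 * 1024 ** 3  # ~3 GB

def limit_memory():
    # Runs in the child process just before Blender is exec'd.
    resource.setrlimit(resource.RLIMIT_AS, (LIMIT_BYTES, LIMIT_BYTES))

subprocess.call(
    ["blender", "--python", "visualize.py"],  # placeholder invocation
    preexec_fn=limit_memory,
)

Blender will likely still abort once the cap is hit, but the rest of the system stays responsive instead of swapping to a crawl.)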
