
Predict on batch #195

Open
LKchemposer opened this issue Mar 16, 2022 · 3 comments
Labels: enhancement (New feature or request)


@LKchemposer

Is there a way to perform batch prediction to leverage the GPU? I believe this functionality exists in the JS version, but I am not sure how to do it in this Python version.

@de-code (Owner) commented Mar 17, 2022

It wouldn't be difficult to add a predict-on-batch function.
Internally (except for TF Lite, I believe), it is using a batch already.
I think where it becomes a bit tricky is following it through the post-processing.

e.g. currently with a single image it looks like this:

from tf_bodypix.api import download_model, load_model, BodyPixModelPaths

# load a model as shown in the project README
bodypix_model = load_model(download_model(BodyPixModelPaths.MOBILENET_FLOAT_50_STRIDE_16))

# image_array: an (H, W, 3) image tensor or numpy array
result = bodypix_model.predict_single(image_array)

# simple mask
mask = result.get_mask(threshold=0.75)

# colored mask (separate colour for each body part)
colored_mask = result.get_colored_part_mask(mask)

For a batch it would then be something like:

batch_result = bodypix_model.predict_batch(image_array_batch)

# simple mask
mask_batch = batch_result.get_mask_batch(threshold=0.75)

# colored mask (separate colour for each body part)
colored_mask_batch = batch_result.get_colored_part_mask_batch(mask_batch)
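
As an illustration only (predict_batch does not exist in the library, and the names and shapes below are assumptions), the wrapper could be sketched roughly like this:

import numpy as np

def predict_batch(model_fn, image_arrays, threshold=0.75):
    # Hypothetical sketch: model_fn is assumed to map a float32 batch
    # of shape (N, H, W, 3) to per-pixel scores of shape (N, H, W, 1).
    # All images are assumed to share the same resolution.
    batch = np.stack(image_arrays, axis=0).astype(np.float32)
    scores = model_fn(batch)
    # Post-processing stays per image: threshold each score map into
    # a binary mask, mirroring result.get_mask(threshold=...).
    return [(score >= threshold).astype(np.uint8) for score in scores]

The forward pass happens once for the whole batch; the per-image loop only remains for the post-processing step mentioned above.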

I also wonder whether there would be any noticeable speed improvement.

What have you observed when using the JS version?
And what is your use-case?

@LKchemposer (Author)

Thank you for the reply.

My current use case is measuring BodyPix's performance over a dataset of 10,000+ images, but given the high image resolution (1080p), inference takes ~1.3-1.5 s/image on Colab's default CPU. Any improvement in inference time would be useful (without having to downsample).
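
For reference, a minimal way to measure that per-image baseline with the existing API (a sketch; it assumes bodypix_model and a list of preloaded image arrays):

import time

def mean_seconds_per_image(bodypix_model, image_arrays):
    # Average wall-clock time per predict_single call.
    start = time.perf_counter()
    for image_array in image_arrays:
        bodypix_model.predict_single(image_array)
    return (time.perf_counter() - start) / len(image_arrays)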

I have not tried the JS version yet, but there seems to be "a large performance difference" according to tensorflow/tfjs#2197.

@de-code (Owner) commented Mar 31, 2022

Would you be happy to submit a PR to add batch support?

@de-code added the enhancement (New feature or request) label on Aug 8, 2022