Runtime tensor error when trying to convert cpu model to tflite #5749

Open
itzjac opened this issue Nov 21, 2024 · 1 comment
Labels: os:windows, platform:python, task:LLM inference, type:support

Comments


itzjac commented Nov 21, 2024

Have I written custom code (as opposed to using a stock example script provided in MediaPipe)

None

OS Platform and Distribution

WSL2

MediaPipe Tasks SDK version

0.10.18

Task name (e.g. Image classification, Gesture recognition etc.)

convert model

Programming Language and version (e.g. C++, Python, Java)

Python

Describe the actual behavior

Runtime error when generating the CPU model

Describe the expected behaviour

Convert the model and generate a .tflite file

Standalone code/steps you may have used to try to get what you need

Using the provided LLM inference example found on GitHub (text-to-text); a sketch of the conversion call is included below.
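For reference, a minimal sketch of the conversion step from the LLM inference example, assuming the standard MediaPipe genai converter API; the checkpoint path, model type, tokenizer path, and output paths are placeholders for whatever the example script actually uses. The only difference between the working and failing runs is the `backend` value.

```python
from mediapipe.tasks.python.genai import converter

# Placeholder paths and model type; real values come from the LLM inference example.
config = converter.ConversionConfig(
    input_ckpt='/path/to/checkpoint',            # source model checkpoint
    ckpt_format='safetensors',                   # checkpoint format
    model_type='GEMMA_2B',                       # model architecture used in the example
    backend='cpu',                               # 'gpu' completes; 'cpu' hits the RET_CHECK failure
    output_dir='/tmp/intermediate',              # scratch directory for converted weights
    combine_file_only=False,
    vocab_model_file='/path/to/tokenizer.model', # tokenizer / vocab file
    output_tflite_file='/tmp/model_cpu.tflite',  # expected .tflite output
)
converter.convert_checkpoint(config)
```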

Other info / Complete Logs

Running the conversion with the GPU backend works and the model loads on device (though it is very slow). The CPU backend stops the process with the runtime error:

RuntimeError: INTERNAL: ; RET_CHECK failure (external/odml/odml/infra/genai/inference/utils/xnn_utils/model_ckpt_util.cc:116) tensor

I tried different Ubuntu versions; both generated the same runtime error for the CPU backend and worked fine with the GPU backend.


itzjac (Author) commented Nov 21, 2024

The WSL2 instance has a default installation of Ubuntu 24:

No LSB modules are available.
Distributor ID: Ubuntu
Description:    Ubuntu 24.04.1 LTS
Release:        24.04
Codename:       noble

kuaashish added the platform:python, task:LLM inference, os:windows, and type:support labels on Nov 22, 2024