Feature Request: Add support for embedding layers #646
Comments
Hi @dopc, could you share the error you are getting? Some parts of the transformer architecture might be missing (e.g. softmax); they will be supported soon. For now, you can check the LLM use case, which uses GPT2: https://github.com/zama-ai/concrete-ml/tree/main/use_case_examples/llm
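For context, the basic Concrete ML compilation flow for a PyTorch module looks roughly like the sketch below. This is a minimal, illustrative example, not taken from the thread: it assumes a tiny model built only from currently supported ops (linear layers, ReLU), and the exact inference call on the compiled module may differ between Concrete ML versions.

```python
# Minimal sketch: quantizing and compiling a small PyTorch model with
# Concrete ML. The model deliberately avoids embedding and softmax
# layers, which are the parts discussed above as not yet supported.
import numpy as np
import torch
from concrete.ml.torch.compile import compile_torch_model

class TinyClassifier(torch.nn.Module):
    def __init__(self, in_features=16, n_classes=2):
        super().__init__()
        self.fc1 = torch.nn.Linear(in_features, 32)
        self.fc2 = torch.nn.Linear(32, n_classes)

    def forward(self, x):
        return self.fc2(torch.relu(self.fc1(x)))

model = TinyClassifier()

# Representative calibration inputs used for quantization.
inputset = np.random.randn(100, 16).astype(np.float32)

# Quantize and compile to an FHE circuit (low n_bits keeps it fast).
quantized_module = compile_torch_model(model, inputset, n_bits=3)

# Quantized inference; depending on the Concrete ML version, an extra
# argument selects clear, simulated, or actual FHE execution.
y = quantized_module.forward(np.random.randn(1, 16).astype(np.float32))
print(y)
```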
Hey @jfrery, thanks for your quick reply. Good to see that the GPT2 model works! I will look at it. Here is my error trace:
Hey, my bad, I missed your answer. Concrete ML doesn't support embedding layers yet; we will support them very soon. If you don't mind, we can convert your issue to a feature request to make sure embeddings get supported ASAP.
Thanks for the answer and for converting the issue to a feature request.
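While embedding layers are not yet supported, one possible interim workaround is to run the embedding lookup in the clear (the token ids are known on the client side anyway) and compile only the layers that come after it. The sketch below is a hypothetical workaround, not an official Concrete ML recipe; it assumes the post-embedding head contains only supported ops, and the `compile_torch_model` arguments may vary by version.

```python
# Hypothetical workaround sketch: keep nn.Embedding outside the FHE
# circuit and compile only a small head that starts from dense floats.
import numpy as np
import torch
from concrete.ml.torch.compile import compile_torch_model

vocab_size, embed_dim, n_classes = 1000, 32, 2

# The embedding lookup stays in the clear.
embedding = torch.nn.Embedding(vocab_size, embed_dim)

class Head(torch.nn.Module):
    """Small classifier that only sees dense embedding vectors."""
    def __init__(self):
        super().__init__()
        self.fc = torch.nn.Linear(embed_dim, n_classes)

    def forward(self, x):
        return self.fc(torch.relu(x))

head = Head()

# Calibration data: embeddings of random token ids, mean-pooled over the
# sequence so the head receives a fixed-size vector.
with torch.no_grad():
    ids = torch.randint(0, vocab_size, (100, 8))
    calib = embedding(ids).mean(dim=1).numpy()

quantized_head = compile_torch_model(head, calib, n_bits=3)

# Inference: clear embedding + pooling, then the compiled head.
with torch.no_grad():
    x = embedding(torch.randint(0, vocab_size, (1, 8))).mean(dim=1).numpy()
print(quantized_head.forward(x))
```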
Hey,
I already created an issue on Hugging Face; one of the issues can be closed if you need to.
I want to use concrete-ml for Transformer models such as BERT. Do you have any resources to look at or advice you could give for this?
I already tried the DistilBERT NER (CoNLL-2003) setup by duplicating and modifying this model, but I have not succeeded yet.
Thanks,
Best.
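One pattern that may help while full transformer support (embeddings, softmax) is missing: keep the pretrained DistilBERT backbone in the clear as a frozen feature extractor and train and compile only a small Concrete ML classifier on its pooled features. The sketch below is illustrative rather than the project's prescribed approach; it covers sequence-level classification rather than token-level NER, assumes the `transformers` package is installed, and the model name, toy data, and pooling choice are placeholders.

```python
# Sketch: DistilBERT as a clear-text feature extractor, with only a
# small Concrete ML linear classifier compiled to FHE.
import numpy as np
import torch
from transformers import AutoModel, AutoTokenizer
from concrete.ml.sklearn import LogisticRegression

tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")
backbone = AutoModel.from_pretrained("distilbert-base-uncased")

def embed(texts):
    """Mean-pooled DistilBERT features, computed in the clear."""
    with torch.no_grad():
        batch = tokenizer(texts, padding=True, truncation=True,
                          return_tensors="pt")
        hidden = backbone(**batch).last_hidden_state
    return hidden.mean(dim=1).numpy()

# Toy labelled data; a real setup would use an actual dataset.
X = embed(["an example sentence", "another example sentence"])
y = np.array([0, 1])

# Only this classifier is quantized and compiled for FHE.
clf = LogisticRegression(n_bits=8)
clf.fit(X, y)
clf.compile(X)

# Clear-text prediction shown here; the keyword that enables actual FHE
# execution differs between Concrete ML versions.
print(clf.predict(X))
```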