How to pass custom transformer to a sagemaker model/endpoint? #3628
Unanswered
georgepglv asked this question in Q&A
Replies: 2 comments
-
I'm not very familiar with this use case but may be able to help with more info. First, do you have any updates on your end as far as what you've tried? Also could you provide the full debug logs (with any sensitive info redacted) by adding
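For reference, one way to capture full debug logs with boto3 (a sketch under the assumption that this is the kind of log being asked for; `set_stream_logger` is a standard boto3 helper):

```python
import logging

import boto3

# Emit wire-level debug logs from boto3/botocore to stderr so the failing
# SageMaker API calls can be inspected (redact credentials, account IDs and
# other sensitive values before sharing).
boto3.set_stream_logger(name="boto3", level=logging.DEBUG)
boto3.set_stream_logger(name="botocore", level=logging.DEBUG)

sm = boto3.client("sagemaker")
# ...reproduce the failing endpoint creation calls here...
```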
-
Hello! Reopening this discussion to make it searchable.
-
Hi.
I have a custom transformer class that I used to train my model locally: it simply groups categories with small frequencies into an "other" value.
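A simplified sketch of what such a transformer could look like (the class name `RareCategoryGrouper` and the `min_freq` parameter are illustrative, not the actual code):

```python
import pandas as pd
from sklearn.base import BaseEstimator, TransformerMixin


class RareCategoryGrouper(BaseEstimator, TransformerMixin):
    """Replace categories whose relative frequency is below `min_freq`
    with a single "other" value (illustrative sketch)."""

    def __init__(self, min_freq=0.01):
        self.min_freq = min_freq

    def fit(self, X, y=None):
        X = pd.DataFrame(X)
        # Remember, per column, which categories are frequent enough to keep.
        self.frequent_ = {
            col: set(
                X[col].value_counts(normalize=True)
                      .loc[lambda s: s >= self.min_freq]
                      .index
            )
            for col in X.columns
        }
        return self

    def transform(self, X):
        X = pd.DataFrame(X).copy()
        for col, keep in self.frequent_.items():
            # Anything not in the "frequent" set gets collapsed into "other".
            X[col] = X[col].where(X[col].isin(keep), "other")
        return X
```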
After I pickle my model and pipeline from the sklearn GridSearch, I have a model.tar.gz in my S3 bucket containing model.joblib, pipeline.joblib and inference.py.
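Roughly, the packaging step could look like this (the function name, object names and S3 key are placeholders; the file names match the ones listed above):

```python
import tarfile

import boto3
import joblib


def package_and_upload(model, pipeline, bucket, key="models/model.tar.gz"):
    """Dump the fitted model and preprocessing pipeline with joblib, bundle
    them with the inference script into model.tar.gz, and upload it to S3."""
    joblib.dump(model, "model.joblib")
    joblib.dump(pipeline, "pipeline.joblib")

    with tarfile.open("model.tar.gz", "w:gz") as tar:
        tar.add("model.joblib")
        tar.add("pipeline.joblib")
        tar.add("inference.py")   # entry-point script for the container
        # tar.add("classes.py")   # module defining the custom transformer

    boto3.client("s3").upload_file("model.tar.gz", bucket, key)
    return f"s3://{bucket}/{key}"
```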
If I open my pipeline locally and try to transform my data, it works fine.
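For example, a load like the following succeeds locally only because the class definition is importable there (column names and values are illustrative):

```python
import joblib
import pandas as pd

# This works locally because the module that defines the custom transformer
# is importable in this session; joblib/pickle only stores a *reference* to
# the class, not its code, so the same module must also be importable inside
# the SageMaker container when the pipeline is unpickled there.
pipeline = joblib.load("pipeline.joblib")
sample = pd.DataFrame({"some_category_column": ["a", "b", "rare_value"]})
print(pipeline.transform(sample))
```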
Then, to deploy my model, I follow the steps described here.
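An approximate sketch of that boto3 deployment sequence (image URI, role ARN, resource names and instance type are placeholders; the `SAGEMAKER_*` environment variables are how the prebuilt scikit-learn container is typically pointed at inference.py):

```python
import boto3

sm = boto3.client("sagemaker")

model_data = "s3://my-bucket/models/model.tar.gz"          # placeholder
image_uri = "<prebuilt sklearn inference image URI>"        # placeholder
role_arn = "arn:aws:iam::123456789012:role/SageMakerRole"   # placeholder

sm.create_model(
    ModelName="custom-transformer-model",
    ExecutionRoleArn=role_arn,
    PrimaryContainer={
        "Image": image_uri,
        "ModelDataUrl": model_data,
        "Environment": {
            # Tell the prebuilt container which script to run at inference time.
            "SAGEMAKER_PROGRAM": "inference.py",
            "SAGEMAKER_SUBMIT_DIRECTORY": model_data,
        },
    },
)

sm.create_endpoint_config(
    EndpointConfigName="custom-transformer-config",
    ProductionVariants=[{
        "VariantName": "AllTraffic",
        "ModelName": "custom-transformer-model",
        "InstanceType": "ml.m5.large",
        "InitialInstanceCount": 1,
    }],
)

sm.create_endpoint(
    EndpointName="custom-transformer-endpoint",
    EndpointConfigName="custom-transformer-config",
)
```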
Everything gets created fine except the endpoint.
In the CloudWatch logs, I get:
I have found similar issues here and here, but they use a different framework that can pass dependencies to SageMaker.
How can I achieve the same using boto3? I have tried adding my classes.py to the model artifacts but it did not work.
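One pattern that may make the unpickling work, assuming the prebuilt scikit-learn serving container and its `model_fn` convention: ship classes.py inside model.tar.gz and put the extracted model directory on `sys.path` before loading, e.g.:

```python
# inference.py (sketch) -- lives inside model.tar.gz next to classes.py
import os
import sys

import joblib


def model_fn(model_dir):
    """Called by the serving container to load the model.

    model_dir is where model.tar.gz has been extracted, so putting it on
    sys.path lets pickle resolve the custom transformer class defined in
    classes.py when pipeline.joblib and model.joblib are loaded.
    """
    sys.path.insert(0, model_dir)
    import classes  # noqa: F401  -- makes the custom transformer importable

    pipeline = joblib.load(os.path.join(model_dir, "pipeline.joblib"))
    model = joblib.load(os.path.join(model_dir, "model.joblib"))
    return {"pipeline": pipeline, "model": model}
```

Note that this only resolves if the pipeline was pickled with the transformer imported from `classes` (e.g. `from classes import ...`); if the class was defined directly in the training notebook, pickle records it under `__main__` and the endpoint cannot import it. The SageMaker Python SDK hides this via its `source_dir`/`dependencies` arguments, whereas with plain boto3 the module has to travel inside the archive that `ModelDataUrl` points to.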