[QST] Help w/ exporting Retrieval Model. #1092
Comments
@Tmoradi can you please use
and test again? If you can give us a toy repro example, we can debug better.
Thanks for your reply! I'll get you the toy repro example asap.
Hi @rnyak, sorry for the late reply. Here's what I've done since the last update. I made a new notebook on Vertex AI that uses the container you mentioned, and I ran into the same issues when it came to exporting the query tower. I wasn't sure this would work, but in the Workflow I tried to add
I also created a toy dataset (a 10k sample of V5) and a notebook that goes through the workflow and the different errors I get. Link to repo
@rnyak hello, hope you are doing well. I appreciate that you may be busy, but would it be possible to give an ETA for when you'll be able to help? In the meantime, I am going to try PyTorch/PyTorch Lightning instead to see if that helps.
❓ Questions & Help
Details
My current setup is a Vertex AI Workbench instance with a Tesla T4 GPU, 120 GB RAM, and 32 vCPUs, using the
nvcr.io/nvidia/merlin/merlin-tensorflow:nightly
container image. I am currently working on a retrieval-based model and, depending on the version of the dataset I use, I cannot export the query tower.
The different datasets that I've been using are as follows:
- v3: various continuous and categorical features for both users and the items we want to recommend.
- v4: features of v3 + item embeddings generated from a sentence transformer.
- v5: features of v3 + item embeddings generated from a sentence transformer + multi-hot encoding of user history as implicit feedback.
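(Not part of the original report, but to make the "multi-hot encoding of user history" idea in v5 concrete, here is a minimal pure-Python sketch; the item vocabulary and history below are made-up illustrations, not the actual dataset.)

```python
# Toy illustration of multi-hot encoding a user's item history as implicit
# feedback. The vocabulary and history are hypothetical examples.
item_vocab = ["item_a", "item_b", "item_c", "item_d"]
item_to_idx = {item: i for i, item in enumerate(item_vocab)}

def multi_hot(history):
    """Return a 0/1 vector with a 1 at the index of every item the user interacted with."""
    vec = [0] * len(item_vocab)
    for item in history:
        vec[item_to_idx[item]] = 1
    return vec

print(multi_hot(["item_b", "item_d"]))  # -> [0, 1, 0, 1]
```

In a real pipeline this column would be produced by the workflow rather than by hand, but the resulting feature has this shape: one binary slot per vocabulary item.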
For the multi-hot user-history column I used `Categorify() >> TagAsUserFeatures()` in the workflow. I'm not sure if this is the cause of the issues, but I wanted to include it just in case. So when training a model with either v4 or v5 and saving the query encoder, I get the following message:
```python
query_tower = model.query_encoder
query_tower.save(...)
```
When I load the query_tower and then try to turn the model into a top-k model, I get another error:
```python
query_tower = tf.keras.models.load_model(".../query_tower_based_on_v5", compile=False)
model = mm.TwoTowerModelV2(query_tower, candidate)
model.compile()
candidate_features = unique_rows_by_features(train, Tags.ITEM, Tags.ITEM_ID)
topk_model = model.to_top_k_encoder(candidate_features, k=10, batch_size=1)
```
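(Added for context, not the author's code: as I understand it, `to_top_k_encoder` attaches the pre-computed candidate embeddings to the query tower so a forward pass returns the k highest-scoring items. A pure-Python sketch of that dot-product top-k step, with made-up embeddings and item ids:)

```python
# Hypothetical stand-in for dot-product top-k retrieval: score one query
# embedding against every candidate embedding and keep the k best item ids.
def top_k(query, candidates, k):
    """candidates: dict mapping item id -> embedding (same length as query)."""
    scores = {
        item: sum(q * c for q, c in zip(query, emb))
        for item, emb in candidates.items()
    }
    return sorted(scores, key=scores.get, reverse=True)[:k]

candidates = {"a": [1.0, 0.0], "b": [0.0, 1.0], "c": [0.7, 0.7]}
print(top_k([1.0, 0.2], candidates, k=2))  # -> ['a', 'c']
```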
Sorry for cutting the error message off, but it refers to the encoder for the query tower.
Sorry for keeping the data private, but it's not public.
I hope I was able to convey the issue; any suggestions would be appreciated!