Is the trained projection head available? #199
Yes, the projection head weights should also be included in gs://simclr-checkpoints/simclrv2/pretrained/...
In the GitHub README that link appears under the description "Pretrained SimCLRv2 models (with linear eval head):". I assumed "with linear eval head" refers to the classification layer for ImageNet, but after downloading the r50_1x_sk0 model from that link, the model output is of dimension 2048, which could be either the output of the ResNet-50 or the output of the projection head. So, to confirm: which of the two is it? Thanks!
Both the projection head's and the supervised linear head's weights are available in the checkpoints. I suppose you're using the hub module? If so, you can choose the output by providing one of the signatures available in the hub module.
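To make the comment above concrete: the hub module call returns a dictionary of named tensors, and you select the representation you want by key. Here is a minimal self-contained sketch of that pattern using a numpy stand-in for the module; only `final_avg_pool` is confirmed later in this thread, and the other key names and the projection dimension are assumptions for illustration.

```python
import numpy as np

def fake_simclr_module(images):
    """Stand-in for a SimCLRv2 hub module call that returns a dict of
    named outputs. Key names other than "final_avg_pool" and the 128-d
    projection size are illustrative assumptions, not confirmed API."""
    batch = images.shape[0]
    return {
        "final_avg_pool": np.zeros((batch, 2048)),   # ResNet-50 pooled features h
        "proj_head_output": np.zeros((batch, 128)),  # projection z (name/dim assumed)
        "logits_sup": np.zeros((batch, 1000)),       # supervised linear head over ImageNet classes
    }

images = np.zeros((4, 224, 224, 3), dtype=np.float32)
outputs = fake_simclr_module(images)

# Pick the output you need by key rather than taking the default output.
h = outputs["final_avg_pool"]  # representation used for linear probing
print(h.shape)                 # (4, 2048)
```

The point of the sketch is only the selection-by-key pattern; with a real checkpoint you would inspect the returned dictionary's keys to see which signatures that particular module exposes.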
I'm struggling to actually get the projection representations and am still not quite certain what to do based on the previous comments in this thread. Does anyone have a minimal working example of loading the pre-trained model, pushing an input through, and getting the representation from the projection head?
Thanks for the great repo! I wanted to follow up to ask whether this issue has been resolved. I'm also trying to access the projection representations. Specifically, I'd like to pass in an image and get out just the representation (prior to the class-level logits). Which layer should I use for this? If I load a model as follows:
The keys available when running inference on a new image are as follows:
Which of these keys is associated with the representation? final_avg_pool? Thank you for any insight, @chentingpc or others!
Hi, final_avg_pool is the output of the ResNet, which is used for linear probing. Hope that helps!
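For readers unsure what "used for linear probing" means here: a linear probe trains only a linear classifier on top of the frozen final_avg_pool features. A toy numpy sketch of that idea (dimensions shrunk from 2048, random features standing in for real ones, and closed-form least squares standing in for the usual logistic-regression fit):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy frozen features standing in for final_avg_pool outputs
# (2048-d in the real model; 32-d here), with 5 fake classes.
feats = rng.normal(size=(200, 32))
labels = rng.integers(0, 5, size=200)

# One-hot targets; fit ONLY a linear layer on the frozen features.
# Least squares is used here purely for a dependency-free sketch.
targets = np.eye(5)[labels]
w, *_ = np.linalg.lstsq(feats, targets, rcond=None)

# Predictions come from the linear layer alone; the backbone stays frozen.
preds = (feats @ w).argmax(axis=1)
```

The backbone is never updated in this protocol, which is why the quality of final_avg_pool features can be compared across pretraining methods via the probe's accuracy.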
Thank you @chentingpc! That's great to know!
I am interested in downloading a pre-trained SimCLR model with the projection head, to retrieve the latent features z upon which the contrastive loss was applied.
Are this layer and its pre-trained weights available somewhere?