This repository contains my code for Kaggle competitions.
7th Place Solution for Bristol-Myers Squibb – Molecular Translation
Team: Mr_KnowNothing, Shivam Gupta, Phaedrus, Nischay Dhankhar, atfujita
- All models (Team)
Public LB: 0.60(7th)
Private LB: 0.60(7th)
The full picture of our solution is here.
Note: This repository contains only my models and only the training scripts.
- My models (3-model average)
Public LB: 0.66
Private LB: 0.66
Note: This repository is based on hengck23's great assets. Please check here for details.
- Encoder: vit_deit_base_distilled_patch16_384
- Decoder: TransformerDecoder
- Loss: LabelSmoothingLoss
- Augmentation: RandomScale, Cutout
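The LabelSmoothingLoss listed above can be illustrated as follows. This is a minimal pure-Python sketch of the idea, not the competition code; the function name `label_smoothing_nll` and the uniform-smoothing formulation are my own assumptions:

```python
import math

def label_smoothing_nll(log_probs, target, n_classes, smoothing=0.1):
    """NLL with label smoothing: the target distribution puts
    1 - smoothing on the true class and spreads `smoothing`
    uniformly over the remaining classes."""
    confidence = 1.0 - smoothing
    off_value = smoothing / (n_classes - 1)
    loss = 0.0
    for c in range(n_classes):
        weight = confidence if c == target else off_value
        loss -= weight * log_probs[c]
    return loss

# Example: 3 classes, model fairly confident in the true class 0.
probs = [0.7, 0.2, 0.1]
log_probs = [math.log(p) for p in probs]
loss = label_smoothing_nll(log_probs, target=0, n_classes=3, smoothing=0.1)
```

Because some probability mass is assigned to the wrong classes, the smoothed loss is always a bit larger than the plain NLL `-log(0.7)`, which discourages over-confident predictions on long InChI sequences.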
There are two ViT-based models.
The second was retrained with stronger noise injection and augmentation.
ViT model 1
- Public LB: 0.77 (With Normalize)
- Private LB: 0.78 (With Normalize)
ViT model 2
- Public LB: 0.76 (With Normalize)
- Private LB: 0.77 (With Normalize)
- Encoder: swin_base_patch4_window12_384_in22k
- Decoder: TransformerDecoder
- Loss: LabelSmoothingLoss
- Augmentation: RandomScale, Cutout
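The Cutout augmentation used by both model families masks out a random square patch of the input image. A minimal sketch, assuming a 2D list-of-lists grayscale image; the `cutout` helper is hypothetical, not the competition code:

```python
import random

def cutout(img, size=2, fill=0, rng=None):
    """Overwrite a random size x size patch with `fill`.
    Returns a new image; the input is left unchanged."""
    rng = rng or random.Random()
    h, w = len(img), len(img[0])
    top = rng.randrange(0, h - size + 1)
    left = rng.randrange(0, w - size + 1)
    out = [row[:] for row in img]  # copy so the original stays intact
    for y in range(top, top + size):
        for x in range(left, left + size):
            out[y][x] = fill
    return out

img = [[1] * 6 for _ in range(6)]
aug = cutout(img, size=3, rng=random.Random(0))
```

Masking patches forces the decoder to rely on the surrounding molecular structure rather than any single region of the drawing.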
Swin model
- Public LB: 0.91 (With Normalize)
- Private LB: 0.92 (With Normalize)