This repository contains a semantic segmentation project on the Airbus Ship Detection dataset.
- Task: Semantic Segmentation
- Data: Link
- Neural Network: U-Net-like architecture
- Scoring function: Dice score (F1 score)
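The Dice score used for evaluation can be sketched in a few lines. This is a minimal NumPy version for illustration; the function name `dice_score` and the smoothing term are assumptions, not this repo's exact implementation in `metrics.py`:

```python
import numpy as np

def dice_score(y_true: np.ndarray, y_pred: np.ndarray, smooth: float = 1.0) -> float:
    """Dice coefficient (equivalent to F1) for binary masks.

    Both inputs are 0/1 arrays of the same shape; `smooth` avoids
    division by zero when both masks are empty.
    """
    y_true = y_true.astype(bool)
    y_pred = y_pred.astype(bool)
    intersection = np.logical_and(y_true, y_pred).sum()
    return (2.0 * intersection + smooth) / (y_true.sum() + y_pred.sum() + smooth)
```

A perfect prediction scores 1.0; a fully wrong one approaches 0.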
- Download data
  - Download Kaggle Competition data
  - Download mask images (to save time): Link
- Prepare folder structure
  - Create `data` folder and extract Kaggle data into it
  - Create `masks_v2` folder inside the `data` folder and extract mask images into it
- Run `eda.ipynb` to create the training dataframe
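The `masks_v2` folder holds RLE-decoded masks. For reference, Kaggle's ship-detection masks are run-length encoded as space-separated (start, length) pairs over the image flattened in column-major order, 1-indexed. A minimal decoder sketch (the function name and signature are illustrative, not the repo's own helper):

```python
import numpy as np

def rle_decode(rle: str, shape=(768, 768)) -> np.ndarray:
    """Decode a Kaggle-style run-length string into a binary mask.

    `rle` lists 1-indexed (start, length) pairs over the image
    flattened in column-major (Fortran) order; empty string -> empty mask.
    """
    mask = np.zeros(shape[0] * shape[1], dtype=np.uint8)
    if rle and rle.strip():
        nums = np.array(rle.split(), dtype=int)
        starts, lengths = nums[0::2] - 1, nums[1::2]
        for s, length in zip(starts, lengths):
            mask[s:s + length] = 1
    return mask.reshape(shape, order="F")
```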
```
python train.py --backbone=mobilenetv2
```

- `--backbone` - choose the U-Net encoder backbone

Run `python train.py -h` for more details.
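The `--backbone` flag can be handled with a plain `argparse` parser. This is a hypothetical sketch of the CLI surface; the real `train.py` may accept more flags, different backbone choices, and different defaults:

```python
import argparse

def build_parser() -> argparse.ArgumentParser:
    # Hypothetical sketch: backbone choices here are assumptions.
    parser = argparse.ArgumentParser(description="Train a U-Net segmentation model")
    parser.add_argument(
        "--backbone",
        default="mobilenetv2",
        choices=["mobilenetv2", "resnet34", "efficientnetb0"],
        help="U-Net encoder backbone",
    )
    return parser
```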
```
python inference.py --download --visualize_inference
```

- `--download` - download the pre-trained model
- `--compare_to_gt` - compare predictions to ground truth
- `--visualize_inference` - visualize 10 inference results
- `--predict_all` - predict on all test images (takes a long time!)
- `--show_submission` - visualize random predictions from the RLE-encoded `submission.csv`

Run `python inference.py -h` for more details.
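Predictions written to `submission.csv` go back through run-length encoding. A minimal encoder sketch matching the Kaggle convention (column-major, 1-indexed start/length pairs); the function name is illustrative, not the repo's own helper:

```python
import numpy as np

def rle_encode(mask: np.ndarray) -> str:
    """Encode a binary mask as Kaggle-style RLE (column-major, 1-indexed)."""
    pixels = mask.flatten(order="F").astype(np.uint8)
    # Pad with zeros so every run has a detectable start and end.
    padded = np.concatenate([[0], pixels, [0]])
    changes = np.where(padded[1:] != padded[:-1])[0] + 1
    starts, ends = changes[0::2], changes[1::2]
    return " ".join(f"{s} {e - s}" for s, e in zip(starts, ends))
```

An all-zero mask encodes to an empty string, which is how empty predictions appear in the submission file.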
Trained for 30 epochs on 5k images (4k training, 1k validation) with batch size 10, using basic image augmentation to aid training. Metric: F1 score; loss function: binary cross-entropy.
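The "basic image augmentation" can be sketched as random flips applied jointly to an image and its mask, so the two stay aligned. This is an illustrative example, not necessarily the exact transforms used in `data_prep.py`:

```python
import numpy as np

def augment(image: np.ndarray, mask: np.ndarray, rng: np.random.Generator):
    """Apply the same random horizontal/vertical flips to image and mask."""
    if rng.random() < 0.5:  # horizontal flip
        image, mask = image[:, ::-1], mask[:, ::-1]
    if rng.random() < 0.5:  # vertical flip
        image, mask = image[::-1, :], mask[::-1, :]
    return image, mask
```

Flips only reorder pixels, so pixel sums and mask areas are preserved.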
├───data # Data folder
│ └─── masks_v2 # RLE decoded masks
├───EDA
│ └───eda.ipynb # EDA and initial data prep
├───train.py # Define model
├───inference.py # Model inference
├───results # Store model execution results
│ ├───inference # Model inference results
│ ├───validation # Model validation data predictions
│ └───submission # Images from predicted RLE
├───checkpoints # Folder with best model checkpoints
├───constants.py # Declare variables
├───data_prep.py # Create data generators and augment pictures
├───helper_funcs.py # Helper functions
├───metrics.py # Metric definitions
├───.gitignore
└───requirements.txt
Used it to run all model training.
The project was very challenging, as I had zero previous experience with neural networks and Keras, but it was very fun nonetheless.