
Commit

Update model_path in inference config files
valhassan committed Apr 17, 2024
1 parent cae1bf7 commit 22ecdaa
Showing 2 changed files with 2 additions and 3 deletions.
3 changes: 1 addition & 2 deletions config/inference/default_binary.yaml
@@ -3,13 +3,12 @@ inference:
 raw_data_csv: tests/inference/inference_segmentation_binary.csv
 root_dir: inferences
 input_stac_item: # alternatively, use a path or url to stac item directly
-state_dict_path: ${general.save_weights_dir}/
+model_path: ${general.save_weights_dir}/
 output_path:
 checkpoint_dir: # (string, optional): directory in which to save the object if url
 chunk_size: # if empty, will be calculated automatically from max_pix_per_mb_gpu
 # Maximum number of pixels to allow per MB of GPU RAM. E.g. if the GPU has 1000 MB of RAM and this parameter is set to
 # 10, chunk_size will be set to sqrt(1000 * 10) = 100.
 use_hanning: True # enables smoothing with Hann windows; creates overlapping tiles.
 max_pix_per_mb_gpu: 25
 prep_data_only: False
 override_model_params: False
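The `chunk_size` comment in the diff above implies a simple formula. A minimal sketch, assuming the auto-calculation is exactly `sqrt(gpu_ram_mb * max_pix_per_mb_gpu)` as the comment's own example (`sqrt(1000 * 10) = 100`) suggests; `auto_chunk_size` is a hypothetical helper name, not a function from the repository:

```python
import math

def auto_chunk_size(gpu_ram_mb: int, max_pix_per_mb_gpu: int) -> int:
    # chunk_size = sqrt(GPU RAM in MB * allowed pixels per MB of GPU RAM),
    # matching the worked example in the config comment.
    return int(math.sqrt(gpu_ram_mb * max_pix_per_mb_gpu))

print(auto_chunk_size(1000, 10))  # → 100
```

With the default `max_pix_per_mb_gpu: 25` from this file, a 1000 MB GPU would get a chunk size of `int(sqrt(25000)) = 158`.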
2 changes: 1 addition & 1 deletion config/inference/default_multiclass.yaml
@@ -3,7 +3,7 @@ inference:
 raw_data_csv: tests/inference/inference_segmentation_multiclass.csv
 root_dir: inferences
 input_stac_item: # alternatively, use a path or url to stac item directly
-state_dict_path: ${general.save_weights_dir}/
+model_path: ${general.save_weights_dir}/
 output_path:
 checkpoint_dir: # (string, optional): directory in which to save the object if url
 chunk_size: # if empty, will be calculated automatically from max_pix_per_mb_gpu
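The `${general.save_weights_dir}/` value in both files is OmegaConf/Hydra-style variable interpolation: the key is resolved against another part of the merged config at load time. A toy illustrative resolver, assuming only this dotted-key substitution behavior (not the library's actual implementation):

```python
import re

def resolve(value: str, config: dict) -> str:
    # Replace each ${dotted.key} with the value found by walking the
    # nested config dict -- a minimal sketch of interpolation.
    def lookup(match: re.Match) -> str:
        node = config
        for part in match.group(1).split("."):
            node = node[part]
        return str(node)
    return re.sub(r"\$\{([^}]+)\}", lookup, value)

cfg = {"general": {"save_weights_dir": "saved_model"}}
print(resolve("${general.save_weights_dir}/", cfg))  # → saved_model/
```

This is why the commit can point `model_path` at the weights directory without hard-coding a path: whatever `general.save_weights_dir` resolves to is substituted in.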
