Describe the bug
While trying the Quick Start Guide for model tf_2dunet, the plan initialisation step is failing.
Last few lines from the error message:
File "/home/azureuser/openfl/tests/openfl_e2e/my_workspace/src/tfbrats_inmemory.py", line 29, in __init__
X_train, y_train, X_valid, y_valid = load_from_nifti(parent_dir=data_path,
File "/home/azureuser/openfl/tests/openfl_e2e/my_workspace/src/brats_utils.py", line 94, in load_from_nifti
subdirs = os.listdir(path)
FileNotFoundError: [Errno 2] No such file or directory: '/raid/datasets/MICCAI_BraTS_2019_Data_Training/HGG/0'
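The traceback shows os.listdir failing because the numbered shard directory does not exist. As a minimal sketch (hypothetical paths, built in a temporary directory, not from the source), this is the layout the loader appears to expect: one numbered subdirectory per collaborator under HGG:

```python
import os
import tempfile

# Hypothetical illustration of the expected directory layout:
# numbered shard subdirectories (one per collaborator) under HGG/.
root = os.path.join(tempfile.mkdtemp(),
                    "MICCAI_BraTS_2019_Data_Training", "HGG")
for shard in ("0", "1"):
    os.makedirs(os.path.join(root, shard))

# This is the call that raised FileNotFoundError when shard "0" was missing.
subdirs = sorted(os.listdir(root))
print(subdirs)
```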
To Reproduce
Steps to reproduce the behavior:
Follow the steps in the Quick Start guide, replacing the model torch_cnn_mnist with tf_2dunet.
Create the workspace and certify it.
Generate a CSR for the aggregator and have the CA sign it.
Initialise the plan: fx plan initialize
At this step the error is thrown.
Expected behavior
There should be no error during plan initialisation.
Screenshots
Machine
Ubuntu 22.04
Additional
There is this README.md which mentions dataset structure for MICCAI_BraTS_2019_Data_Training.
But how to download it exactly? Is this mentioned anywhere?
fx plan initialize currently takes the first entry from data.yaml. You either need to overwrite this entry directly to point at your dataset, or you can pass the --input_shape flag if you know the expected data shape.
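For reference, the data.yaml entry being read would look something like the sketch below. The collaborator name and path are illustrative assumptions, not taken from the source, and the exact format may differ between OpenFL versions:

```yaml
# Illustrative sketch of a plan/data.yaml entry (collaborator name and
# data path are hypothetical).
collaborator1,/raid/datasets/MICCAI_BraTS_2019_Data_Training/HGG/0
```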
To gain access to the data, you originally needed to send an access request to the MICCAI BraTS challenge, but that Kaggle link actually looks like the proper data. If so, the README.md includes steps to shard the data.
Run this in a terminal inside the HGG directory to split the data across 2 collaborators; change n to the number of collaborators, as mentioned in the README.
i=0
n=2   # number of data slices (number of collaborators in the federation)
for f in *; do
  d=$((i % n))
  mkdir -p "$d"
  mv "$f" "$d"
  i=$((i + 1))
done
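After sharding, the split can be sanity-checked by counting subject directories per shard. This is a hypothetical helper (demonstrated on a throwaway tree, not the real dataset), assuming shards are named 0..n-1:

```python
import os
import tempfile

def count_per_shard(hgg_dir, n=2):
    """Hypothetical helper: count entries in each numbered shard directory."""
    return {str(s): len(os.listdir(os.path.join(hgg_dir, str(s))))
            for s in range(n)}

# Demo on a throwaway tree: 5 subjects split round-robin across 2 shards,
# mirroring what the loop above does.
hgg = tempfile.mkdtemp()
for i in range(5):
    os.makedirs(os.path.join(hgg, str(i % 2), f"subject_{i}"))

counts = count_per_shard(hgg, n=2)
print(counts)
```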
Check the result by running tree -L 1 in /raid/datasets/MICCAI_BraTS_2019_Data_Training/HGG.
INFO Creating Initial Weights File 🠆 save/tf_2dunet_brats_init.pbuf plan.py:195
INFO FL-Plan hash is 196b877a93866735ca18687a2d1f94ad6dca8a3f0de541f84ca267ccc5fd63be00dd488102c0540c0b4efb434653b2c0 plan.py:287
INFO ['plan_196b877a'] plan.py:222
✔️ OK
For the error mentioned below, I have a fix in #1178.
File "<__array_function__ internals>", line 200, in concatenate
ValueError: need at least one array to concatenate
For practice purposes, I found this link with the dataset: https://www.kaggle.com/datasets/aryashah2k/brain-tumor-segmentation-brats-2019 but it contains many subfolders, as opposed to the expected 0 and 1.