I am confused by the network output in the code. For self-supervised training, do the outputs consist of one channel for disparity (inverse depth) and one channel for the variance of the depth (not the inverse depth)? When I train under that setting (predicting a normal distribution, with outputs for disparity and depth variance), pixels with large disparity (small depth) tend to get a large depth variance. This is inconsistent with the results in the paper (I would expect small depth to correspond to low uncertainty).
Hope you can help with this! Thanks a lot!
Without self-distillation and for normal distributions, the outputs consist of one channel for disparity and one channel between 0 and 1 that, when multiplied by the mean depth (obtained from the disparity predictions), gives the standard deviation of the depth distribution.
With self-distillation and for normal distributions, the outputs consist of one channel for disparity and one channel that is directly the standard deviation of the depth distribution, following the work of Poggi et al.
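A minimal sketch of the first case (no self-distillation) may make the behavior concrete. All names and the disparity-to-depth conversion constants below are assumptions for illustration, not the repository's actual code; the point is only that, because the second channel is a *fraction of the mean depth*, a fixed value of that channel yields a smaller absolute standard deviation at small depths:

```python
import numpy as np

def disp_to_depth(disp, min_depth=0.1, max_depth=100.0):
    """Convert a sigmoid disparity in (0, 1) to depth.

    This is the common Monodepth2-style rescaling; the depth range
    here is an assumption, not necessarily what this repo uses.
    """
    min_disp = 1.0 / max_depth
    max_disp = 1.0 / min_depth
    scaled_disp = min_disp + (max_disp - min_disp) * disp
    return 1.0 / scaled_disp

def depth_std(disp, sigma_channel):
    """sigma_channel is in (0, 1): the std dev is that fraction of the mean depth."""
    mean_depth = disp_to_depth(disp)
    return sigma_channel * mean_depth

# Three pixels with increasing disparity (i.e. decreasing depth) and
# the same value of the second output channel.
disp = np.array([0.05, 0.5, 0.9])
sigma = np.full(3, 0.2)

std = depth_std(disp, sigma)
# Larger disparity -> smaller mean depth -> smaller absolute std dev,
# consistent with near pixels having low depth uncertainty.
```

So the raw second channel is not itself the depth variance; comparing it directly across pixels (as in the question) can look inverted even when the resulting standard deviation behaves as the paper describes.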