In torch, there is no way to know if a second derivative call might be executed by the user (unlike for the first derivative, where `requires_grad` can be checked).
As a result, the current API requires the user to specify at class initialization whether second derivatives will be used. This is pretty much useless in practice, for two reasons:
- if the model is torchscripted, the class can't be re-initialized, so one is stuck with first derivatives only
- the model can fail silently if second derivatives are requested but were not enabled at initialization (see the sketch after this list)
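To make the limitation concrete, here is a minimal Python sketch of the pattern (class and argument names are hypothetical, not the real API; the flag is passed to `forward` here for brevity, whereas in the actual API it is fixed at class initialization): the forward pass can inspect `requires_grad` to know whether first derivatives may be requested, but there is no analogous signal for second derivatives, so the decision has to be baked in ahead of time.

```python
import torch

class SphHarmSketch(torch.autograd.Function):
    # Hypothetical illustration, not the real implementation.
    @staticmethod
    def forward(ctx, xyz, second_derivatives):
        values = xyz.clone()  # placeholder for the real spherical harmonics
        if xyz.requires_grad:
            # we *can* detect that first derivatives may be requested
            ctx.save_for_backward(xyz)
        # there is no equivalent check for second derivatives: the caller
        # has to promise up front whether they will be needed
        ctx.second_derivatives = second_derivatives
        return values

    @staticmethod
    def backward(ctx, grad_out):
        (xyz,) = ctx.saved_tensors
        if not ctx.second_derivatives:
            # nothing here makes double backward fail loudly: if the user
            # calls backward twice anyway, they silently get wrong results
            pass
        return grad_out, None  # placeholder gradient w.r.t. xyz, none for the flag
```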
The only way I see to make this feature usable is to calculate the second derivatives on the fly when they are needed. This will recompute the values and first derivatives of the spherical harmonics, but the current approach, which avoids the recomputation, is unsustainable in practice. We should also find a way to mark the second-derivative function as non-differentiable to avoid, once again, silent failures if people try to differentiate three or more times. Something similar to `@once_differentiable` (https://discuss.pytorch.org/t/what-does-the-function-wrapper-once-differentiable-do/31513), but for C++ torch.
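One possible shape for this, sketched in Python with hypothetical names (the real implementation would live in C++ torch, which is what the `@once_differentiable` question is about): implement the first backward as its own custom Function so that double backward re-enters autograd, recompute the values and first derivatives on the fly inside it, and decorate its backward with `@once_differentiable` so that a third differentiation raises instead of failing silently.

```python
import torch
from torch.autograd.function import once_differentiable

class _SphForward(torch.autograd.Function):
    # Hypothetical sketch: placeholders stand in for the real computations.
    @staticmethod
    def forward(ctx, xyz):
        ctx.save_for_backward(xyz)
        return xyz.clone()  # placeholder for the spherical harmonics values

    @staticmethod
    def backward(ctx, grad_out):
        (xyz,) = ctx.saved_tensors
        # delegate to a second Function so the graph supports double backward
        return _SphBackward.apply(xyz, grad_out)

class _SphBackward(torch.autograd.Function):
    @staticmethod
    def forward(ctx, xyz, grad_out):
        ctx.save_for_backward(xyz, grad_out)
        # placeholder for grad_out contracted with d(values)/d(xyz);
        # values and first derivatives would be recomputed here on the fly
        return grad_out.clone()

    @staticmethod
    @once_differentiable  # third and higher derivatives now raise an error
    def backward(ctx, grad_grad):
        xyz, grad_out = ctx.saved_tensors
        # placeholder second-derivative contraction (the placeholder output is
        # just grad_out, so its gradient is zero w.r.t. xyz and grad_grad w.r.t. grad_out)
        return torch.zeros_like(xyz), grad_grad
```

With this structure, `torch.autograd.grad(values, xyz, create_graph=True)` would work without any initialization-time flag, at the cost of redoing the forward-pass work inside the backward, and a third differentiation would fail with an explicit error rather than silently.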