I want to copy data from PyTorch into a Taichi field. I understand the best way to do this is with the .from_torch() method; however, I don't believe this method supports slicing. E.g., if I have a field of shape (A, B, C) and I want to copy a tensor of shape (A, B) into one of the C available slots, I cannot use something like field[:, :, 0].from_torch(tensor). Instead I'm copying manually as below; however, this results in the following unexpected warning. Why?
[Taichi] version 1.7.0, llvm 15.0.4, commit 2fd24490, linux, python 3.10.13
[Taichi] Starting on arch=cuda
/home/qpd4588/miniconda3/envs/tds/lib/python3.10/site-packages/taichi/lang/kernel_impl.py:763: UserWarning: The .grad attribute of a Tensor that is not a leaf Tensor is being accessed. Its .grad attribute won't be populated during autograd.backward(). If you indeed want the .grad field to be populated for a non-leaf Tensor, use .retain_grad() on the non-leaf Tensor. If you access the non-leaf Tensor by mistake, make sure you access the leaf Tensor instead. See github.com/pytorch/pytorch/pull/30531 for more informations. (Triggered internally at /opt/conda/conda-bld/pytorch_1708025847130/work/build/aten/src/ATen/core/TensorBody.h:489.)
if v.requires_grad and v.grad is None:
I believe this happens because Taichi's kernel wrapper accesses the tensor's .grad attribute (the check v.requires_grad and v.grad is None in kernel_impl.py above). PyTorch emits this warning whenever .grad of a non-leaf tensor is accessed while it is still unset. This is not ideal, but the warning can be avoided by initializing .grad to zeros before passing the tensor to the kernel.
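A minimal torch-only sketch of the workaround (no Taichi needed to see the behavior; names are illustrative):

```python
import torch

x = torch.rand(4, 5, requires_grad=True)
y = x * 2.0  # y is a non-leaf tensor; its .grad starts out unset

# Reading y.grad while it is unset (as the kernel wrapper's check does)
# triggers the UserWarning. Assigning a zero gradient first makes the
# .grad defined, so subsequent accesses are silent.
y.grad = torch.zeros_like(y)
```

After this, y.grad reads back as the zero tensor instead of None, and the check in kernel_impl.py no longer hits the warning path.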