PyTorch - autograd - One of the differentiated Tensors appears to not have been used in the graph

墨承泽
2023-12-01

References

Example for One of the differentiated Tensors appears to not have been used in the graph - #3 by Sudarshan_VB - autograd - PyTorch Forums

allow_unused=True - Zhihu

pytorch pitfall notes - kdh's column - CSDN Blog

python - Pytorch gradient error: nonetype unsupported operand type(s) for +: 'NoneType' and 'NoneType' - Stack Overflow

On combining DiffPool and MAML | Zhimeng's Personal Website | Welcome!

Problem description

This error is raised when computing gradients with respect to a tensor that the output does not actually depend on: if the input and the output are not connected in the computation graph, there is no way to differentiate one with respect to the other.

For example, given the two computation paths x -> y -> z and y -> m, you can compute the gradient of z with respect to x, but not the gradient of z with respect to m, because m is never used to compute z.
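A minimal sketch of this situation (the tensor names x, y, z, m mirror the two paths above; the actual operations are arbitrary):

```python
import torch

x = torch.rand(3, requires_grad=True)
y = 2 * x
z = y.sum()   # path x -> y -> z
m = 3 * y     # path y -> m; note that z does not depend on m

# Gradient of z w.r.t. x exists: dz/dx = 2 everywhere.
gx, = torch.autograd.grad(z, x, retain_graph=True)
print(gx)  # tensor([2., 2., 2.])

# Gradient of z w.r.t. m does not: m was never used to compute z.
try:
    torch.autograd.grad(z, m, retain_graph=True)
except RuntimeError as e:
    print(e)  # One of the differentiated Tensors appears to not have been used in the graph ...
```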

Example

import torch

a = torch.rand(10, requires_grad=True)
b = torch.rand(10, requires_grad=True)

output = (2 * a).sum()

# RuntimeError: One of the differentiated Tensors appears to not have been used in the graph
torch.autograd.grad(output, (a, b))

As the PyTorch forum answer referenced above puts it: if b is not in the graph, then its derivative is just 0 everywhere, and you don't need to add it to the graph to get the derivatives.

Here, output is not a function of b.

Solution

A common suggestion online is to set allow_unused=True. This usually does not solve the real problem: the gradient returned for the unused tensor is None, which raises a new error as soon as you try to use it, such as:

TypeError: unsupported operand type(s) for *: 'float' and 'NoneType'
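A short sketch of how allow_unused=True merely moves the failure (same a, b, and output as the example above):

```python
import torch

a = torch.rand(10, requires_grad=True)
b = torch.rand(10, requires_grad=True)
output = (2 * a).sum()

# allow_unused=True suppresses the RuntimeError, but the gradient
# returned for the unused tensor b is None ...
ga, gb = torch.autograd.grad(output, (a, b), allow_unused=True)
print(gb)  # None

# ... which triggers the TypeError above the moment it is used in arithmetic:
try:
    0.1 * gb
except TypeError as e:
    print(e)  # unsupported operand type(s) for *: 'float' and 'NoneType'
```

If a zero gradient really is the intended behavior, you can replace the None yourself, e.g. gb = torch.zeros_like(b) if gb is None else gb; newer PyTorch versions also accept a materialize_grads=True argument to torch.autograd.grad that does this for you.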

The reliable fix is to re-examine the computation-graph relationship between the outputs and inputs you pass to torch.autograd.grad.

Adjust the offending variable or tensor so that output really is a function of x; then the call succeeds:

grad = torch.autograd.grad(output, x)
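As a concrete sketch, assuming the intent was for output to depend on both tensors, rewriting output so that every differentiated tensor actually appears in its graph makes the original call succeed:

```python
import torch

a = torch.rand(10, requires_grad=True)
b = torch.rand(10, requires_grad=True)

# Now output is a function of both a and b.
output = (2 * a + 3 * b).sum()

ga, gb = torch.autograd.grad(output, (a, b))
print(ga)  # d(output)/da = 2 everywhere
print(gb)  # d(output)/db = 3 everywhere
```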

 
