How to use the gradient of an intermediate variable in the computation graph of a later variable in PyTorch

Question (votes: 0, answers: 1)
import torch

device = torch.device("cpu")  # the original snippet assumes `device` is already defined
point_2 = torch.tensor([0.2, 0.8], device=device, requires_grad=True)
p = torch.cat((point_2, torch.tensor([0.0], device=device)), 0)  # float zero so dtypes match for cat

x_verts = torch.tensor([0.0, 1.0, 0.0], device=device, requires_grad=True)
y_verts = torch.tensor([0.0, 0.0, 1.0], device=device, requires_grad=True)
z_verts = torch.tensor([0.1, -0.1, 0.2], device=device, requires_grad=True)

v1_2d = torch.cat((torch.index_select(x_verts, 0, torch.tensor([0])), torch.index_select(y_verts, 0, torch.tensor([0])), torch.tensor([0.0], device=device)))
v2_2d = torch.cat((torch.index_select(x_verts, 0, torch.tensor([1])), torch.index_select(y_verts, 0, torch.tensor([1])), torch.tensor([0.0], device=device)))
v3_2d = torch.cat((torch.index_select(x_verts, 0, torch.tensor([2])), torch.index_select(y_verts, 0, torch.tensor([2])), torch.tensor([0.0], device=device)))


area_3 = torch.cross(v2_2d - v1_2d, v3_2d - v1_2d)
area = torch.index_select(area_3, 0, torch.tensor([2]))

alpha_3 = 0.5 * torch.cross(v2_2d - p, v3_2d - p) / area
beta_3 = 0.5 * torch.cross(v3_2d - p, v1_2d - p) / area
gamma_3 = 0.5 * torch.cross(v1_2d - p, v2_2d - p) / area

alpha = torch.index_select(alpha_3, 0, torch.tensor([2]))
beta = torch.index_select(beta_3, 0, torch.tensor([2]))
gamma = torch.index_select(gamma_3, 0, torch.tensor([2]))

z = alpha * torch.index_select(z_verts, 0, torch.tensor([0])) + beta * torch.index_select(z_verts, 0, torch.tensor([1])) + gamma * torch.index_select(z_verts, 0, torch.tensor([2]))

z.backward()


grad_norm = torch.norm(point_2.grad)  # <= disconnection

f = torch.tanh(10.0 * (grad_norm - 2.0))
f.backward() # <= error

print(x_verts.grad)
print(y_verts.grad)
print(z_verts.grad)

My code fails because I use a variable's .grad value as the input to another variable. How can I fix this?
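The issue above can be reproduced with a much smaller stand-in example (the names x and y here are illustrative, not from the original code): after backward(), the populated .grad tensor is a plain value outside the autograd graph, so anything computed from it cannot be backpropagated.

```python
import torch

x = torch.tensor([1.0, 2.0], requires_grad=True)
y = (x ** 2).sum()
y.backward()  # fills in x.grad

# The gradient written by backward() is a plain tensor outside the graph:
print(x.grad.requires_grad)       # False

grad_norm = torch.norm(x.grad)
print(grad_norm.requires_grad)    # False: there is nothing to backpropagate through

# grad_norm.backward() would raise:
# RuntimeError: element 0 of tensors does not require grad and does not have a grad_fn
```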

python pytorch
1 answer
0 votes

I'm not sure about your exact scenario or how you expect this to work, but assuming you just want to use the gradient values as a new source node in the graph, the problem is that the value accessed via .grad is not a tensor with requires_grad set to True.

print(point_2.requires_grad) # True
print(point_2.grad.requires_grad) # False

So you have to clone and detach that value, then set requires_grad to True:

grad_norm = torch.norm(point_2.grad.clone().detach().requires_grad_(True))
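A minimal sketch of this fix on a tiny stand-in example (x and y are illustrative names, not from the question): the clone/detach/requires_grad_ chain creates a fresh leaf holding the gradient values, so the second backward() succeeds and its gradients land on that new leaf.

```python
import torch

x = torch.tensor([1.0, 2.0], requires_grad=True)
y = (x ** 2).sum()
y.backward()  # x.grad is now [2., 4.]

# Re-enter the graph through a fresh leaf built from the gradient values:
g_leaf = x.grad.clone().detach().requires_grad_(True)
grad_norm = torch.norm(g_leaf)
f = torch.tanh(grad_norm)
f.backward()  # succeeds: gradients flow into g_leaf

print(g_leaf.grad)  # populated by the second backward()
print(x.grad)       # still [2., 4.]: the detach cut the link back to x
```

Note that because of the detach, these new gradients stop at g_leaf and never reach x (or, in the question's code, x_verts/y_verts/z_verts). If true second-order gradients are needed, the standard approach is torch.autograd.grad(..., create_graph=True), which keeps the gradient computation itself inside the graph.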