PyTorch linear regression gives me NaN values

Problem description

I am learning regression with PyTorch (profit vs. R&D spend). I wrote this script:

import numpy as np
import pandas as pd
import torch
from torch import nn, optim
import matplotlib.pyplot as plt

url = "https://raw.githubusercontent.com/LakshmiPanguluri/Linear_Multiple_Regression/master/50_Startups.csv"
starup = pd.read_csv(url)

profit = np.array(starup['Profit']).reshape(-1,1)
rd = np.array(starup['R&D Spend']).reshape(-1,1)
marketing = np.array(starup['Marketing Spend']).reshape(-1,1)
administration = np.array(starup['Administration']).reshape(-1,1)

profit_torch = torch.from_numpy(profit).float()
rd_torch = torch.from_numpy(rd).float().requires_grad_(True)

model = nn.Linear(1,1)
loss_function = nn.MSELoss()
optimizer = optim.SGD(model.parameters(), lr=0.010)

losses = []
iterations = 1000
for i in range(iterations):
  pred = model(rd_torch)
  loss = loss_function(pred, profit_torch)
  losses.append(loss.data)

  optimizer.zero_grad()
  loss.backward()
  optimizer.step()

print(loss)
plt.plot(range(iterations), losses)

The final print shows:

tensor(nan, grad_fn=<MeanBackward0>)

My question is why it gives me a tensor with NaN values, and why the loss increases on every iteration.

The plot of the losses is a line with a positive slope. This is the project I've been doing.

python pytorch linear-regression nan gradient-descent
1 Answer

I would suggest the following changes:

Change 1: remove requires_grad_(True)

rd_torch = torch.from_numpy(rd).float()

Change 2: call model.train() before the training loop

...
model.train()
for i in range(iterations):
...

Change 3: use loss.item()

losses.append(loss.item())
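
For reference, here is a minimal sketch of the full script with all three changes applied. It assumes the same data loading, model, and hyperparameters as in the question:

import numpy as np
import pandas as pd
import torch
from torch import nn, optim

url = "https://raw.githubusercontent.com/LakshmiPanguluri/Linear_Multiple_Regression/master/50_Startups.csv"
starup = pd.read_csv(url)

profit = np.array(starup['Profit']).reshape(-1, 1)
rd = np.array(starup['R&D Spend']).reshape(-1, 1)

profit_torch = torch.from_numpy(profit).float()
rd_torch = torch.from_numpy(rd).float()    # Change 1: the input data does not need gradients

model = nn.Linear(1, 1)
loss_function = nn.MSELoss()
optimizer = optim.SGD(model.parameters(), lr=0.010)

losses = []
iterations = 1000
model.train()                              # Change 2: put the model in training mode
for i in range(iterations):
  pred = model(rd_torch)
  loss = loss_function(pred, profit_torch)
  losses.append(loss.item())               # Change 3: store a plain Python float, not a tensor

  optimizer.zero_grad()
  loss.backward()
  optimizer.step()

Since losses now holds plain floats, plt.plot(range(iterations), losses) works unchanged.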