Trying to use torch to optimize a function's parameters, but the loss stays constant


I am trying to optimize the parameters of a function, but the loss never changes:

import torch
from pedalboard import Pedalboard, HighpassFilter


class BoardOptimizer(torch.nn.Module):
    def __init__(self):
        super().__init__()

        # Initialize parameters with current values but make them trainable
        self.highpass_freq = torch.nn.Parameter(torch.tensor(100.0), requires_grad=True)

    def get_pedalboard(self):
        highpass_freq = torch.clamp(self.highpass_freq, 20, 500)
        
        board = Pedalboard([
            HighpassFilter(cutoff_frequency_hz=float(highpass_freq)),
        ])
        
        return board

    def forward(self, sample_audio, sample_rate):
        board = self.get_pedalboard()
        music = board.process(sample_audio, sample_rate)
        return music

# x (the input audio) and y (the target audio) are assumed to be defined elsewhere
model = BoardOptimizer()
criterion = torch.nn.MSELoss()
optimizer = torch.optim.SGD(model.parameters(), lr=1e-6)
# print(type(criterion), y.shape, y_pred.shape)  # debug print; y_pred does not exist yet at this point
for t in range(2000):
    # board.process returns a NumPy array, so it is wrapped back into a tensor
    y_pred = torch.Tensor(model(x, 16000))

    loss = criterion(y_pred, y)
    if t % 100 == 99:
        print(t, loss.item())

    optimizer.zero_grad()
    loss.requires_grad = True
    loss.backward()
    optimizer.step()
99 0.011413631960749626
199 0.011413631960749626
299 0.011413631960749626
399 0.011413631960749626

I think the problem is that gradients are not being tracked.

I tried printing the gradients:

for name, param in model.named_parameters():
    print(f"{name}: grad = {param.grad}")

but I keep getting

highpass_freq: grad = None
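
Printing loss.grad_fn gives None as well, so the loss is a leaf tensor with no computation graph behind it. For example:

# Quick check (inside the training loop): is the loss attached to a graph at all?
print(loss.grad_fn)  # None -> nothing to backpropagate through
print(loss.is_leaf)  # True -> setting requires_grad on it cannot rebuild a graph retroactively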
python pytorch
1 Answer

The loss should require gradients on its own when the computation graph is intact, so I would remove the loss.requires_grad = True line. Also, try rewriting the first line of get_pedalboard:

def get_pedalboard(self):
    highpass_freq = self.highpass_freq.clamp(20, 500) 
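
Either way, a single forward/backward pass is enough to tell whether gradients ever reach the parameter. This is just a sketch, assuming x, y, model, and criterion as defined in the question:

# Sanity check: one step, then inspect the parameter's gradient
# (assumes x, y, model, and criterion from the question)
y_pred = torch.Tensor(model(x, 16000))
loss = criterion(y_pred, y)
loss.backward()  # with a broken graph this raises or leaves the grad below as None
print(model.highpass_freq.grad)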

And are you sure that

board = Pedalboard([
    HighpassFilter(cutoff_frequency_hz=float(highpass_freq)),
])

actually works with PyTorch autograd? Pedalboard processes plain NumPy arrays outside of torch, and float(highpass_freq) converts the parameter to a Python number, so the computation graph is cut at that point and no gradient can ever flow back to highpass_freq.
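
If you want the cutoff frequency to stay inside the autograd graph, the filter itself has to be built from torch operations. A minimal sketch of that idea, using a smooth frequency-domain high-pass mask (the function name, the sharpness parameter, and the sigmoid response are my own choices here, not part of Pedalboard):

import torch

def differentiable_highpass(audio, cutoff_hz, sample_rate, sharpness=0.1):
    """Frequency-domain high-pass with a smooth, differentiable response (illustrative sketch)."""
    spectrum = torch.fft.rfft(audio)
    freqs = torch.fft.rfftfreq(audio.shape[-1], d=1.0 / sample_rate)
    # A sigmoid rolls off below cutoff_hz; being smooth, it is differentiable w.r.t. cutoff_hz
    response = torch.sigmoid(sharpness * (freqs - cutoff_hz))
    return torch.fft.irfft(spectrum * response, n=audio.shape[-1])

# Usage: gradients now reach the cutoff parameter
cutoff = torch.nn.Parameter(torch.tensor(100.0))
x = torch.randn(16000)  # one second of audio at 16 kHz (dummy data)
y = torch.randn(16000)  # target audio (dummy data)
y_pred = differentiable_highpass(x, cutoff, 16000)
loss = torch.nn.functional.mse_loss(y_pred, y)
loss.backward()
print(cutoff.grad)  # a real number instead of None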
