Wrong shape at the fully connected layer: mat1 and mat2 shapes cannot be multiplied

Problem description

I have the following model, and it trains fine. The shapes of my splits are:

  • X_train (98, 1, 40, 844)
  • X_val (21, 1, 40, 844)
  • X_test (21, 1, 40, 844)

However, I get the following error at x = F.relu(self.fc1(x)) in forward when I try to interpret the model on the validation set.

# Create a DataLoader for the validation set
valid_dl = learn.dls.test_dl(X_val, y_val)

# Get predictions and interpret them on the validation set
interp = ClassificationInterpretation.from_learner(learn, dl=valid_dl) 

RuntimeError: mat1 and mat2 shapes cannot be multiplied (32x2110 and 67520x128)
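
To see which shape actually reaches fc1, one quick check is to pull a single batch from valid_dl and print its shape. This is only a minimal diagnostic sketch, assuming the learn and valid_dl objects above; fastai's one_batch() returns the raw batch tensors.

# Minimal diagnostic sketch, assuming the `learn` and `valid_dl` objects above
b = valid_dl.one_batch()           # one raw batch from the validation DataLoader
print(b[0].shape)                  # expected (batch, 1, 40, 844); anything else explains the 32x2110 input to fc1
print(learn.model.flattened_size)  # the input size fc1 was built for (67520 for a 1x40x844 sample)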

I have checked dozens of similar questions but could not find a solution.

import torch
import torch.nn as nn
import torch.nn.functional as F

class DraftCNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.conv1 = nn.Conv2d(1, 16, kernel_size=3, stride=1, padding=1)
        self.pool = nn.MaxPool2d(kernel_size=2, stride=2, padding=0)
        self.conv2 = nn.Conv2d(16, 32, kernel_size=3, stride=1, padding=1)
        
        # Calculate flattened size based on input dimensions
        with torch.no_grad():
            dummy_input = torch.zeros(1, 1, 40, 844)  # shape of one input sample
            dummy_output = self.pool(self.conv2(self.pool(F.relu(self.conv1(dummy_input)))))
            self.flattened_size = dummy_output.view(dummy_output.size(0), -1).size(1)
        
        self.fc1 = nn.Linear(self.flattened_size, 128)
        self.fc2 = nn.Linear(128, 4)

    def forward(self, x):
        x = self.pool(F.relu(self.conv1(x)))
        x = self.pool(F.relu(self.conv2(x)))
        x = x.view(x.size(0), -1)  # Flatten the output of convolutions
        x = F.relu(self.fc1(x))
 
        x = self.fc2(x)
        return x
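
For reference, tracing a training-shaped input through the network shows where the 67520 in the error message comes from: the two 2x2 poolings reduce 40 x 844 to 10 x 211, and 32 channels * 10 * 211 = 67520. A minimal sketch, assuming the DraftCNN class above:

# Shape trace with a dummy input of the training shape (1, 1, 40, 844)
model = DraftCNN()
dummy = torch.zeros(1, 1, 40, 844)
out = model.pool(F.relu(model.conv1(dummy)))   # -> (1, 16, 20, 422)
out = model.pool(F.relu(model.conv2(out)))     # -> (1, 32, 10, 211)
print(out.view(1, -1).shape)                   # torch.Size([1, 67520]), the row count of mat2 in the error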

I have tried changing the forward function and the shapes of the layers, but I keep getting the same error.

python pytorch neural-network fast-ai
1 Answer

For two matrices mat1 and mat2 to be multiplied, the number of columns of mat1 must equal the number of rows of mat2.

In your case, your matrices are:

m1 is 32    x 2110
m2 is 67520 x 128

One option is to reshape mat2. Since 67520 // 2110 = 32, mat2 can be viewed as

67520 x 128 = 2110 x 32 x 128
            = 2110 x 4096

so that your m1 and m2 become:

(32 x 2110) and (2110 x 4096)
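
As a minimal, self-contained sketch of the rule and the suggested reshape (random tensors standing in for the real data):

import torch

m1 = torch.randn(32, 2110)
m2 = torch.randn(67520, 128)

# m1 @ m2 raises: mat1 and mat2 shapes cannot be multiplied (32x2110 and 67520x128)
# but 67520 * 128 == 2110 * 4096, so the same elements can be viewed as 2110 x 4096:
m2_reshaped = m2.reshape(2110, 4096)
print((m1 @ m2_reshaped).shape)    # torch.Size([32, 4096])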