Error when implementing a regression model in MATLAB


Below are the code and the error produced when implementing a regression model in MATLAB. The input vector (traindata(:,1)) has length 300 and the output vector (traindata(:,2)) also has length 300, yet I still get a size-mismatch error. traindata is a 769x2 cell array in which every element is a 1x300 double. What am I doing wrong?

Code:

layers1 = [
sequenceInputLayer(1,MinLength = 300)

convolution1dLayer(4,3,Padding="same",Stride=1)

convolution1dLayer(64,8,Padding="same",Stride=8)
batchNormalizationLayer()
tanhLayer
maxPooling1dLayer(2,Padding="same")

convolution1dLayer(32,8,Padding="same",Stride=4)
batchNormalizationLayer
tanhLayer
maxPooling1dLayer(2,Padding="same")

transposedConv1dLayer(32,8,Cropping="same",Stride=4)
tanhLayer

transposedConv1dLayer(64,8,Cropping="same",Stride=8)
tanhLayer

bilstmLayer(8)

fullyConnectedLayer(8)  
dropoutLayer(0.2)

fullyConnectedLayer(4)  
dropoutLayer(0.2)

fullyConnectedLayer(1)
regressionLayer];

options = trainingOptions("adam",...
MaxEpochs=600,...
MiniBatchSize=600,...
InitialLearnRate=0.001,...
ValidationData={valdata(:,1),valdata(:,2)},...
ValidationFrequency=100,...
VerboseFrequency=100,...
Verbose=1, ...
Shuffle="every-epoch",...
Plots="none", ...
DispatchInBackground=true);

[net1,info1] = trainNetwork(traindata(:,1),traindata(:,2),layers1,options);

Error:

Error using trainNetwork
Invalid network.

Error in untitled (line 67)
[net1,info1] = trainNetwork(traindata(:,1),traindata(:,2),layers1,options);

Caused by:
Network: Incompatible input and output sequence lengths. The network must return 
sequences with the same
length as the input data or a sequence with length one.
matlab deep-learning regression
1 Answer

The error message says the network's output sequence length is incompatible with the input length. Because you specified MinLength=300, MATLAB can trace the length through every layer: a 1-D convolution with Padding="same" and stride s shortens a length-L sequence to ceil(L/s), a transposed convolution with Cropping="same" stretches it by its stride, and the pooling layers (default stride 1, same padding) leave it unchanged. Your architecture therefore maps 300 → 38 → 10 on the way down, but 10 → 40 → 320 on the way back up. A sequence of length 320 matches neither the input length (300) nor length one, which is exactly what the error reports.

To resolve this, either make the network return sequences of the same length as the input by choosing downsampling strides that divide 300 exactly (so the transposed convolutions restore the original length), or, if you want a single scalar prediction per sequence, set OutputMode="last" on the recurrent layer and supply scalar training labels instead of 1x300 vectors.
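The length bookkeeping behind this diagnosis can be checked by hand, using the rules above (strided conv: ceil(L/stride); transposed conv: L*stride; pooling with default stride 1: unchanged):

```matlab
% Trace the sequence length through the original layers1.
L = 300;
L = ceil(L/1);   % convolution1dLayer, Stride=1      -> 300
L = ceil(L/8);   % convolution1dLayer, Stride=8      -> 38
L = ceil(L/4);   % convolution1dLayer, Stride=4      -> 10
L = L*4;         % transposedConv1dLayer, Stride=4   -> 40
L = L*8;         % transposedConv1dLayer, Stride=8   -> 320
% 320 is neither 300 nor 1, so trainNetwork rejects the network.
```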

Here is one way to modify the network so it outputs a sequence of the same length as the input. The downsampling strides are changed to 4 and 5 (both divide 300), so the transposed convolutions, applied in reverse order, restore the original length:

layers1 = [
    sequenceInputLayer(1, MinLength=300)
    convolution1dLayer(4, 3, Padding="same", Stride=1)       % length 300
    convolution1dLayer(64, 8, Padding="same", Stride=4)      % 300 -> 75
    batchNormalizationLayer
    tanhLayer
    maxPooling1dLayer(2, Padding="same")                     % stride 1: length unchanged
    convolution1dLayer(32, 8, Padding="same", Stride=5)      % 75 -> 15
    batchNormalizationLayer
    tanhLayer
    maxPooling1dLayer(2, Padding="same")
    transposedConv1dLayer(32, 8, Cropping="same", Stride=5)  % 15 -> 75
    tanhLayer
    transposedConv1dLayer(64, 8, Cropping="same", Stride=4)  % 75 -> 300
    tanhLayer
    bilstmLayer(8)               % OutputMode="sequence" (default)
    fullyConnectedLayer(8)
    dropoutLayer(0.2)
    fullyConnectedLayer(4)
    dropoutLayer(0.2)
    fullyConnectedLayer(1)       % one regression output per time step
    regressionLayer];

options = trainingOptions('adam', ...
    'MaxEpochs', 600, ...
    'MiniBatchSize', 600, ...
    'InitialLearnRate', 0.001, ...
    'ValidationData', {valdata(:, 1), valdata(:, 2)}, ...
    'ValidationFrequency', 100, ...
    'VerboseFrequency', 100, ...
    'Verbose', 1, ...
    'Shuffle', 'every-epoch', ...
    'Plots', 'none', ...
    'DispatchInBackground', true);

[net1, info1] = trainNetwork(traindata(:, 1), traindata(:, 2), layers1, options);

Note that for sequence-to-sequence regression the network keeps regressionLayer as its output: with the bidirectional LSTM's default OutputMode="sequence", the final fullyConnectedLayer(1) emits one value per time step, so the network now returns a 300-step sequence for every 300-step input, matching your 1x300 labels. If you instead want a single scalar per sequence, use bilstmLayer(8, OutputMode="last") together with scalar labels.
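For completeness, here is a minimal sketch of the scalar-per-sequence alternative. It assumes each 1x300 target should be collapsed to a single value; taking the last sample with y(end) is only a placeholder, so substitute whatever scalar your task actually requires:

```matlab
% Hypothetical scalar-target variant: the LSTM emits only its last time step.
layersScalar = [
    sequenceInputLayer(1, MinLength=300)
    convolution1dLayer(4, 3, Padding="same", Stride=1)
    bilstmLayer(8, OutputMode="last")   % one hidden state per sequence
    fullyConnectedLayer(1)
    regressionLayer];

% Collapse each 1x300 target to a scalar (here: its final sample).
scalarLabels = cellfun(@(y) y(end), traindata(:, 2));  % 769x1 double
[net2, info2] = trainNetwork(traindata(:, 1), scalarLabels, layersScalar, options);
```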
