Recurrent neural network error in Python (Keras): ValueError: `logits` and `labels` must have the same shape, received ((None, 90, 1) vs (None,))

Problem description

I am building a recurrent neural network in Python with Keras to do binary classification on roulette data. When I try to run the code it crashes. Can you help me fix it?

Here is my code:

from keras.models import Sequential
from keras.layers import Dense, Dropout
from sklearn.preprocessing import MinMaxScaler
import numpy as np
import pandas as pd

columns = ['data', 'resultado']
base = pd.read_csv("blaze_values_27_01_2023_VERMELHO_1.csv", header = None, names = columns)
base = base.dropna()
base_treinamento = base.iloc[:, 1:2]

normalizador = MinMaxScaler(feature_range=[0,1])
base_treinamento_normalizada = normalizador.fit_transform(base_treinamento)


previsores = []
saida_real = []

# build sliding windows: 90 previous values as input, the next value as the label
for i in range(90, 1809):
    previsores.append(base_treinamento_normalizada[i-90:i, 0])
    saida_real.append(base_treinamento_normalizada[i, 0])
previsores, saida_real = np.array(previsores), np.array(saida_real)
previsores = np.reshape(previsores, (previsores.shape[0],previsores.shape[1],1))

regressor = Sequential()
regressor.add(Dense(100, input_shape = (previsores.shape[1], 1), activation='relu'))
regressor.add(Dense(1, activation = 'sigmoid'))

regressor.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])
regressor.fit(previsores, saida_real, epochs = 100, batch_size = 32)

The error I get is:

Epoch 1/100
Traceback (most recent call last):

File "/Users/mac/opt/anaconda3/lib/python3.9/site-packages/spyder_kernels/py3compat.py", line 356, in compat_exec
    exec(code, globals, locals)

File "/Users/mac/untitled0.py", line 34, in <module>
    regressor.fit(previsores, saida_real, epochs = 100, batch_size = 32)

File "/Users/mac/opt/anaconda3/lib/python3.9/site-packages/keras/utils/traceback_utils.py", line 67, in error_handler
    raise e.with_traceback(filtered_tb) from None

File "/var/folders/1j/tbck9lp54kndrb4nl53xdjgr0000gp/T/autograph_generated_file27ts368.py", line 15, in tf__train_function
    retval = ag__.converted_call(ag__.ld(step_function), (ag__.ld(self), ag__.ld(iterator)), None, fscope)

ValueError: in user code:

File "/Users/mac/opt/anaconda3/lib/python3.9/site-packages/keras/engine/training.py", line 1051, in train_function  *
    return step_function(self, iterator)
File "/Users/mac/opt/anaconda3/lib/python3.9/site-packages/keras/engine/training.py", line 1040, in step_function  **
    outputs = model.distribute_strategy.run(run_step, args=(data,))
File "/Users/mac/opt/anaconda3/lib/python3.9/site-packages/keras/engine/training.py", line 1030, in run_step  **
    outputs = model.train_step(data)
File "/Users/mac/opt/anaconda3/lib/python3.9/site-packages/keras/engine/training.py", line 890, in train_step
    loss = self.compute_loss(x, y, y_pred, sample_weight)
File "/Users/mac/opt/anaconda3/lib/python3.9/site-packages/keras/engine/training.py", line 948, in compute_loss
    return self.compiled_loss(
File "/Users/mac/opt/anaconda3/lib/python3.9/site-packages/keras/engine/compile_utils.py", line 201, in __call__
    loss_value = loss_obj(y_t, y_p, sample_weight=sw)
File "/Users/mac/opt/anaconda3/lib/python3.9/site-packages/keras/losses.py", line 139, in __call__
    losses = call_fn(y_true, y_pred)
File "/Users/mac/opt/anaconda3/lib/python3.9/site-packages/keras/losses.py", line 243, in call  **
    return ag_fn(y_true, y_pred, **self._fn_kwargs)
File "/Users/mac/opt/anaconda3/lib/python3.9/site-packages/keras/losses.py", line 1930, in binary_crossentropy
    backend.binary_crossentropy(y_true, y_pred, from_logits=from_logits),
File "/Users/mac/opt/anaconda3/lib/python3.9/site-packages/keras/backend.py", line 5283, in binary_crossentropy
    return tf.nn.sigmoid_cross_entropy_with_logits(labels=target, logits=output)

ValueError: `logits` and `labels` must have the same shape, received ((None, 90, 1) vs (None,)).
python keras deep-learning neural-network
1 Answer

Remove this line:

previsores = np.reshape(previsores, (previsores.shape[0], previsores.shape[1], 1))

and try this:

regressor.add(Dense(100, input_shape = (previsores.shape[1],), activation='relu'))

instead of:

regressor.add(Dense(100, input_shape = (previsores.shape[1], 1), activation='relu'))
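
Putting those two changes together, here is a minimal sketch of the Dense-only version. To keep the snippet self-contained I use random stand-in data instead of the CSV and only 5 epochs; those are my assumptions, the rest follows the question's code:

import numpy as np
from keras.models import Sequential
from keras.layers import Dense

# stand-in for the normalized series (the real code loads and scales the CSV)
base_treinamento_normalizada = np.random.rand(1809, 1)

previsores, saida_real = [], []
for i in range(90, 1809):
    previsores.append(base_treinamento_normalizada[i-90:i, 0])
    saida_real.append(base_treinamento_normalizada[i, 0])
previsores = np.array(previsores)   # shape (1719, 90) -- no reshape to 3-D
saida_real = np.array(saida_real)   # shape (1719,)

regressor = Sequential()
regressor.add(Dense(100, input_shape=(previsores.shape[1],), activation='relu'))
regressor.add(Dense(1, activation='sigmoid'))   # output shape (None, 1), matches the labels

regressor.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])
regressor.fit(previsores, saida_real, epochs=5, batch_size=32)

With 2-D input the first Dense layer sees (None, 90) and outputs (None, 100), so the final sigmoid produces (None, 1), which binary_crossentropy can compare with the (None,) labels. In the original code the extra time axis made Dense act per timestep, which is where the (None, 90, 1) in the error came from.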


One more thing: a Dense layer as the first layer is not a recurrent neural network; you should try an LSTM instead. In that case you can keep the 3-D shape (and the reshape). For example:

regressor.add(LSTM(100, input_shape = (previsores.shape[1], 1)))   # needs: from keras.layers import LSTM
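
Expanded into a full runnable sketch (same stand-in data and epoch-count assumptions as above), the LSTM route would look like this:

import numpy as np
from keras.models import Sequential
from keras.layers import LSTM, Dense

# stand-in for the normalized series (the real code loads and scales the CSV)
base_treinamento_normalizada = np.random.rand(1809, 1)

previsores, saida_real = [], []
for i in range(90, 1809):
    previsores.append(base_treinamento_normalizada[i-90:i, 0])
    saida_real.append(base_treinamento_normalizada[i, 0])
previsores = np.array(previsores)
saida_real = np.array(saida_real)

# LSTM expects (batch, timesteps, features), so here the reshape stays
previsores = np.reshape(previsores, (previsores.shape[0], previsores.shape[1], 1))

regressor = Sequential()
regressor.add(LSTM(100, input_shape=(previsores.shape[1], 1)))  # last hidden state: (None, 100)
regressor.add(Dense(1, activation='sigmoid'))                   # (None, 1)

regressor.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])
regressor.fit(previsores, saida_real, epochs=5, batch_size=32)

By default the LSTM returns only its last hidden state (return_sequences=False), so the time dimension is collapsed and the output shape again lines up with the 1-D labels.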