I am trying to train a BERT model for a multi-class classification problem, but I get this error:

ValueError: Arguments `target` and `output` must have the same shape. Received: target.shape=(None, 512), output.shape=(None, 3)
import tensorflow as tf
epochs = 4
train_dataloader = train_dataset.shuffle(buffer_size=10000).batch(batch_size)
validation_dataloader = val_dataset.batch(batch_size)
# start training
history = model.fit(
    train_dataloader,
    validation_data=validation_dataloader,
    epochs=epochs,
    verbose=1
)
# save the model
model.save("bert_model.h5")
for batch in train_dataloader.take(1):
    input_ids, attention_masks, labels = batch
    print("Batch input_ids shape:", input_ids.shape)
    print("Batch attention_masks shape:", attention_masks.shape)
    print("Batch labels shape:", labels.shape)
# which prints the following output:
Batch input_ids shape: (16, 512)
Batch attention_masks shape: (16, 512)
Batch labels shape: (16,)
I have already checked the tensor shapes. Any help is appreciated!
Your labels have shape (16,), while the model's output has shape (None, 3).
The problem is likely that your labels are not one-hot encoded. They should have the same second dimension as the model's output layer:
from tensorflow.keras.utils import to_categorical

num_classes = 3
# convert integer class labels (16,) into one-hot vectors (16, 3)
labels = to_categorical(labels, num_classes=num_classes)
print(labels.shape)
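Since the labels here come out of a tf.data pipeline, you can also one-hot encode them inside the pipeline with tf.one_hot, so every batch already has the right shape before it reaches model.fit. A minimal sketch, assuming num_classes = 3 and using toy stand-in tensors in place of the real tokenized data (the shapes match the question):

```python
import tensorflow as tf

num_classes = 3  # assumed from the model's (None, 3) output

# Toy stand-ins for the real tokenized inputs and labels.
input_ids = tf.zeros((32, 512), dtype=tf.int32)
attention_masks = tf.ones((32, 512), dtype=tf.int32)
labels = tf.constant([i % num_classes for i in range(32)])

train_dataset = tf.data.Dataset.from_tensor_slices(
    (input_ids, attention_masks, labels)
)

# One-hot encode the label inside the pipeline so each batch of
# labels has shape (batch_size, num_classes) instead of (batch_size,).
def one_hot_labels(ids, masks, label):
    return ids, masks, tf.one_hot(label, depth=num_classes)

train_dataloader = train_dataset.map(one_hot_labels).batch(16)

for ids, masks, y in train_dataloader.take(1):
    print(y.shape)  # (16, 3)
```

Alternatively, you can keep the integer labels of shape (16,) and compile the model with the sparse_categorical_crossentropy loss, which accepts integer class indices directly.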