Transfer learning model is not learning much: validation plateaus at 45% while training reaches 90%


I have been working on this image classification model. I have 70,000 images and 375 classes. I have tried training it with VGG16, Xception, ResNet, and MobileNet... and I always hit the same ceiling of 45% accuracy on validation.

As you can see here

I tried adding dropout layers and regularization, and it produced the same validation result. Data augmentation did not help much either. Any ideas why this is not working?
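Since data augmentation is mentioned above: the transforms `ImageDataGenerator` applies are simple geometric perturbations of each image. A minimal numpy sketch of a random flip-and-shift, illustrative only and not the Keras implementation:

```python
import numpy as np

def shift(img, dy, dx):
    """Translate an HxWxC image by (dy, dx) pixels, zero-padding the exposed border."""
    h, w, _ = img.shape
    out = np.zeros_like(img)
    out[max(0, dy):min(h, h + dy), max(0, dx):min(w, w + dx)] = \
        img[max(0, -dy):min(h, h - dy), max(0, -dx):min(w, w - dx)]
    return out

def augment(img, rng):
    """Random horizontal flip plus a random shift of up to 10% per axis."""
    if rng.random() < 0.5:
        img = img[:, ::-1, :]  # horizontal flip
    h, w, _ = img.shape
    dy = int(rng.integers(-(h // 10), h // 10 + 1))
    dx = int(rng.integers(-(w // 10), w // 10 + 1))
    return shift(img, dy, dx)

# apply independently to every image in a batch
rng = np.random.default_rng(42)
batch = np.random.rand(8, 128, 128, 3).astype("float32")
augmented = np.stack([augment(x, rng) for x in batch])
```

In Keras the same idea is expressed through `ImageDataGenerator` arguments such as `horizontal_flip=True` and `width_shift_range=0.1`, applied on the fly per batch.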

Here is a code snippet of the last model I used:

from tensorflow import keras
from tensorflow.keras.layers import Dense, GlobalAveragePooling2D
from tensorflow.keras.applications import Xception
from tensorflow.keras.optimizers import Adam
from tensorflow.keras.preprocessing.image import ImageDataGenerator
from PIL import ImageFile
# tolerate the occasional corrupted file in a 70k-image dataset
ImageFile.LOAD_TRUNCATED_IMAGES = True

validation_datagen = ImageDataGenerator(rescale=1./255)
target_size = (height, width)

datagen = ImageDataGenerator(rescale=1./255,
    validation_split=0.2)

train_generator = datagen.flow_from_directory(
    path,
    target_size=(height, width),
    batch_size=batchSize,
    shuffle=True,
    class_mode='categorical',
    subset='training')

validation_generator = datagen.flow_from_directory(
    path,
    target_size=(height, width),
    batch_size=batchSize,
    class_mode='categorical',
    subset='validation')

num_classes = len(train_generator.class_indices)



xception_model = Xception(weights='imagenet', input_shape=(width, height, 3), include_top=False)
x = xception_model.output
x = GlobalAveragePooling2D()(x)
x = Dense(512, activation='relu')(x)
out = Dense(num_classes, activation='softmax')(x)

# build the full model; this line was missing before compile()
model = keras.Model(inputs=xception_model.input, outputs=out)

opt = Adam()
model.compile(optimizer=opt, loss='categorical_crossentropy', metrics=['accuracy'])


n_epochs = 15

history = model.fit(
    train_generator,
    steps_per_epoch = train_generator.samples // batchSize,
    validation_data = validation_generator, 
    validation_steps = validation_generator.samples // batchSize,
    verbose=1, 
    epochs = n_epochs)
Tags: tensorflow, keras, deep-learning, neural-network, computer-vision
1 Answer

Yes, you will probably need a balanced dataset, with roughly the same number of images per class, to get better training performance from the model. Since you are working with an image dataset, also try changing to

class_mode='sparse'
loss='sparse_categorical_crossentropy'

and freeze the pretrained model's layers:

xception_model.trainable = False
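For reference, `sparse_categorical_crossentropy` with integer labels computes the same loss as `categorical_crossentropy` with one-hot labels; only the label format changes. A small numpy check, illustrative rather than the Keras implementation:

```python
import numpy as np

# softmax outputs for a batch of 2 samples, 3 classes
probs = np.array([[0.7, 0.2, 0.1],
                  [0.1, 0.3, 0.6]])
sparse_labels = np.array([0, 2])          # integer class ids
one_hot = np.eye(3)[sparse_labels]        # same labels, one-hot encoded

# categorical cross-entropy with one-hot labels
cat_loss = -np.sum(one_hot * np.log(probs), axis=1)
# sparse variant indexes the true-class probability directly
sparse_loss = -np.log(probs[np.arange(len(sparse_labels)), sparse_labels])

assert np.allclose(cat_loss, sparse_loss)
```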

Check the code below (I used the flowers dataset, which has 5 classes):

import tensorflow as tf
from tensorflow import keras
from tensorflow.keras.layers import Dense, GlobalAveragePooling2D

xception_model = tf.keras.applications.Xception(weights='imagenet', input_shape=(width, height, 3), include_top=False)
xception_model.trainable = False  # freeze the pretrained weights
x = xception_model.output
x = GlobalAveragePooling2D()(x)
x = Dense(32, activation='relu')(x)
out = Dense(num_classes, activation='softmax')(x)

model = keras.Model(inputs=xception_model.input, outputs=out)
opt = tf.keras.optimizers.Adam()
model.compile(optimizer=opt, loss='sparse_categorical_crossentropy', metrics=['accuracy'])
history = model.fit(train_generator, epochs=10, validation_data=validation_generator)

Output:

Epoch 1/10
217/217 [==============================] - 23s 95ms/step - loss: 0.5945 - accuracy: 0.7793 - val_loss: 0.4610 - val_accuracy: 0.8337
Epoch 2/10
217/217 [==============================] - 20s 91ms/step - loss: 0.3439 - accuracy: 0.8797 - val_loss: 0.4550 - val_accuracy: 0.8419
Epoch 3/10
217/217 [==============================] - 20s 93ms/step - loss: 0.2570 - accuracy: 0.9150 - val_loss: 0.4437 - val_accuracy: 0.8384
Epoch 4/10
217/217 [==============================] - 20s 91ms/step - loss: 0.2040 - accuracy: 0.9340 - val_loss: 0.4592 - val_accuracy: 0.8477
Epoch 5/10
217/217 [==============================] - 20s 91ms/step - loss: 0.1649 - accuracy: 0.9494 - val_loss: 0.4686 - val_accuracy: 0.8512
Epoch 6/10
217/217 [==============================] - 20s 92ms/step - loss: 0.1301 - accuracy: 0.9589 - val_loss: 0.4805 - val_accuracy: 0.8488
Epoch 7/10
217/217 [==============================] - 20s 93ms/step - loss: 0.0966 - accuracy: 0.9754 - val_loss: 0.4993 - val_accuracy: 0.8442
Epoch 8/10
217/217 [==============================] - 20s 91ms/step - loss: 0.0806 - accuracy: 0.9806 - val_loss: 0.5488 - val_accuracy: 0.8372
Epoch 9/10
217/217 [==============================] - 20s 91ms/step - loss: 0.0623 - accuracy: 0.9864 - val_loss: 0.5802 - val_accuracy: 0.8360
Epoch 10/10
217/217 [==============================] - 22s 100ms/step - loss: 0.0456 - accuracy: 0.9896 - val_loss: 0.6005 - val_accuracy: 0.8360
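On the class-balance point above: if rebalancing 375 classes by resampling is impractical, `model.fit` also accepts a `class_weight` dict that scales each class's contribution to the loss. A sketch of the usual inverse-frequency weighting; the per-class counts below are made up for illustration:

```python
def inverse_frequency_weights(counts):
    """Map class index -> weight = total / (n_classes * count), so rare classes weigh more."""
    total = sum(counts.values())
    k = len(counts)
    return {cls: total / (k * n) for cls, n in counts.items()}

# hypothetical per-class image counts (e.g. from train_generator.classes)
counts = {0: 1000, 1: 250, 2: 250}
class_weight = inverse_frequency_weights(counts)
# usage: model.fit(train_generator, class_weight=class_weight, ...)
```

With these counts the majority class gets weight 0.5 and the two minority classes get 2.0, so each class contributes comparably to the gradient.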