How do I use DenseNet on CIFAR-100 in Keras? I see that the DenseNet models in Keras come with ImageNet weights! How can I apply them to CIFAR-100?
Posted on 2020-05-06 13:08:09
The example below should help you understand how to use DenseNet121 with CIFAR-100. Note that I am using the Keras that ships with TensorFlow (tensorflow.keras).
import tensorflow as tf
from tensorflow import keras
from tensorflow.keras.applications import DenseNet121
from tensorflow.keras.preprocessing import image
from tensorflow.keras.models import Model
from tensorflow.keras.layers import Dense, GlobalAveragePooling2D
from tensorflow.keras import backend as K
# import cifar 100 data
# The data, split between train and test sets:
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.cifar100.load_data()
print('x_train shape:', x_train.shape)
print(x_train.shape[0], 'train samples')
print(x_test.shape[0], 'test samples')
x_train = x_train.astype('float32')
x_test = x_test.astype('float32')
x_train /= 255
x_test /= 255
# create the base pre-trained model; specify input_shape so it matches
# the 32x32x3 CIFAR-100 images (32 is the minimum size DenseNet accepts)
base_model = DenseNet121(weights='imagenet', include_top=False, input_shape=(32, 32, 3))
# add a global spatial average pooling layer
x = base_model.output
x = GlobalAveragePooling2D()(x)
# let's add a fully-connected layer
x = Dense(1024, activation='relu')(x)
# and an output layer for the 100 CIFAR-100 classes; no softmax here,
# since the loss below is configured with from_logits=True
predictions = Dense(100)(x)
# this is the model we will train
model = Model(inputs=base_model.input, outputs=predictions)
# first: train only the top layers (which were randomly initialized)
# i.e. freeze all convolutional layers
for layer in base_model.layers:
    layer.trainable = False
# compile the model (should be done *after* setting layers to non-trainable)
loss = tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True)
model.compile(optimizer='rmsprop', loss=loss, metrics=['accuracy'])
# train the model on the new data for a few epochs
model.fit(x_train, y_train, epochs=5, validation_data=(x_test, y_test), verbose=1, batch_size=128)
You can also fine-tune: I trained the model with the original base_model weights frozen (the pre-trained base_model weights were not updated). During fine-tuning you can unfreeze some of those layers and train again. I would also suggest reading about ImageDataGenerator to augment the images and get better accuracy at test time.
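As a minimal sketch of that fine-tuning step, the pattern is: unfreeze the last few layers of the base model, recompile with a low learning rate, and train again, optionally feeding augmented images through ImageDataGenerator. The cutoff of 20 layers and the learning rate 1e-4 below are arbitrary choices for illustration, not values from the original answer; the model is rebuilt here (with weights=None to avoid a download) only so the sketch is self-contained — in practice you would reuse the model you already trained above.

```python
import tensorflow as tf
from tensorflow.keras.applications import DenseNet121
from tensorflow.keras.layers import Dense, GlobalAveragePooling2D
from tensorflow.keras.models import Model
from tensorflow.keras.preprocessing.image import ImageDataGenerator

# Rebuild the same architecture as above (stand-in for the already-trained model).
base_model = DenseNet121(weights=None, include_top=False, input_shape=(32, 32, 3))
x = GlobalAveragePooling2D()(base_model.output)
x = Dense(1024, activation='relu')(x)
predictions = Dense(100)(x)
model = Model(inputs=base_model.input, outputs=predictions)

# Freeze everything except the last 20 layers of the base model,
# so only the top of the network is fine-tuned.
for layer in base_model.layers[:-20]:
    layer.trainable = False
for layer in base_model.layers[-20:]:
    layer.trainable = True

# Recompile *after* changing trainable flags, with a low learning rate
# so the unfrozen pre-trained weights change slowly.
model.compile(
    optimizer=tf.keras.optimizers.Adam(learning_rate=1e-4),
    loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
    metrics=['accuracy'])

# Simple augmentation pipeline for the second round of training.
datagen = ImageDataGenerator(rotation_range=15,
                             width_shift_range=0.1,
                             height_shift_range=0.1,
                             horizontal_flip=True)
# model.fit(datagen.flow(x_train, y_train, batch_size=128),
#           epochs=5, validation_data=(x_test, y_test))
```

The recompile after toggling `trainable` is important: Keras bakes the trainable state into the training function at compile time, so flag changes have no effect until you compile again.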
Hope this helps.
https://stackoverflow.com/questions/61634463