Keras: loaded model gives different accuracy

Published 2024-04-19 04:52:05


I save my model with a ModelCheckpoint:

checkpoint = ModelCheckpoint("Models/FVA_MEL.h5", monitor='val_accuracy', verbose=1, save_best_only=True, mode='max', period=1)

I load my model with load_model:

from keras.models import load_model, save_model
modell = load_model("Models/FVA_MEL.h5")

But the loaded model gives a different accuracy on the same test data. I also tried saving as .tf, but that didn't work either. Please help.


Tags: model, verbose, models, save, load, val, monitor
1 Answer

User
#1 · Posted 2024-04-19 04:52:05

First, check the Keras documentation: https://keras.io/callbacks/

There can be a difference between the weights saved with "save_best_only=True" and the weights from the last epoch. You should use the weights from the last epoch, not the best ones.
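The gap between the two save points can be demonstrated directly. Below is a minimal sketch (assuming TensorFlow 2.x's tf.keras; the toy data and file names are made up for illustration): the checkpoint written with save_best_only=True can hold weights from an earlier epoch, while model.save() after fit() stores the last-epoch weights.

```python
# Sketch (tf.keras, TensorFlow 2.x assumed): compare the "best" checkpoint
# against a model saved after the last epoch.
import numpy as np
import tensorflow as tf

np.random.seed(0)
tf.random.set_seed(0)

# Toy data, just to have something to fit.
x = np.random.rand(64, 4).astype("float32")
y = np.random.randint(0, 2, size=(64,))

model = tf.keras.Sequential([
    tf.keras.Input(shape=(4,)),
    tf.keras.layers.Dense(8, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(loss="binary_crossentropy", optimizer="adam",
              metrics=["accuracy"])

ckpt = tf.keras.callbacks.ModelCheckpoint(
    "best.h5", monitor="val_accuracy", save_best_only=True, mode="max")
model.fit(x, y, epochs=5, validation_split=0.25,
          callbacks=[ckpt], verbose=0)

model.save("last.h5")  # weights from the LAST epoch

best = tf.keras.models.load_model("best.h5")  # weights from the BEST epoch
last = tf.keras.models.load_model("last.h5")

# "last" reproduces the in-memory model; "best" may not.
print(np.allclose(model.predict(x, verbose=0), last.predict(x, verbose=0)))
```

So if you evaluate the checkpoint file against the model you still have in memory after training, a mismatch is expected whenever the best epoch was not the last one.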

Skeleton (see the sections below):

#{model}
{model = Sequential()}
{Your model}

#{model.compile}
#{model.compile()}
#{model.fit()}
#{model.save('your_name.h5')}

My example is below:

# model
from keras.models import Sequential, model_from_json
from keras.layers import Conv2D, Flatten, Dense, Dropout
from keras.callbacks import EarlyStopping, ReduceLROnPlateau
from keras import optimizers
input_shape=(100,50,layers)
model = Sequential()
activation = 'relu'
model.add(Conv2D(filters=16, kernel_size=(3,3),data_format='channels_last', input_shape=input_shape, padding='same', activation=activation))
model.add(Flatten())
model.add(Dense(256, activation=activation))
model.add(Dropout(0.2))
model.add(Dense(n_levels, activation='softmax'))
# model.to_json() # if you want to save the model structure only (no weights)

# or (if you have saved model in JSON)
#with open(os.path.join(models_wd,model_name), 'r') as model_file:
#    model = model_from_json(model_file.read())
#    model_file.close()

#{model.compile}
adam = optimizers.Adam()
early_stopping = EarlyStopping(monitor='val_loss', min_delta=0.001, patience=12, verbose=1, mode='auto', baseline=None, restore_best_weights=False)
learning_rate_reduction = ReduceLROnPlateau(monitor='val_loss', factor=0.1, patience=5, verbose=1, mode='auto', cooldown=0, min_lr=0.00001)
model.compile(loss='categorical_crossentropy', metrics=['accuracy'], optimizer=adam)
#{model.summary}
model.summary()
#{model.fit}
history = model.fit(xtr, ytr, batch_size=8, epochs=n_epochs, verbose=2, validation_data=(xest,yest), callbacks=[learning_rate_reduction, early_stopping])

#{model.save('your_name.h5')}               
model.save('your_name.h5')

Then in your_name.h5 you will have the same weights as at the last epoch, which will give the same results after loading the model.
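If you would rather keep the best-epoch weights instead, another option (a sketch, again assuming tf.keras; names and toy data are illustrative) is restore_best_weights=True on EarlyStopping, so the model left in memory after fit() already carries the best weights when you call model.save():

```python
# Sketch (tf.keras assumed): with restore_best_weights=True the in-memory
# model after fit() holds the best-epoch weights, so model.save() agrees
# with a best-only checkpoint. Note: older TF versions only restore the
# best weights if early stopping actually triggered.
import numpy as np
import tensorflow as tf

np.random.seed(1)
tf.random.set_seed(1)

x = np.random.rand(64, 4).astype("float32")
y = np.random.randint(0, 2, size=(64,))

model = tf.keras.Sequential([
    tf.keras.Input(shape=(4,)),
    tf.keras.layers.Dense(8, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(loss="binary_crossentropy", optimizer="adam",
              metrics=["accuracy"])

es = tf.keras.callbacks.EarlyStopping(monitor="val_loss", patience=3,
                                      restore_best_weights=True)
model.fit(x, y, epochs=10, validation_split=0.25,
          callbacks=[es], verbose=0)

model.save("best_restored.h5")
reloaded = tf.keras.models.load_model("best_restored.h5")

# The saved file round-trips: the reloaded model matches the one in memory.
print(np.allclose(model.predict(x, verbose=0),
                  reloaded.predict(x, verbose=0)))
```

Either way, the key is to make sure the weights you save are the same weights you compare against after loading.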
