I am using Keras and trying to do hyperparameter tuning following the example explained at https://www.tensorflow.org/tutorials/keras/keras_tuner. The code runs fine the first time, but when I try to launch it a second and third time I run into a problem.
tuner.search(X_train, Y_train, epochs=50, validation_split=0.2, callbacks=[stop_early])
# Get the optimal hyperparameters
best_hps=tuner.get_best_hyperparameters(num_trials=1)[0]
print(f"""
The hyperparameter search is complete. The optimal number of units in the first densely-connected
layer is {best_hps.get('units')} and the optimal learning rate for the optimizer
is {best_hps.get('learning_rate')}.
""")在第二次执行代码之后,不要启动并显示上一次的结果。
INFO:tensorflow:Oracle triggered exit
The hyperparameter search is complete. The optimal number of units in the first densely-connected
layer is 128 and the optimal learning rate for the optimizer
is 0.001.

Any idea how to solve this?
Posted on 2022-05-28 15:28:37
Keras Tuner saves checkpoints to a directory (on GCS or a local dir) so that a search can be resumed later. Since your search already completed, running the search again does nothing. You must delete that directory first in order to restart the search.
In your example, before the tuner search you will have something like the following:
tuner = kt.Hyperband(model_builder,
                     objective='val_accuracy',
                     max_epochs=10,
                     factor=3,
                     directory='my_dir',
                     project_name='intro_to_kt')

This is the directory to delete.
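If you prefer to delete the saved search state programmatically before re-running, a minimal sketch (assuming the `directory` and `project_name` values shown above; Keras Tuner stores trials under `directory/project_name`) is:

```python
import os
import shutil

# Path where Keras Tuner stores its trial checkpoints: directory/project_name.
# Both names are taken from the Hyperband arguments above.
tuner_dir = os.path.join('my_dir', 'intro_to_kt')

# Remove the saved search state so the next tuner.search() starts from scratch.
if os.path.isdir(tuner_dir):
    shutil.rmtree(tuner_dir)
```

Run this before constructing the tuner; the next `tuner.search(...)` will then perform a fresh search instead of resuming the completed one.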
Alternatively, to have it deleted automatically the next time you launch, you can change that code to:
tuner = kt.Hyperband(model_builder,
                     objective='val_accuracy',
                     max_epochs=10,
                     factor=3,
                     directory='my_dir',
                     project_name='intro_to_kt',
                     # if True, overwrite above directory if search is run again - i.e. don't resume
                     overwrite=True)

https://stackoverflow.com/questions/71536602