This is the result I get when I run the code:
File "C:\Users\admin\anaconda3\envs\tensorflow_env\lib\site-packages\sagemaker\tuner.py", line 484, in _prepare_estimator_for_tuning
    estimator._prepare_for_training(job_name)
AttributeError: 'DeepAREstimator' object has no attribute '_prepare_for_training'
There seem to be very few examples on the internet of hyperparameter tuning for the Amazon SageMaker DeepAR algorithm. Can anyone help me solve this problem?
import mxnet as mx
import pandas as pd
import numpy as np
import matplotlib.pyplot as plt
from gluonts.model.deepar import DeepAREstimator
from gluonts.mx.trainer import Trainer
from gluonts.dataset.common import ListDataset
from itertools import islice
from gluonts.evaluation.backtest import make_evaluation_predictions
from sagemaker.tuner import HyperparameterTuner, IntegerParameter, CategoricalParameter, ContinuousParameter
df = pd.read_csv('final.csv', index_col=0,parse_dates=True)
training_data = ListDataset(
[{"start": df.index[0], "target": df.outbound_qty[:pd.to_datetime('2021-01-01')],
"feat_dynamic_real": [df.is_holiday[:pd.to_datetime('2021-01-01')],
df.is_salary[:pd.to_datetime('2021-01-01')],
df.count_qty[:pd.to_datetime('2021-01-01')],
df.shelf_qty[:pd.to_datetime('2021-01-01')]]
}],
freq="D"
)
estimator = DeepAREstimator(freq="D", prediction_length=7,
                            trainer=Trainer(ctx=mx.context.cpu()))
hyperparams = {'learning_rate': ContinuousParameter(0.001, 0.1),
               'epochs': IntegerParameter(10, 100),
               'context_length': IntegerParameter(7, 90),
               'mini_batch_size': IntegerParameter(32, 128)
               }
tuner = HyperparameterTuner(estimator=estimator,
                            objective_metric_name="test:RMSE",
                            objective_type='Minimize',
                            hyperparameter_ranges=hyperparams)
tuner.fit(inputs=training_data)
Asked on 2021-05-26 17:10:00
You are mixing two different versions of DeepAR, and that is why you get this error. DeepAR (Salinas et al.) is actually implemented in three places:
- in GluonTS, as the DeepAREstimator your code instantiates;
- as a SageMaker built-in algorithm, which is what sagemaker.tuner.HyperparameterTuner expects to work with;
- in Amazon Forecast.
There is no evidence that these three DeepAR implementations share anything beyond coming from Amazon Web Services; their codebases are likely different.
To run hyperparameter tuning with DeepAR, you have a few options:
- use the SageMaker built-in DeepAR algorithm, which is the kind of estimator HyperparameterTuner is designed for;
- use DeepAR+ in Amazon Forecast, which has its own hyperparameter optimization option;
- keep the GluonTS DeepAREstimator and write your own tuning loop, since GluonTS does not ship a tuner.
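If you stay with GluonTS, the tuning loop is something you write yourself. Below is a minimal random-search sketch over the same ranges as in the question. The `train_and_score` argument is a hypothetical placeholder, not a GluonTS or SageMaker API: in real use it would build a DeepAREstimator from the sampled configuration, train it, backtest with make_evaluation_predictions, and return the RMSE.

```python
import random

# Search space mirroring the ranges from the question.
SPACE = {
    "learning_rate": (0.001, 0.1),   # continuous
    "epochs": (10, 100),             # integer
    "context_length": (7, 90),       # integer
    "mini_batch_size": (32, 128),    # integer
}

def sample(space, rng):
    """Draw one configuration: uniform floats for continuous ranges,
    uniform integers (inclusive) for integer ranges."""
    cfg = {}
    for name, (lo, hi) in space.items():
        if isinstance(lo, float):
            cfg[name] = rng.uniform(lo, hi)
        else:
            cfg[name] = rng.randint(lo, hi)
    return cfg

def random_search(train_and_score, n_trials=20, seed=0):
    """Try n_trials random configurations and return (best_rmse, best_cfg).

    train_and_score(cfg) is a placeholder: build a DeepAREstimator with
    cfg, train it, backtest, and return the resulting RMSE.
    """
    rng = random.Random(seed)
    best_rmse, best_cfg = float("inf"), None
    for _ in range(n_trials):
        cfg = sample(SPACE, rng)
        rmse = train_and_score(cfg)
        if rmse < best_rmse:
            best_rmse, best_cfg = rmse, cfg
    return best_rmse, best_cfg

if __name__ == "__main__":
    # Dummy objective so the sketch runs standalone; in practice this
    # would be the train-and-backtest function described above.
    dummy = lambda cfg: abs(cfg["learning_rate"] - 0.01) + 1.0 / cfg["context_length"]
    print(random_search(dummy, n_trials=50))
```

This keeps training on your own machine (no SageMaker training jobs); for a smarter search than random sampling, a library such as Optuna can replace the loop while the train-and-score function stays the same.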
https://stackoverflow.com/questions/67698061