I'm new to optimization and have been trying to learn random search. After studying some theory, I tried to implement it with an MLPClassifier I had worked with before:
def hyperparameter_tune(clf, parameters, iterations, X, y):
    randomSearch = RandomizedSearchCV(clf, param_distributions=parameters, n_jobs=-1, n_iter=iterations, cv=6)
    randomSearch.fit(X, y)
    params = randomSearch.best_params_
    score = randomSearch.best_score_
    return params, score
This function returns the best parameter set and the best score after performing a random search on the given dataset:
parameters = {
'nohn': [150,200,250,300],
'solver': ['sgd', 'adam', 'lbfgs'],
'activation': ['relu', 'tanh']
}
clf = MLPClassifier(batch_size=256, verbose=True, early_stopping=True)
parameters_after_tuning, score_after_tuning = hyperparameter_tune(MLPClassifier, parameters, 20, X_train_pca, y);
print(score)
At first, I only wanted to optimize the number of hidden neurons, the solver, and the activation function of the MLPClassifier, so I gave the other parameters (such as batch_size) fixed values when creating the classifier. But when I pass the classifier to the hyperparameter_tune function, I get the following error:
---------------------------------------------------------------------------
TypeError Traceback (most recent call last)
<ipython-input-18-a0c800c38881> in <module>()
1 clf = MLPClassifier()
----> 2 parameters_after_tuning, score_after_tuning = tuning(MLPClassifier, parameters, 20, X_train_pca, y);
3 print(score)
/usr/local/lib/python3.6/dist-packages/sklearn/base.py in clone(estimator, safe)
65 "it does not seem to be a scikit-learn estimator "
66 "as it does not implement a 'get_params' methods."
---> 67 % (repr(estimator), type(estimator)))
68 klass = estimator.__class__
69 new_object_params = estimator.get_params(deep=False)
TypeError: Cannot clone object '<class 'sklearn.neural_network._multilayer_perceptron.MLPClassifier'>' (type <class 'abc.ABCMeta'>): it does not seem to be a scikit-learn estimator as it does not implement a 'get_params' methods.
Can anyone help me? There are probably many things in my code that could be done better, and I'd love to hear any suggestions.
There is more than one bug in your code:

- You are passing the MLPClassifier class to hyperparameter_tune instead of the instance clf.
- MLPClassifier has no nohn parameter; the number of hidden neurons is set via hidden_layer_sizes.
- You are printing score, which does not exist in that scope (your result is stored in score_after_tuning).

Here is a minimal running example:
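(The answer's original example was not preserved in this copy; the following is a sketch of what a corrected version could look like. Synthetic data from make_classification stands in for X_train_pca and y, and hidden_layer_sizes replaces the nonexistent nohn parameter.)

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import RandomizedSearchCV
from sklearn.neural_network import MLPClassifier

def hyperparameter_tune(clf, parameters, iterations, X, y):
    # Pass the estimator *instance*, not the class, so clone() works.
    random_search = RandomizedSearchCV(
        clf, param_distributions=parameters, n_jobs=-1,
        n_iter=iterations, cv=3, random_state=0,
    )
    random_search.fit(X, y)
    return random_search.best_params_, random_search.best_score_

# 'hidden_layer_sizes' is the real MLP parameter ('nohn' does not exist).
parameters = {
    'hidden_layer_sizes': [(50,), (100,)],
    'solver': ['adam', 'lbfgs'],
    'activation': ['relu', 'tanh'],
}

# Small synthetic dataset so the example runs quickly.
X, y = make_classification(n_samples=200, n_features=10, random_state=0)
clf = MLPClassifier(max_iter=500, random_state=0)

best_params, best_score = hyperparameter_tune(clf, parameters, 4, X, y)
print(best_params, best_score)  # print the values actually returned
```

Note that the score printed at the end is the variable returned by the function, not a bare score name, which avoids the third bug above.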