I have written my own custom classifier that bins the dependent variable. Here is the code:
from sklearn.base import BaseEstimator, ClassifierMixin
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.preprocessing import KBinsDiscretizer

class OwnClassifier(BaseEstimator, ClassifierMixin):
    def __init__(self, estimator=None):
        self.yt = None
        if estimator is None:
            estimator = LogisticRegression(solver='liblinear')
        self.estimator = estimator
        self.discr = KBinsDiscretizer(n_bins=4, encode='ordinal')

    def fit(self, X, y):
        self.yt = y.copy()
        self.yt = self.discr.fit_transform(self.yt.reshape(-1, 1)).astype(int)
        self.estimator.fit(X, self.yt.ravel())
        return self

    def predict(self, X):
        return self.estimator.predict(X)

    def predict_proba(self, X):
        return self.estimator.predict_proba(X)

    def score(self, X, y=None):
        return accuracy_score(self.yt, self.predict(X))
Using GridSearchCV on it throws an error:
grid = [{'estimator__C': [1, 10, 100, 1000]}]
myLogi = OwnClassifier()
gridCv = GridSearchCV(myLogi, grid)
gridCv.fit(X, y)
How can I make the classifier compatible with GridSearchCV?
I am using the Boston housing data:
boston_data = load_boston()
X = boston_data['data']
y = boston_data['target']
The error:
ValueError: Found input variables with inconsistent numbers of samples: [404, 102]
The problem is in the score method: you force it to always compute accuracy against the stored training targets self.yt, which is why the traceback reports inconsistent sample counts (404 training rows vs. 102 validation rows). Instead, score should discretize the y it receives and compare that against the predictions for the matching X.
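The corrected code the answer refers to did not survive in this copy, so here is a minimal sketch of the fix it describes: score discretizes the y it is given instead of reusing self.yt. The default lbfgs solver and the synthetic data below are my substitutions, since load_boston and liblinear's multiclass behaviour have changed across recent scikit-learn versions:

```python
import numpy as np
from sklearn.base import BaseEstimator, ClassifierMixin
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import GridSearchCV
from sklearn.preprocessing import KBinsDiscretizer

class OwnClassifier(BaseEstimator, ClassifierMixin):
    def __init__(self, estimator=None):
        if estimator is None:
            estimator = LogisticRegression(max_iter=1000)
        self.estimator = estimator
        self.discr = KBinsDiscretizer(n_bins=4, encode='ordinal')

    def fit(self, X, y):
        # Fit the discretizer on the training targets and train on the bins
        yt = self.discr.fit_transform(np.asarray(y).reshape(-1, 1)).astype(int)
        self.estimator.fit(X, yt.ravel())
        return self

    def predict(self, X):
        return self.estimator.predict(X)

    def predict_proba(self, X):
        return self.estimator.predict_proba(X)

    def score(self, X, y):
        # Discretize the y that belongs to THIS X instead of reusing the
        # stored training targets -- that reuse caused the shape mismatch.
        yt = self.discr.transform(np.asarray(y).reshape(-1, 1)).astype(int)
        return accuracy_score(yt.ravel(), self.predict(X))

# Synthetic regression-style data stands in for the Boston housing set
rng = np.random.RandomState(0)
X = rng.rand(300, 5)
y = X @ np.arange(1.0, 6.0) + 0.1 * rng.rand(300)

grid = [{'estimator__C': [1, 10, 100, 1000]}]
gridCv = GridSearchCV(OwnClassifier(), grid, cv=3)
gridCv.fit(X, y)
print(gridCv.best_params_)
```

Because score now transforms whatever targets it receives, GridSearchCV can evaluate each validation fold (102 samples) against predictions for that same fold, and the sample counts always agree.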