GridSearchCV has no attribute grid.grid_scores_
In the latest scikit-learn library, grid_scores_ has been deprecated and replaced with cv_results_.
cv_results_ gives detailed results of the grid-search run.
grid.cv_results_.keys()
Output: dict_keys(['mean_fit_time', 'std_fit_time', 'mean_score_time', 'std_score_time', 'param_n_estimators', 'params', 'split0_test_score',
'split1_test_score', 'split2_test_score', 'split3_test_score', 'split4_test_score',
'mean_test_score', 'std_test_score', 'rank_test_score'])
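Since cv_results_ is a plain dict of parallel arrays, it also loads directly into a pandas DataFrame for easier inspection. A minimal sketch, assuming pandas is installed, using the score values shown in this answer:

```python
import pandas as pd

# cv_results_ values from the grid-search run in this answer, as a plain dict
cv_results = {
    'param_n_estimators': [20, 30, 40, 50, 60, 70, 80],
    'mean_test_score': [0.833, 0.83, 0.83, 0.837, 0.838, 0.8381, 0.83],
    'std_test_score': [0.011, 0.009, 0.010, 0.0106, 0.010, 0.0102, 0.0099],
}
df = pd.DataFrame(cv_results)

# sort by mean score to surface the best setting first
print(df.sort_values('mean_test_score', ascending=False).head(3))
```

In a real run you would pass grid.cv_results_ to pd.DataFrame directly instead of building the dict by hand.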
cv_results_ gives more detailed output than grid_scores_ did. The result is a dictionary, and we can extract the relevant metrics by iterating over its keys. Below is an example from a grid search run with cv=5:
for i in ['mean_test_score', 'std_test_score', 'param_n_estimators']:
    print(i, " : ", grid.cv_results_[i])
Output: mean_test_score : [0.833 0.83 0.83 0.837 0.838 0.8381 0.83]
std_test_score : [0.011 0.009 0.010 0.0106 0.010 0.0102 0.0099]
param_n_estimators : [20 30 40 50 60 70 80]
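Applied to the loop in the question, the same idea recovers the old grid_scores_ output by zipping the parallel arrays in cv_results_. A minimal runnable sketch: the asker's mv_clf, X_train, and y_train aren't shown, so RandomForestClassifier and make_classification below are stand-ins.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV

# toy stand-ins for the asker's X_train, y_train and mv_clf
X, y = make_classification(n_samples=200, random_state=0)

grid = GridSearchCV(
    estimator=RandomForestClassifier(random_state=0),
    param_grid={'n_estimators': [20, 30]},
    cv=5,
    scoring='roc_auc',
)
grid.fit(X, y)

# cv_results_ stores parallel arrays; zip them to rebuild the old
# (params, mean_score, scores) triples that grid_scores_ used to yield
for mean, std, params in zip(grid.cv_results_['mean_test_score'],
                             grid.cv_results_['std_test_score'],
                             grid.cv_results_['params']):
    print("%0.3f+/-%0.2f %r" % (mean, std / 2, params))
```

The print format matches the one in the question, so only the loop header needs to change.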
Author: Admin
Updated on June 16, 2022

Comments
-
Admin almost 2 years
tried grid.cv_results_, didn't correct the problem
from sklearn.model_selection import GridSearchCV

params = {
    'decisiontreeclassifier__max_depth': [1, 2],
    'pipeline-1__clf__C': [0.001, 0.1, 100.0]
}
grid = GridSearchCV(estimator=mv_clf, param_grid=params, cv=10, scoring='roc_auc')
grid.fit(X_train, y_train)

for params, mean_score, scores in grid.grid_scores_:
    print("%0.3f+/-%0.2f %r" % (mean_score, scores.std() / 2, params))
# AttributeError: 'GridSearchCV' object has no attribute 'grid_scores_'
tried replacing
grid.grid_scores_
with
grid.cv_results_
The objective is to print the different hyperparameter value combinations and the average ROC AUC scores computed via the 10-fold cross validation.

from sklearn.model_selection import GridSearchCV

params = {
    'decisiontreeclassifier__max_depth': [1, 2],
    'pipeline-1__clf__C': [0.001, 0.1, 100.0]
}
grid = GridSearchCV(estimator=mv_clf, param_grid=params, cv=10, scoring='roc_auc')
grid.fit(X_train, y_train)

for params, mean_score, scores in grid.grid_scores_:
    print("%0.3f+/-%0.2f %r" % (mean_score, scores.std() / 2, params))
# AttributeError: 'GridSearchCV' object has no attribute 'grid_scores_'
-
Rohan Kumar almost 4 years
the answer given above & the answer on this link stackoverflow.com/a/59496696/11134789 really helped
-
raquelhortab over 2 years
is it possible to make it include the validation scores for each epoch as well? I cannot find a way. I'm trying to plot validation vs training histories.