GridSearchCV returns a different value

I split my dataset into two parts: a training set and a test set. For now, forget the test set; I use the training set with the GridSearchCV function from the sklearn.model_selection package to search for the best parameters for an SVM:



from sklearn import svm
from sklearn.model_selection import GridSearchCV

Cs = [0.001, 0.01, 0.1, 1, 10, 100, 1000]
gammas = [0.001, 0.01, 0.1, 1]

# Set the parameters by cross-validation
param_grid = {'kernel': ['rbf'], 'gamma': gammas, 'C': Cs}

clf = GridSearchCV(svm.SVC(), param_grid=param_grid, cv=nfolds, verbose=1)
clf.fit(x_train, labels)


After finding the best C and gamma parameters, I create an SVC and fit it with the training set (the same one used above to search for the best C and gamma):



model = svm.SVC(kernel='rbf', C=clf.best_params_['C'], gamma=clf.best_params_['gamma'])
model.fit(x_train, y_train)


At this point I tried one thing: I used the predict() function of the GridSearchCV object and the one of the svm.SVC object:



predicted_label1 = model.predict(x_test)
predicted_label2 = clf.predict(x_test)


Then I used classification_report(y_test, predicted_label) to evaluate my two predicted_label vectors. I expected to obtain the same values, but that is not what happens. Here is my output:



             precision    recall  f1-score   support

        0.0       0.24      0.97      0.39       357
        1.0       0.00      0.00      0.00       358
        2.0       0.00      0.00      0.00       357
        3.0       0.00      0.00      0.00       357

avg / total       0.06      0.24      0.10      1429

parameter search finished
training set and test set saved
Create SVM classifier

             precision    recall  f1-score   support

        0.0       0.70      0.63      0.66       357
        1.0       0.89      0.90      0.90       358
        2.0       0.89      0.94      0.91       357
        3.0       0.85      0.88      0.86       357

avg / total       0.83      0.84      0.83      1429


The first report is from the GridSearchCV and the second from the SVM.
Is this normal?
What does GridSearchCV return from predict()? Does it fit a model on the training set that was passed to it?
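
For reference, GridSearchCV is constructed with refit=True by default: after the search it refits a fresh estimator with the best found parameters on the entire data passed to fit(), and its predict() delegates to that refit model, exposed as best_estimator_. So the two reports above should only be able to differ if the fitted data differs, for example if labels is not the same vector as y_train. A minimal sketch of the equivalence I would expect, reusing the names from the code above:

# With refit=True (the default), clf.predict uses clf.best_estimator_,
# an SVC refit on all of (x_train, labels) with the best parameters.
manual = svm.SVC(kernel='rbf',
                 C=clf.best_params_['C'],
                 gamma=clf.best_params_['gamma'])
manual.fit(x_train, labels)  # the same data the search was fit on

# should hold whenever labels and y_train are the same vector
assert (manual.predict(x_test) == clf.predict(x_test)).all()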

python machine-learning scikit-learn svm grid-search

asked Nov 15 '18 at 21:09 by BossShell
edited Nov 15 '18 at 21:12

  • Should be the same... But you made lots of edits to your question, so are you sure the training data is the same between the CV and the final SVC? (train_set vs x/y_train?)

    – Matthieu Brucher
    Nov 15 '18 at 21:12

  • Yes, it is the same.

    – BossShell
    Nov 15 '18 at 21:13

  • I fixed the text. What do you think I'm doing wrong?

    – BossShell
    Nov 15 '18 at 21:16

  • Can you add some data?

    – seralou
    Nov 15 '18 at 23:26

  • In GridSearchCV you used x_train, labels and in SVC you used x_train, y_train. Are you sure they are exactly the same? Can you make a simple reproducible example with some data, if possible? I am getting the same scores on sklearn's built-in datasets.

    – Vivek Kumar
    Nov 16 '18 at 6:43
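
Following up on the last comment, here is a self-contained check on a built-in dataset (a sketch using iris with a fixed split; the variable names mirror the question) where the grid search and the manually refit SVC should produce identical predictions, because both are fit on the same labels:

from sklearn import svm, datasets
from sklearn.model_selection import GridSearchCV, train_test_split

x, y = datasets.load_iris(return_X_y=True)
x_train, x_test, y_train, y_test = train_test_split(x, y, random_state=0)

param_grid = {'kernel': ['rbf'],
              'gamma': [0.001, 0.01, 0.1, 1],
              'C': [0.001, 0.01, 0.1, 1, 10, 100, 1000]}
clf = GridSearchCV(svm.SVC(), param_grid=param_grid, cv=5)
clf.fit(x_train, y_train)  # same labels used everywhere

model = svm.SVC(kernel='rbf', C=clf.best_params_['C'],
                gamma=clf.best_params_['gamma'])
model.fit(x_train, y_train)

# element-wise comparison of the two prediction vectors
print((model.predict(x_test) == clf.predict(x_test)).all())  # expected: True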