Contour plot for regression prediction with fixed input variables

























I want to create a contour plot of a prediction from a model with multiple features. The remaining features should be held fixed so the two interesting ones can be plotted. Unfortunately, the resulting matrix has the same value at every position instead of the expected variation.



I think something is wrong with my matrices, but I can't find the error.



[...]
f_learn = [x_1, x_2, x_3, x_4]
r_learn = [r_1]

clf = neural_network.MLPRegressor(...)  # MLPRegressor lives in sklearn.neural_network, not sklearn.svm
clf.fit(f_learn, r_learn)
[...]

x_1 = np.linspace(1, 100, 100)
x_2 = np.linspace(1, 100, 100)
X_1, X_2 = np.meshgrid(x_1, x_2)

x_3 = np.full( (100,100), 5).ravel()
x_4 = np.full( (100,100), 15).ravel()

predict_matrix = np.vstack([X_1.ravel(), X_2.ravel(), x_3,x_4])
prediction = clf.predict(predict_matrix.T)

prediction_plot = prediction.reshape(X_1.shape)

plt.figure()
cp = plt.contourf(X_1, X_2, prediction_plot, 10)
plt.colorbar(cp)
plt.show()


If I test the matrix row by row by hand, I get the right results. However, it doesn't work when I put the rows together this way.



Edit: I made an error copying the code.



Example with data. All predictions are 7.5 instead of different values ;(



import matplotlib.pyplot as plt
import numpy as np
from sklearn import linear_model

f_learn = np.array([[1, 2, 3, 4], [1, 2, 3, 4], [1, 2, 3, 4], [1, 2, 3, 4]])
r_learn = np.array([6,7,8,9])

reg = linear_model.LinearRegression()
reg.fit(f_learn, r_learn)

x_1 = np.linspace(0, 20, 10)
x_2 = np.linspace(0, 20, 10)
X_1, X_2 = np.meshgrid(x_1, x_2)

x_3 = np.full( (10,10), 5).ravel()
x_4 = np.full( (10,10), 2).ravel()

predict_matrix = np.vstack([X_1.ravel(), X_2.ravel(), x_3, x_4])
prediction = reg.predict(predict_matrix.T)

prediction_plot = prediction.reshape(X_1.shape)

plt.figure()
cp = plt.contourf(X_1, X_2, prediction_plot, 10)
plt.colorbar(cp)
plt.show()
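To confirm the matrix construction itself, a hand-built sample can be compared against the batched prediction. A minimal check of my own, appended to the listing above (row 0 is an arbitrary choice):

row = np.array([[x_1[0], x_2[0], 5, 2]])    # one hand-built sample, shape (1, 4)
print(reg.predict(row)[0], prediction[0])   # equal values -> the batched rows are laid out correctly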


Result: [image of the contour plot; every cell is 7.5]










Tags: python, matplotlib, machine-learning, scikit-learn

asked Nov 14 '18 at 9:31 by Neithard; edited Nov 14 '18 at 14:10
          3 Answers
          In your toy data, there are four examples with the same feature values but different labels. LinearRegression doesn't learn anything from them. You can check this with:



          >>> reg.coef_
          [0. 0. 0. 0.]


          Maybe this is also the case in your real data: the features x_1 and x_2 don't matter. Check reg.coef_ to see whether the coefficients for x_1 and x_2 are too small.
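
          To make the failure concrete, here is a minimal check of my own, reusing the toy data from the question. With all-zero coefficients the model can only output its intercept, which is the mean of the labels, (6 + 7 + 8 + 9) / 4 = 7.5, exactly the constant value reported:

          import numpy as np
          from sklearn import linear_model

          # the question's toy data: four identical feature rows with different labels
          f_learn = np.array([[1, 2, 3, 4]] * 4)
          r_learn = np.array([6, 7, 8, 9])

          reg = linear_model.LinearRegression().fit(f_learn, r_learn)
          print(reg.coef_)       # [0. 0. 0. 0.] -> the features carry no information
          print(reg.intercept_)  # 7.5 -> the label mean, the only value the model can predict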



          I changed the toy data, and the plot works:



          import matplotlib.pyplot as plt
          import numpy as np
          from sklearn import linear_model

          # f_learn = np.array([[1, 2, 3, 4], [1, 2, 3, 4], [1, 2, 3, 4], [1, 2, 3, 4]])
          # r_learn = np.array([6,7,8,9])
          # informative toy data instead: varying features with noise and a known linear target
          f_learn = np.arange(20.).reshape(5, 4)
          f_learn += np.random.randn(5, 4)
          r_learn = f_learn[:, 0] + 2 * f_learn[:, 1] + 3 * f_learn[:, 2] + 4 * f_learn[:, 3]

          reg = linear_model.LinearRegression()
          reg.fit(f_learn, r_learn)
          print(reg.coef_)

          x_1 = np.linspace(0, 20, 10)
          x_2 = np.linspace(0, 20, 10)
          X_1, X_2 = np.meshgrid(x_1, x_2)

          x_3 = np.full( (10,10), 5).ravel()
          x_4 = np.full( (10,10), 2).ravel()

          predict_matrix = np.vstack([X_1.ravel(), X_2.ravel(), x_3, x_4])
          prediction = reg.predict(predict_matrix.T)

          prediction_plot = prediction.reshape(X_1.shape)

          plt.figure()
          cp = plt.contourf(X_1, X_2, prediction_plot, 10)
          plt.colorbar(cp)
          plt.show()


          [image: contour plot of the predictions]






          answered Nov 14 '18 at 15:01 by Tomáš Přinda
          • Thank you! I will test this solution. You were very helpful; I can already rule out this mistake.

            – Neithard
            Nov 15 '18 at 9:19
































          Try something like this; see the comments in the code.



          x_1 = np.linspace(1, 100, 100)
          x_2 = np.linspace(1, 100, 100)
          X_1, X_2 = np.meshgrid(x_1, x_2)

          # why was the shape (1000, 100)? it must match the grid: (100, 100)
          x_3 = np.full((100, 100), 5).ravel()
          x_4 = np.full((100, 100), 15).ravel()

          # use X_1.ravel() / X_2.ravel() to flatten each grid into one feature column;
          # the original also had x_3 where x_4 should be
          predict_matrix = np.vstack([X_1.ravel(), X_2.ravel(), x_3, x_4])
          prediction = clf.predict(predict_matrix.T)

          prediction_plot = prediction.reshape(X_1.shape)

          plt.figure()
          cp = plt.contourf(X_1, X_2, prediction_plot, 10)
          plt.colorbar(cp)
          plt.show()
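
          As a side note (an alternative of my own, not from the answer), the transpose can be avoided by stacking the flattened features as columns directly, which makes the samples-by-features orientation explicit:

          # equivalent construction in (n_samples, n_features) orientation
          predict_matrix = np.column_stack([X_1.ravel(), X_2.ravel(), x_3, x_4])
          prediction = clf.predict(predict_matrix)  # no .T needed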





          answered Nov 14 '18 at 11:36 by Tomáš Přinda

          • Hi, sorry, I forgot to add ravel() when I copied that part from my code. I replaced all the variable names with x_ in the hope of making it more understandable. I edited my question; unfortunately this is not the solution, and the prediction is still not correct.

            – Neithard
            Nov 14 '18 at 11:44











          • Can you provide some toy data in the example?

            – Tomáš Přinda
            Nov 14 '18 at 13:12











          • Added an example with toy data; I'm not the best question writer.

            – Neithard
            Nov 14 '18 at 14:12
































          The following code should give you the contour plot you want.



          import matplotlib.pyplot as plt
          import numpy as np
          from sklearn import linear_model
          from sklearn.datasets import make_regression

          # 20 samples, 4 standard-normal features with a random linear target
          f_learn, r_learn = make_regression(20, 4)

          reg = linear_model.LinearRegression()
          reg.fit(f_learn, r_learn)

          x_1 = np.linspace(-2, 2, 10)
          x_2 = np.linspace(-2, 2, 10)
          X_1, X_2 = np.meshgrid(x_1, x_2)


          x_3 = np.full( (10,10), 0.33).ravel()
          x_4 = np.full( (10,10), 0.99).ravel()

          predict_matrix = np.vstack([X_1.ravel(), X_2.ravel(), x_3, x_4])
          prediction = reg.predict(predict_matrix.T)

          prediction_plot = prediction.reshape(X_1.shape)

          plt.figure()
          cp = plt.contourf(X_1, X_2, prediction_plot, 10)
          plt.colorbar(cp)
          plt.show()
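
          A possible refinement (my suggestion, not part of the answer): instead of hard-coding 0.33 and 0.99, hold the fixed features at their training means so the plotted slice passes through the center of the data:

          # hold x_3 and x_4 at their training-set means
          x_3 = np.full((10, 10), f_learn[:, 2].mean()).ravel()
          x_4 = np.full((10, 10), f_learn[:, 3].mean()).ravel()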





          answered Nov 14 '18 at 14:59 by sukhbinder

          • Thank you for your help, even though this was not the best question!

            – Neithard
            Nov 15 '18 at 9:20









