Support Vector Regression

By Renuka Joshi

Hello all! We have learnt about three different forms of regression in the previous tutorials. Now we will learn yet another interesting form of regression: Support Vector Regression (SVR).

Support Vector Regression:

We will perform SVR on the same example that we used in polynomial regression. Before that, we need to apply feature scaling to our dataset, because the SVR class does not apply feature scaling by default. Feature scaling transforms variables so that their values fall in the same range, so that no variable dominates the others.

  • dataset.iloc[:, 1:2].values and dataset.iloc[:, 2:3].values create X and y as matrices (2-D arrays) instead of vectors.
  • In our example, the value of y increases steeply as x increases, so the values in y dominate the values in x; hence we need to scale the data before applying SVR.
  • StandardScaler is the class used for feature scaling from the sklearn library.
  • sc_X.fit_transform(X) fits the scaler to X and transforms X into scaled data.
  • sc_y.fit_transform(y) does the same for y.
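The steps above can be sketched as follows. The original post loads Position_Salaries.csv from the polynomial regression tutorial; here an illustrative stand-in with the same shape (ten position levels with steeply rising salaries) is built inline so the snippet runs standalone.

```python
import pandas as pd
from sklearn.preprocessing import StandardScaler

# Illustrative stand-in for the tutorial's Position_Salaries data:
# ten position levels with steeply rising salaries, so y dwarfs x
dataset = pd.DataFrame({
    'Position': ['L%d' % i for i in range(1, 11)],
    'Level': range(1, 11),
    'Salary': [45000, 50000, 60000, 80000, 110000, 150000,
               200000, 300000, 500000, 1000000],
})

# The 1:2 and 2:3 slices keep X and y as matrices (2-D arrays), not vectors
X = dataset.iloc[:, 1:2].values
y = dataset.iloc[:, 2:3].values

# Separate scalers for X and y; each is kept so the scaling can be undone later
sc_X = StandardScaler()
sc_y = StandardScaler()
X = sc_X.fit_transform(X)
y = sc_y.fit_transform(y)
```

StandardScaler centers each column at 0 and rescales it to unit variance, so after this step neither variable dominates the other.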

 

The scaled data is shown below; both columns are now centered at 0 with unit variance.

 


After scaling the data, we fit it to SVR as follows:

Here,

  • Import the SVR class from the sklearn library.
  • Create a regressor object of the SVR class with kernel 'rbf', the radial basis function (Gaussian) kernel.
  • Fit the scaled X and y to the regressor object.
  • y_pred will be the predicted salary for a newly joined employee at level 8.3.
  • np.array creates a 2-D array consisting of a single column, which is the shape the model expects as input.
  • sc_y.inverse_transform converts the scaled salary value back into an actual salary.
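A minimal sketch of these fitting and prediction steps, repeating the scaling setup (with an illustrative inline stand-in for the tutorial's Position_Salaries data) so that it runs standalone; with the stand-in numbers the exact prediction will differ from the figure quoted later in the article.

```python
import numpy as np
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

# Illustrative stand-in for the tutorial's Position_Salaries data
dataset = pd.DataFrame({
    'Position': ['L%d' % i for i in range(1, 11)],
    'Level': range(1, 11),
    'Salary': [45000, 50000, 60000, 80000, 110000, 150000,
               200000, 300000, 500000, 1000000],
})
X = dataset.iloc[:, 1:2].values
y = dataset.iloc[:, 2:3].values

# Scale X and y separately, keeping the scalers to undo the scaling later
sc_X = StandardScaler()
sc_y = StandardScaler()
X = sc_X.fit_transform(X)
y = sc_y.fit_transform(y)

# Fit SVR with the RBF (Gaussian) kernel
regressor = SVR(kernel='rbf')
regressor.fit(X, y.ravel())  # SVR expects a 1-D target

# Predict the salary at level 8.3: scale the input, predict on the
# scaled axis, then invert the scaling to get a salary in real units
level = sc_X.transform(np.array([[8.3]]))
y_pred = sc_y.inverse_transform(regressor.predict(level).reshape(-1, 1))
print(y_pred[0, 0])
```

Note that the level 8.3 must go through sc_X before prediction, and the prediction must come back through sc_y: the model only ever sees scaled values.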

Let us now plot the graph for Support Vector Regression.
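The plot can be produced along these lines (again with an illustrative inline stand-in for the tutorial's data, and rendered off-screen so the snippet runs anywhere); a fine grid of x values makes the fitted curve smooth.

```python
import numpy as np
import pandas as pd
import matplotlib
matplotlib.use('Agg')  # render off-screen; drop this line to view interactively
import matplotlib.pyplot as plt
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

# Illustrative stand-in for the tutorial's Position_Salaries data
dataset = pd.DataFrame({
    'Position': ['L%d' % i for i in range(1, 11)],
    'Level': range(1, 11),
    'Salary': [45000, 50000, 60000, 80000, 110000, 150000,
               200000, 300000, 500000, 1000000],
})
X = dataset.iloc[:, 1:2].values
y = dataset.iloc[:, 2:3].values
sc_X, sc_y = StandardScaler(), StandardScaler()
X = sc_X.fit_transform(X)
y = sc_y.fit_transform(y)

regressor = SVR(kernel='rbf')
regressor.fit(X, y.ravel())

# Scatter the observations and draw the fitted curve on a fine grid
# (both axes are in scaled units here)
X_grid = np.arange(X.min(), X.max(), 0.01).reshape(-1, 1)
plt.scatter(X, y, color='red')
plt.plot(X_grid, regressor.predict(X_grid), color='blue')
plt.title('Support Vector Regression')
plt.xlabel('Position level (scaled)')
plt.ylabel('Salary (scaled)')
plt.savefig('svr_plot.png')
```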

Note the graph shown beside; it plots the SVR predictions on the scaled data. The last observation (the CEO's salary) is effectively left unconsidered: it lies far from the other observations, and the model's penalty parameter caps the influence of points outside the epsilon-insensitive tube, so this extreme value barely pulls the fitted curve.

Note that the predicted salary is 203700, which is close to what the employee asked for, i.e. 190000.

I hope this article helped you understand Support Vector Regression.

