ML Algorithm Simplified — Simple Linear Regression

Suhas Mahindrakar
3 min read · Oct 10, 2020

Everyone knows what a ‘Straight Line’ is, so we are already half done.

[Figure: Straight Line]

Take an example: as we know, Ice Cream Sales increase as Temperature rises. That means there is some relationship between Ice Cream Sales and Temperature, and based on that relationship we can easily estimate Ice Cream Sales ( Y ) if we know the Temperature ( X ).

And if we plot a graph, we can see the trend.

[Figure: Ice Cream Sales vs Temperature]
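If you want to see such a trend yourself, here is a minimal sketch; the temperature and sales numbers below are made up for illustration, not data from this article.

```python
# A minimal sketch with made-up numbers to visualise the rising trend.
import matplotlib.pyplot as plt

temperature = [20, 22, 25, 27, 30, 32, 35]              # degrees Celsius (illustrative)
ice_cream_sales = [150, 180, 240, 260, 320, 350, 410]   # units sold (illustrative)

plt.scatter(temperature, ice_cream_sales)
plt.xlabel("Temperature")
plt.ylabel("Ice Cream Sales")
plt.title("Ice Cream Sales vs Temperature")
plt.show()
```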

Normally, we find 3 kinds of trends in the relationship between two variables: Positive, Negative and No Trend.

[Figure: Trend]

Our goal is to draw the ‘Best Fit Line’, a straight line that passes as close as possible to all the data points, so that the vertical distance between each data point and the line is minimum.

For this, we use a simple formula, Y = mX + b, where Y is the output variable and X is the input variable, m is the slope of the line and b is the intercept on the Y-axis.

[Figure: Formula to draw the Best Fit Line, Y = mX + b]
[Figure: The ‘Blue Line’ is the Best Fit Line]
[Figure: Formula to calculate the Slope of the Line ( m )]
[Figure: The Y-Intercept is the point at which the Straight Line crosses the Y-axis]
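As a quick illustration of the formula, here is a tiny sketch; the values of m and b below are assumptions made for the example, not numbers from this article.

```python
# A small sketch of Y = mX + b with assumed values for m and b.
m = 12.0   # slope: extra units sold per degree of temperature (assumed)
b = -90.0  # Y-intercept: predicted sales when temperature is 0 (assumed)

def predict(x):
    """Predict Y for a given X using the straight-line formula Y = mX + b."""
    return m * x + b

print(predict(30))  # predicted Ice Cream Sales at 30 degrees -> 270.0
```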
[Figure: Residuals]

Now, let’s see what residuals are. A residual is just the vertical distance between a data point and the straight line.

But what we want is the ‘Best Fit Line’, not just any straight line, and this is obtained by adjusting the slope and intercept of the line so that the sum of the squares of all residuals is as low as possible. In Statistics, this method is called the ‘Least Squares Method’.
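To make the idea concrete, here is a small from-scratch sketch (reusing the same made-up numbers as before) that computes m and b with the standard closed-form least-squares formulas and then the sum of squared residuals that those values minimise.

```python
# A from-scratch sketch of the Least Squares Method on made-up data.
temperature = [20, 22, 25, 27, 30, 32, 35]
ice_cream_sales = [150, 180, 240, 260, 320, 350, 410]

n = len(temperature)
x_mean = sum(temperature) / n
y_mean = sum(ice_cream_sales) / n

# Closed-form least-squares estimates:
#   m = sum((x - x_mean) * (y - y_mean)) / sum((x - x_mean)^2)
#   b = y_mean - m * x_mean
m = sum((x - x_mean) * (y - y_mean) for x, y in zip(temperature, ice_cream_sales)) \
    / sum((x - x_mean) ** 2 for x in temperature)
b = y_mean - m * x_mean

# A residual is the vertical distance between a data point and the line.
residuals = [y - (m * x + b) for x, y in zip(temperature, ice_cream_sales)]
sum_of_squared_residuals = sum(r ** 2 for r in residuals)

print("slope m:", m)
print("intercept b:", b)
print("sum of squared residuals:", sum_of_squared_residuals)
```

Any other choice of slope and intercept on this data would give a larger sum of squared residuals, which is exactly what makes this line the Best Fit Line.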

What we have learned so far is the basis of the ‘Linear Regression’ algorithm.

In Machine Learning, a model uses this ‘Linear Regression’ algorithm to predict values based on the input values we pass as observations. The model uses the algorithm to learn the relationship between the output ( Y ) and the input ( X ). That means that, initially, we pass the labels ( Y ) along with the input ( X ). Therefore, we call this ‘Supervised Learning’.

For practical implementation, we use the Scikit-Learn library in Python, which is not in the scope of this article. It will be covered in a separate article.

Thank You.
