Thursday, 9 August 2018

Ordinary Least Squares Regression in Data Analytics

Ordinary Least Squares (OLS)
  • Context:
           → Supervised Learning
  • Derivation of OLS
           → Fit a line of the form y = mx + c or y = b0 + b1x
           → Concept of actual y (yi) and estimated y (ŷi)
           → Minimize the squared deviation between the actual and estimated values (a small numerical example follows below).
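
A small numerical illustration (hypothetical numbers, not from the original post): with a fitted line ŷ = 1 + 1.5x, the point (xi, yi) = (2, 5) has estimate ŷi = 1 + 1.5·2 = 4, so its deviation is yi - ŷi = 1 and its squared deviation is 1. OLS chooses b0 and b1 so that the sum of these squared deviations over all points is as small as possible.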

The Derivation :-
  • Our goal is to minimize SSE: 
                SSE = ∑ (yi - b0 - b1xi)²
  • We use basic ideas from calculus: take the partial derivative of SSE with respect to each coefficient and set it equal to 0, as written out below.
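
The two first-order conditions, written out (a standard restatement of the step above, using the SSE defined earlier):

                ∂SSE/∂b0 = -2 ∑ (yi - b0 - b1xi) = 0
                ∂SSE/∂b1 = -2 ∑ xi (yi - b0 - b1xi) = 0

Solving this pair of equations for b0 and b1 gives the estimates derived next.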

Derivation of b0
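
A standard sketch of this step, setting ∂SSE/∂b0 = 0:

                -2 ∑ (yi - b0 - b1xi) = 0
                ⇒ ∑ yi = n·b0 + b1 ∑ xi
                ⇒ b0 = (∑ yi)/n - b1 (∑ xi)/n
                ⇒ b0 = ȳ - b1·x̄

Here n is the number of data points and x̄, ȳ are the sample means of x and y, so the fitted line always passes through the point (x̄, ȳ).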


Derivation of b1
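
A standard sketch of this step, setting ∂SSE/∂b1 = 0 and substituting b0 = ȳ - b1·x̄ from the previous derivation:

                -2 ∑ xi (yi - b0 - b1xi) = 0
                ⇒ ∑ xi·yi - (ȳ - b1·x̄) ∑ xi - b1 ∑ xi² = 0
                ⇒ b1 (∑ xi² - x̄ ∑ xi) = ∑ xi·yi - ȳ ∑ xi
                ⇒ b1 = ∑ (xi - x̄)(yi - ȳ) / ∑ (xi - x̄)²

The slope b1 is therefore the sample covariance of x and y divided by the sample variance of x.

Putting the two results together, a minimal Python sketch of the closed-form estimates (the data and variable names are illustrative, not from the original post):

    import numpy as np

    # Illustrative data (hypothetical)
    x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
    y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

    # Closed-form OLS estimates derived above:
    #   b1 = sum((xi - x_bar)(yi - y_bar)) / sum((xi - x_bar)^2),  b0 = y_bar - b1*x_bar
    x_bar, y_bar = x.mean(), y.mean()
    b1 = np.sum((x - x_bar) * (y - y_bar)) / np.sum((x - x_bar) ** 2)
    b0 = y_bar - b1 * x_bar
    print("b0 =", b0, " b1 =", b1)

    # Cross-check against NumPy's built-in degree-1 least-squares fit
    print(np.polyfit(x, y, 1))  # returns [slope, intercept]; should match [b1, b0]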

 
