Ordinary Least Squares is a kind of linear regression model. It is simple and easy to understand. In this tutorial, we will explain how it works.
Ordinary Least Squares is defined as:

$$\hat{y}(w, x) = w_0 + w_1 x_1 + w_2 x_2 + \dots + w_n x_n$$
where y^ is the predicted target, x = (x1, x2, …, xn) is a sample and xn is its n-th feature. w = (w1, w2, …, wn) is called the coefficients, w0 is called the intercept, and both w and w0 are estimated by the algorithm.
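As a concrete illustration, here is a minimal sketch of fitting this model with scikit-learn's LinearRegression. The small X and y arrays below are made-up toy data, and the fitted coef_ and intercept_ attributes correspond to w and w0.

```python
# A minimal sketch of Ordinary Least Squares with scikit-learn.
# The toy data (X, y) below is made up for illustration only.
import numpy as np
from sklearn.linear_model import LinearRegression

# 4 samples with 2 features each
X = np.array([[1.0, 2.0],
              [2.0, 1.0],
              [3.0, 4.0],
              [4.0, 3.0]])
# targets generated as y = 0.5 + 1*x1 + 2*x2
y = np.array([5.5, 4.5, 11.5, 10.5])

model = LinearRegression()  # fit_intercept=True by default, so w0 is estimated too
model.fit(X, y)

print(model.coef_)       # estimated coefficients w = (w1, w2)
print(model.intercept_)  # estimated intercept w0
print(model.predict(X))  # predicted targets y^
```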
How to estimate w and w0
This is the core of Ordinary Least Squares: our goal is to make y and y^ as close as possible. To measure the discrepancy between them, we define a loss function as:

$$\min_{w} \lVert Xw - y \rVert_2^2$$

where X is the matrix whose rows are the samples x and y is the vector of true targets.
Notice: w0 does not appear explicitly in this loss function. In practice the intercept is handled by centering the data or by adding a constant column of ones to X.
The smaller the loss function is, the smaller the discrepancy between y and y^. Finding the w (and w0) that minimizes this loss is exactly what Ordinary Least Squares does.
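To make the minimization concrete, here is a hedged NumPy sketch that solves the normal equations for the same toy data as above; a column of ones is appended to X so that w0 is estimated together with w. This is only for illustration, and real code should prefer a dedicated least-squares solver (e.g. numpy.linalg.lstsq) or scikit-learn itself.

```python
# A sketch of minimizing the squared loss ||Xw - y||^2 via the normal equations.
# Toy data matches the earlier example; not a production-grade solver.
import numpy as np

X = np.array([[1.0, 2.0], [2.0, 1.0], [3.0, 4.0], [4.0, 3.0]])
y = np.array([5.5, 4.5, 11.5, 10.5])

# Append a constant column of ones so the intercept w0 is part of the weight vector.
X1 = np.hstack([np.ones((X.shape[0], 1)), X])

# Solve (X1^T X1) w = X1^T y, the condition for minimizing ||X1 w - y||^2.
w = np.linalg.solve(X1.T @ X1, X1.T @ y)
w0, coef = w[0], w[1:]

y_hat = X1 @ w
loss = np.sum((y - y_hat) ** 2)  # squared loss at the minimizer (close to 0 here)

print(w0, coef, loss)
```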
How to measure the quality of w and w0
We can compute the R² coefficient (the coefficient of determination) between y and y^: the closer it is to 1, the better the fit.
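For example, here is a small sketch of computing R² with scikit-learn, using the same made-up toy data as above; LinearRegression.score() reports the same value as r2_score.

```python
# A sketch of measuring fit quality with the R^2 score (coefficient of determination).
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score

X = np.array([[1.0, 2.0], [2.0, 1.0], [3.0, 4.0], [4.0, 3.0]])
y = np.array([5.5, 4.5, 11.5, 10.5])

model = LinearRegression().fit(X, y)
y_hat = model.predict(X)

print(r2_score(y, y_hat))  # 1.0 is a perfect fit; lower values mean a worse fit
print(model.score(X, y))   # LinearRegression.score() returns the same R^2
```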
An Introduction to R2 Coefficient for Beginners – Scikit-Learn Tutorial