Wednesday, May 1, 2013

Tries to Derive an Equation (eventual success!)

Simple Linear Regression: Least Squares Estimates for β0 and β1

If you take any somewhat rigorous introductory statistics course, you will probably encounter a variety of regression models. Basically, they work like this: assume X and Y are both random variables, and use their theoretical probability functions to find the graph of μ_Y|x, the mean value of Y given that X has taken the value x.
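In symbols (standard notation, not spelled out in the post itself), the curve being graphed is the conditional mean of Y:

    \mu_{Y|x} = E[\,Y \mid X = x\,]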


Simple linear regression makes an important distinction: X is assumed to be fixed, not random. Under this assumption, the graph of the function is the curve of regression of Y on X, i.e., Y as a function of X. In other words, Y is the dependent variable and X is the independent variable.


Yi = β0 + β1xi + Ei
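The usual assumptions attached to this model (standard in introductory treatments, though not stated above) are that the errors Ei are independent with mean zero and common variance:

    E[E_i] = 0, \qquad \mathrm{Var}(E_i) = \sigma^2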


To estimate the β values, the least squares method is employed: take the sum of squares of the errors (SSE), differentiate it with respect to each β, set the derivatives equal to zero, and rearrange to solve for the βs. I have posted my derivation that I worked out by hand.* Enjoy!
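For anyone who wants the argument in print, here is a compact LaTeX version of the same standard derivation, writing b0 and b1 for the estimates of β0 and β1. Start from the quantity being minimized:

    SSE(b_0, b_1) = \sum_{i=1}^{n} (y_i - b_0 - b_1 x_i)^2

Setting the partial derivatives to zero gives the normal equations:

    \frac{\partial SSE}{\partial b_0} = -2 \sum_{i=1}^{n} (y_i - b_0 - b_1 x_i) = 0
    \frac{\partial SSE}{\partial b_1} = -2 \sum_{i=1}^{n} x_i (y_i - b_0 - b_1 x_i) = 0

The first equation gives b_0 = \bar{y} - b_1 \bar{x}; substituting that into the second and rearranging yields

    b_1 = \frac{\sum_{i=1}^{n} (x_i - \bar{x})(y_i - \bar{y})}{\sum_{i=1}^{n} (x_i - \bar{x})^2}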
*If you can't read my work this derivation can easily be found online. 
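And if you'd rather let a computer do the rearranging, here is a minimal Python sketch (standard library only; the data points are made up purely for illustration) that evaluates the closed-form estimates directly:

    # Least squares estimates for simple linear regression,
    # computed straight from the closed-form formulas above.
    xs = [1.0, 2.0, 3.0, 4.0, 5.0]   # made-up x values (fixed, not random)
    ys = [2.1, 3.9, 6.2, 8.1, 9.8]   # made-up y observations

    n = len(xs)
    x_bar = sum(xs) / n
    y_bar = sum(ys) / n

    # b1 = sum((xi - x_bar)(yi - y_bar)) / sum((xi - x_bar)^2)
    num = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys))
    den = sum((x - x_bar) ** 2 for x in xs)
    b1 = num / den
    b0 = y_bar - b1 * x_bar

    print(f"b0 = {b0:.2f}, b1 = {b1:.2f}")   # prints: b0 = 0.14, b1 = 1.96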