Statistics 101: Linear Regression, Understanding Model Error


19 thoughts on “Statistics 101: Linear Regression, Understanding Model Error”

  1. What does this significance value tell us? I mean, the significant difference is between what? Also, what is adjusted R-squared? What is the difference between R-squared and adjusted R-squared?

  2. Nice video! I actually find that the R-squared (the coefficient of determination from your other video) = 74.93%, where the correlation r = 0.866 is actually the square root of R-squared. Is that a coincidence?! The correlation of a simple linear regression is actually the square root of SSR/SST! (A quick numeric check of this appears below the thread.)

  3. At 13:08 you mention the degrees of freedom as 2; shouldn't it be 1? The ANOVA table at 5:18 shows 1 as the degrees of freedom for the model and 4 (n - p - 1 = 6 - 2) as the degrees of freedom for the errors.

  4. I really miss the motivation you used to give at the start of every video. Please include that motivation in every video lecture.

  5. I have watched your entire playlist from 1 up to this one and will finish the remaining videos. I have learned more, with great depth and understanding of the fundamentals, in one month with your videos than what my MBA program taught me about data science in two years.

  6. What are F and Significance F??

    The videos are great, but it all falls apart when you assume that knowledge. Are we supposed to have watched all 13 previous playlists in full?

  7. I'm confused. Residuals have always been explained as the difference between the observed value and the predicted value. Here you say it's the difference between the observed value and the mean. SST = SSR + SSE, in which SSR is the one that looks at the sum of squared residuals. (See the worked sketch below the thread.)
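
On the R-squared question in comment 2: it is not a coincidence. In simple linear regression the coefficient of determination equals the squared Pearson correlation, R² = SSR/SST = r². Checking with the figure quoted in the comment (the underlying data from the video are not reproduced here):

    R² = 0.7493, so r = √0.7493 ≈ 0.866,

which matches the correlation quoted above.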
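
On comment 7: both definitions refer to pieces of the same decomposition, and textbooks differ in what they call each piece, which is a common source of confusion. Residuals are observed minus predicted values; the observed-minus-mean differences give the total sum of squares. A minimal Python sketch with made-up numbers (not the data from the video) shows how the three sums of squares relate:

    import numpy as np

    # Hypothetical data, not the values used in the video
    x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
    y = np.array([2.1, 2.9, 3.8, 5.2, 5.9, 7.1])

    # Least-squares fit of a simple linear regression
    slope, intercept = np.polyfit(x, y, 1)
    y_hat = intercept + slope * x

    sst = np.sum((y - y.mean()) ** 2)         # total: observed minus mean
    ss_reg = np.sum((y_hat - y.mean()) ** 2)  # explained: predicted minus mean
    ss_err = np.sum((y - y_hat) ** 2)         # error: observed minus predicted (the residuals)

    print(round(sst, 4), round(ss_reg + ss_err, 4))   # equal: SST = SSR + SSE
    print(round(ss_reg / sst, 4))                     # R-squared = SSR/SST
    print(round(np.corrcoef(x, y)[0, 1] ** 2, 4))     # r squared, the same value here

Because the line is fit by least squares with an intercept, the cross term in the expansion of SST vanishes, which is why the total splits exactly into SSR + SSE.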
