Suppose that we examine the relationship between high school GPA and college GPA. We collect data from students at a local college and find that there is a strong, positive, linear association between the variables. The fitted regression equation was: predicted college GPA = 1.07 + 0.62 × high school GPA. The standard error of the regression, s_e, was 0.374. What does this value of the standard error of the regression tell us?

Answer:

The typical error between a predicted college GPA using this regression model and an actual college GPA for a given student will be about 0.374 grade points in size (absolute value).

Step-by-step explanation:

The linear regression line for College GPA based on High school GPA is:

College GPA = 1.07 + 0.62 × High school GPA
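
For example, a student with a high school GPA of 3.0 (an assumed value, not given in the problem) would have a predicted college GPA of 1.07 + 0.62(3.0) = 2.93.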

It is given that the standard error of the regression is:

s_e = 0.374

The standard error of the regression measures the typical distance between the observed values of the response variable and the values predicted by the regression line.

It is the square root of the sum of squared residuals divided by n − 2, that is, roughly the square root of the average squared deviation of the observed values from the fitted line.

It is also known as the standard error of the estimate.
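
A minimal sketch of that calculation is shown below. The GPA values are made up purely for illustration (only the fitted equation and the formula s_e = sqrt(SSE / (n − 2)) come from the explanation above), so the result will not equal 0.374:

```python
import numpy as np

# Hypothetical data: observed high school and college GPAs (not from the problem).
hs_gpa      = np.array([2.8, 3.1, 3.5, 3.9, 2.5])
college_gpa = np.array([2.7, 3.0, 3.3, 3.6, 2.6])

predicted = 1.07 + 0.62 * hs_gpa          # predictions from the fitted regression equation
residuals = college_gpa - predicted       # observed minus predicted

n = len(college_gpa)
sse = np.sum(residuals ** 2)              # sum of squared residuals
s_e = np.sqrt(sse / (n - 2))              # standard error of the regression

print(round(s_e, 3))
```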

The standard error of 0.374 implies that:

The typical error between a predicted college GPA using this regression model and an actual college GPA for a given student will be about 0.374 grade points in size (absolute value).