(Requires Appendix material) In deriving the OLS estimator, you minimize the sum of squared residuals with respect to the two parameters $\hat{\beta}_0$ and $\hat{\beta}_1$. The resulting two equations imply two restrictions that OLS places on the data, namely that $\sum_{i=1}^{n} \hat{u}_i = 0$ and $\sum_{i=1}^{n} \hat{u}_i X_i = 0$. Show that you get the same formula for the regression slope and the intercept if you impose these two conditions on the sample regression function.
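
A minimal sketch of the argument (using $\bar{X}$ and $\bar{Y}$ for the sample means, as in the textbook): imposing the first condition on the sample regression function $Y_i = \hat{\beta}_0 + \hat{\beta}_1 X_i + \hat{u}_i$ pins down the intercept, and substituting it into the second condition pins down the slope,
\[
\sum_{i=1}^{n} \hat{u}_i = 0 \;\Rightarrow\; \hat{\beta}_0 = \bar{Y} - \hat{\beta}_1 \bar{X},
\qquad
\sum_{i=1}^{n} \hat{u}_i X_i = 0 \;\Rightarrow\; \hat{\beta}_1 = \frac{\sum_{i=1}^{n} (X_i - \bar{X})(Y_i - \bar{Y})}{\sum_{i=1}^{n} (X_i - \bar{X})^2},
\]
which are exactly the formulas obtained from the first-order conditions of the least squares minimization.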

The OLS estimator is derived by


A) connecting the Yi corresponding to the lowest Xi observation with the Yi corresponding to the highest Xi observation.
B) making sure that the standard error of the regression equals the standard error of the slope estimator.
C) minimizing the sum of absolute residuals.
D) minimizing the sum of squared residuals.

To decide whether the slope coefficient is large or small,


A) you should analyze the economic importance of a given increase in X.
B) the slope coefficient must be larger than one.
C) the slope coefficient must be statistically significant.
D) you should change the scale of the X variable if the coefficient appears to be too small.

For the simple regression model of Chapter 4, you have been given the following data:
\[
\sum_{i=1}^{420} Y_i = 274{,}745.75; \quad \sum_{i=1}^{420} X_i = 8{,}248.979; \quad \sum_{i=1}^{420} X_i Y_i = 5{,}392{,}705; \quad \sum_{i=1}^{420} X_i^2 = 163{,}513.03; \quad \sum_{i=1}^{420} Y_i^2 = 179{,}878{,}841.13
\]
(a) Calculate the regression slope and the intercept.
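
As a rough numerical check, the slope and intercept follow directly from these sums; the Python sketch below is illustrative only, and the variable names are chosen here rather than taken from the question.

    # OLS slope and intercept from summary statistics:
    # beta1_hat = (sum XY - n*Xbar*Ybar) / (sum X^2 - n*Xbar^2)
    # beta0_hat = Ybar - beta1_hat * Xbar
    n = 420
    sum_y, sum_x = 274_745.75, 8_248.979
    sum_xy, sum_x2 = 5_392_705, 163_513.03

    x_bar, y_bar = sum_x / n, sum_y / n
    beta1_hat = (sum_xy - n * x_bar * y_bar) / (sum_x2 - n * x_bar ** 2)
    beta0_hat = y_bar - beta1_hat * x_bar
    print(beta1_hat, beta0_hat)  # approximately -2.28 and 698.9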

The slope estimator, $\beta_1$, has a smaller standard error, other things equal, if


A) there is more variation in the explanatory variable, X.
B) there is a large variance of the error term, u.
C) the sample size is smaller.
D) the intercept, $\beta_0$, is small.

Interpreting the intercept in a sample regression function is


A) not reasonable because you never observe values of the explanatory variables around the origin.
B) reasonable because under certain conditions the estimator is BLUE.
C) reasonable if your sample contains values of Xi around the origin.
D) not reasonable because economists are interested in the effect of a change in X on the change in Y.

Binary variables


A) are generally used to control for outliers in your sample.
B) can take on more than two values.
C) exclude certain individuals from your sample.
D) can take on only two values.

(Requires Appendix material) A necessary and sufficient condition to derive the OLS estimator is that the following two conditions hold: $\sum_{i=1}^{n} \hat{u}_i = 0$ and $\sum_{i=1}^{n} \hat{u}_i X_i = 0$. Show that these conditions imply that $\sum_{i=1}^{n} \hat{u}_i \hat{Y}_i = 0$.
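
A sketch of the implication, using the definition of the fitted value $\hat{Y}_i = \hat{\beta}_0 + \hat{\beta}_1 X_i$:
\[
\sum_{i=1}^{n} \hat{u}_i \hat{Y}_i
= \sum_{i=1}^{n} \hat{u}_i \left( \hat{\beta}_0 + \hat{\beta}_1 X_i \right)
= \hat{\beta}_0 \sum_{i=1}^{n} \hat{u}_i + \hat{\beta}_1 \sum_{i=1}^{n} \hat{u}_i X_i
= 0 ,
\]
since both sums on the right are zero by assumption.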

In order to calculate the slope, the intercept, and the regression $R^2$ for a simple sample regression function, list the five sums of data that you need.
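
As one illustration (assuming the data are recorded in levels rather than as deviations from means), the sums $\sum X_i$, $\sum Y_i$, $\sum X_i Y_i$, $\sum X_i^2$, and $\sum Y_i^2$, together with $n$, are sufficient; the helper below is a hypothetical Python sketch, not code from the text.

    # Slope, intercept, and R^2 from n and the five sums of the raw data.
    def simple_ols(n, sum_x, sum_y, sum_xy, sum_x2, sum_y2):
        x_bar, y_bar = sum_x / n, sum_y / n
        sxy = sum_xy - n * x_bar * y_bar    # sum of (X - Xbar)(Y - Ybar)
        sxx = sum_x2 - n * x_bar ** 2       # sum of (X - Xbar)^2
        syy = sum_y2 - n * y_bar ** 2       # sum of (Y - Ybar)^2 = TSS
        slope = sxy / sxx
        intercept = y_bar - slope * x_bar
        r_squared = sxy ** 2 / (sxx * syy)  # equals ESS/TSS in the simple regression
        return slope, intercept, r_squared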

In the linear regression model $Y_i = \beta_0 + \beta_1 X_i + u_i$, $\beta_0 + \beta_1 X_i$ is referred to as


A) the population regression function.
B) the sample regression function.
C) exogenous variation.
D) the right-hand variable or regressor.

The help function for a commonly used spreadsheet program gives the following definition for the regression slope it estimates:
\[
\frac{n \sum_{i=1}^{n} X_i Y_i - \left( \sum_{i=1}^{n} X_i \right) \left( \sum_{i=1}^{n} Y_i \right)}{n \sum_{i=1}^{n} X_i^2 - \left( \sum_{i=1}^{n} X_i \right)^2}
\]
Prove that this formula is the same as the one given in the textbook.
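
A sketch of the proof: divide both the numerator and the denominator by $n$ and rewrite the resulting terms as deviations from means,
\[
\frac{\sum_{i=1}^{n} X_i Y_i - n \bar{X} \bar{Y}}{\sum_{i=1}^{n} X_i^2 - n \bar{X}^2}
= \frac{\sum_{i=1}^{n} (X_i - \bar{X})(Y_i - \bar{Y})}{\sum_{i=1}^{n} (X_i - \bar{X})^2},
\]
where the equality follows from expanding $\sum_i (X_i - \bar{X})(Y_i - \bar{Y}) = \sum_i X_i Y_i - n\bar{X}\bar{Y}$ and $\sum_i (X_i - \bar{X})^2 = \sum_i X_i^2 - n\bar{X}^2$; the right-hand side is the textbook formula for $\hat{\beta}_1$.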

The baseball team nearest to your home town is, once again, not doing well. Given that your knowledge of what it takes to win in baseball is vastly superior to that of management, you want to find out what it takes to win in Major League Baseball (MLB). You therefore collect the winning percentage of all 30 baseball teams in MLB for 1999 and regress the winning percentage on what you consider the primary determinant for wins, which is quality pitching (team earned run average). You find the following information on team performance: [table of summary statistics not reproduced here]
(a) What is your expected sign for the regression slope? Will it make sense to interpret the intercept? If not, should you omit it from your regression and force the regression line through the origin?

The regression $R^2$ is defined as follows:


A) $\frac{ESS}{TSS}$
B) $\frac{RSS}{TSS}$
C) $\frac{\sum_{i=1}^{n} (Y_i - \bar{Y})(X_i - \bar{X})}{\sqrt{\sum_{i=1}^{n} (Y_i - \bar{Y})^2} \sqrt{\sum_{i=1}^{n} (X_i - \bar{X})^2}}$
D) $\frac{SSR}{n-2}$

Prove that the regression $R^2$ is identical to the square of the correlation coefficient between two variables Y and X. Regression functions are written in a form that suggests causation running from X to Y. Given your proof, does a high regression $R^2$ present supportive evidence of a causal relationship? Can you think of some regression examples where the direction of causality is not clear? Where it is without a doubt?
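
A sketch of the first part, writing $x_i = X_i - \bar{X}$ and $y_i = Y_i - \bar{Y}$ and using $\hat{Y}_i - \bar{Y} = \hat{\beta}_1 x_i$:
\[
R^2 = \frac{ESS}{TSS}
= \frac{\hat{\beta}_1^2 \sum_{i=1}^{n} x_i^2}{\sum_{i=1}^{n} y_i^2}
= \left( \frac{\sum_{i=1}^{n} x_i y_i}{\sum_{i=1}^{n} x_i^2} \right)^2 \frac{\sum_{i=1}^{n} x_i^2}{\sum_{i=1}^{n} y_i^2}
= \frac{\left( \sum_{i=1}^{n} x_i y_i \right)^2}{\sum_{i=1}^{n} x_i^2 \sum_{i=1}^{n} y_i^2}
= r_{XY}^2 ,
\]
and since the correlation coefficient is symmetric in X and Y, the $R^2$ by itself carries no information about the direction of causality.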

(Requires Calculus) Consider the following model: $Y_i = \beta_0 + u_i$. Derive the OLS estimator for $\beta_0$.
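
A sketch of the derivation: minimize the sum of squared residuals in $b_0$ and set the derivative to zero,
\[
\min_{b_0} \sum_{i=1}^{n} (Y_i - b_0)^2
\;\Rightarrow\;
-2 \sum_{i=1}^{n} (Y_i - \hat{\beta}_0) = 0
\;\Rightarrow\;
\hat{\beta}_0 = \frac{1}{n} \sum_{i=1}^{n} Y_i = \bar{Y} ,
\]
so the OLS estimator of the intercept in a model with no regressors is the sample mean of Y.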

(Requires Appendix material) Consider the sample regression function $Y_i^* = \hat{\gamma}_0 + \hat{\gamma}_1 X_i^* + \hat{u}_i$, where "*" indicates that the variable has been standardized. What are the units of measurement for the dependent and explanatory variable? Why would you want to transform both variables in this way? Show that the OLS estimator for the intercept equals zero. Next prove that the OLS estimator for the slope in this case is identical to the formula for the least squares estimator where the variables have not been standardized, times the ratio of the sample standard deviations of X and Y, i.e., $\hat{\gamma}_1 = \hat{\beta}_1 \cdot \frac{s_X}{s_Y}$.
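
A sketch under the definitions $X_i^* = (X_i - \bar{X})/s_X$ and $Y_i^* = (Y_i - \bar{Y})/s_Y$, so that both standardized variables have sample mean zero:
\[
\hat{\gamma}_0 = \bar{Y}^* - \hat{\gamma}_1 \bar{X}^* = 0,
\qquad
\hat{\gamma}_1 = \frac{\sum_{i=1}^{n} X_i^* Y_i^*}{\sum_{i=1}^{n} X_i^{*2}}
= \frac{\frac{1}{s_X s_Y} \sum_{i=1}^{n} x_i y_i}{\frac{1}{s_X^2} \sum_{i=1}^{n} x_i^2}
= \hat{\beta}_1 \frac{s_X}{s_Y} ,
\]
with lowercase letters again denoting deviations from means.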

(Requires Appendix material) At a recent county fair, you observed that at one stand people's weight was forecasted, and were surprised by the accuracy (within a range). Thinking about how the person could have predicted your weight fairly accurately (despite the fact that she did not know about your "heavy bones"), you think about how this could have been accomplished. You remember that medical charts for children contain 5%, 25%, 50%, 75% and 95% lines for a weight/height relationship and decide to conduct an experiment with 110 of your peers. You collect the data and calculate the following sums:
\[
\sum_{i=1}^{n} Y_i = 17{,}375, \quad \sum_{i=1}^{n} X_i = 7{,}665.5, \quad \sum_{i=1}^{n} y_i^2 = 94{,}228.8, \quad \sum_{i=1}^{n} x_i^2 = 1{,}248.9, \quad \sum_{i=1}^{n} x_i y_i = 7{,}625.9
\]
where the height is measured in inches and weight in pounds. (Small letters refer to deviations from means, as in $z_i = Z_i - \bar{Z}$.)
(a) Calculate the slope and intercept of the regression and interpret these.
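
A rough numerical check for part (a); the Python names below are illustrative only.

    # Slope from the deviation sums, intercept from the sample means:
    # beta1_hat = sum(x*y) / sum(x^2), beta0_hat = Ybar - beta1_hat * Xbar.
    n = 110
    sum_Y, sum_X = 17_375, 7_665.5       # sums of the levels
    sum_xy, sum_x2 = 7_625.9, 1_248.9    # sums of products of deviations

    x_bar, y_bar = sum_X / n, sum_Y / n
    beta1_hat = sum_xy / sum_x2
    beta0_hat = y_bar - beta1_hat * x_bar
    print(beta1_hat, beta0_hat)  # roughly 6.1 pounds per inch and -267.6 pounds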

The OLS slope estimator is not defined if there is no variation in the data for the explanatory variable. You are interested in estimating a regression relating earnings to years of schooling. Imagine that you had collected data on earnings for different individuals, but that all these individuals had completed a college education (16 years of education). Sketch what the data would look like and explain intuitively why the OLS coefficient does not exist in this situation.
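
The algebra behind the intuition, as a brief sketch: with $X_i = 16$ for every individual, $\bar{X} = 16$, all the data points stack up on a single vertical line, and the slope formula becomes
\[
\hat{\beta}_1 = \frac{\sum_{i=1}^{n} (X_i - \bar{X})(Y_i - \bar{Y})}{\sum_{i=1}^{n} (X_i - \bar{X})^2} = \frac{0}{0} ,
\]
which is not defined.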

You have analyzed the relationship between the weight and height of individuals. Although you are quite confident about the accuracy of your measurements, you feel that some of the observations are extreme, say, two standard deviations above and below the mean. You therefore decide to disregard these individuals. What consequence will this have on the standard deviation of the OLS estimator of the slope?
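
For reference, a sketch of the relevant expression under the additional simplifying assumption of homoskedastic errors (the heteroskedasticity-robust version behaves analogously):
\[
\mathrm{SE}(\hat{\beta}_1) = \sqrt{\frac{s_{\hat{u}}^2}{\sum_{i=1}^{n} (X_i - \bar{X})^2}} ,
\]
so, other things being equal, discarding the observations with extreme values of X reduces $\sum_i (X_i - \bar{X})^2$ and thereby tends to raise the standard error.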

Indicate in a scatterplot what the data for your dependent variable and your explanatory variable would look like in a regression with an $R^2$ equal to zero. How would this change if the regression $R^2$ were equal to one?
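
An illustrative simulation of the two extreme cases (a Python sketch assuming numpy and matplotlib are installed; the specific numbers are arbitrary):

    import matplotlib.pyplot as plt
    import numpy as np

    rng = np.random.default_rng(0)
    x = rng.uniform(0, 10, 100)
    y_flat = rng.normal(5, 1, 100)   # Y unrelated to X -> R^2 close to zero
    y_line = 2 + 0.5 * x             # Y an exact linear function of X -> R^2 = 1

    fig, (ax0, ax1) = plt.subplots(1, 2, figsize=(8, 3))
    ax0.scatter(x, y_flat)
    ax0.set_title("R^2 close to 0")
    ax1.scatter(x, y_line)
    ax1.set_title("R^2 = 1")
    plt.show()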
