Exam Details
Subject | Regression Analysis
Exam / Course | M.Sc. (Statistics)
Organization | Solapur University
Exam Date | November, 2018
City, State | Solapur, Maharashtra
Question Paper
M.Sc. (Semester III) (CBCS) Examination Nov/Dec-2018
Statistics
REGRESSION ANALYSIS
Time: 2½ Hours Max. Marks: 70
Instructions: All Questions carry equal marks.
Figures to the right indicate full marks.
Q.1 Choose the correct alternative: 14
1) The linear parametric function is estimable if and only if
   a) belongs to the column space of the coefficient matrix
   b) the response vector belongs to the column space of the coefficient matrix
   c) the coefficient matrix is not of full rank
   d) is unique
2) The covariance between any linear function belonging to the error space and any BLUE is
   a) 1
   b) 0
   c) None of these
3) Autocorrelation is concerned with
   a) Correlation among the regressors
   b) Correlation among the error terms
   c) Correlation between the response and the regressor variables
   d) None of these
4) The variance of the PRESS residual is
5) The backward elimination procedure begins with the assumption that
   a) No regressors are in the model
   b) Some regressors are in the model
   c) All regressors are in the model
   d) None of these
6) The sum of residuals in any regression model with an intercept is always
   a) Positive
   b) Zero
   c) Non-zero
   d) One
7) In classical linear regression, the distribution of the response variable is
   a) Uniform
   b) Normal
   c) Poisson
   d) Binomial
8) In the simple linear regression model, x and y respectively are
   a) Response variable and regressor variable
   b) Response variable and predictor variable
   c) Predictor variable and response variable
   d) Slope and intercept
9) Which of the following models can be linearized by a reciprocal transformation?
   d) All of the above
10) The joint points of the pieces in polynomial fitting are usually called
   a) Residuals
   b) Knots
   c) Errors
   d) None of these
11) The limitation of the Durbin-Watson test for autocorrelation is that
   a) It is a large-sample test
   b) It has an inconclusive range
   c) It is computationally complex
   d) It has low power
12) If we use unit length scaling for the regressors, then the cross-product matrix of the scaled regressors will be in the form of a _______ matrix.
   a) Skew-symmetric
   b) Covariance
   c) Correlation
   d) None of these
13) The difference between an observed value and the corresponding fitted value is called
   a) Slope
   b) Intercept
   c) Error
   d) Residual
14) A variance stabilizing transformation is used when the distribution of the response is
   a) Poisson
   b) Binomial
   c) Normal
   d) Uniform
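A minimal numpy sketch, using simulated data (all numbers assumed), of two facts behind Q.1: the residuals of a model fitted with an intercept sum to zero (question 6), and the PRESS residual for observation i equals e_i/(1 − h_ii), which is where the variance σ²/(1 − h_ii) asked about in question 4 comes from.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 30
x = rng.normal(size=(n, 2))
X = np.column_stack([np.ones(n), x])            # design matrix with an intercept column
beta = np.array([1.0, 2.0, -0.5])               # assumed "true" coefficients
y = X @ beta + rng.normal(scale=0.3, size=n)

beta_hat = np.linalg.solve(X.T @ X, X.T @ y)    # least squares estimate
e = y - X @ beta_hat                            # ordinary residuals
h = np.diag(X @ np.linalg.solve(X.T @ X, X.T))  # leverages h_ii (diagonal of the hat matrix)

print("sum of residuals:", e.sum())             # ~0: residuals sum to zero with an intercept
print("first PRESS residual:", e[0] / (1 - h[0]))   # leave-one-out residual e_i / (1 - h_ii)
```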
Q.2 Attempt any four of the following. 08
a) In the multiple linear regression model, show that var(β̂) = σ²(X'X)⁻¹.
b) Prove that R² is the square of the correlation between y and its predicted value ŷ.
c) Describe the variable selection problem.
d) Define standardized residuals and explain their role in regression analysis.
e) State any two necessary and sufficient conditions for estimability of a linear parametric function.
Write short notes on the following (any two). 06
   i) Locally weighted regression
   ii) Normal probability plot
   iii) Variance inflation factor
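A minimal sketch, assuming simulated and deliberately collinear regressors, of how the variance inflation factor in short note iii) above is computed: VIF_j = 1/(1 − R_j²), where R_j² comes from regressing the j-th regressor on the remaining ones.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100
x1 = rng.normal(size=n)
x2 = 0.9 * x1 + 0.1 * rng.normal(size=n)       # deliberately near-collinear with x1
x3 = rng.normal(size=n)
X = np.column_stack([x1, x2, x3])

def vif(X, j):
    """VIF_j = 1 / (1 - R_j^2), with R_j^2 from regressing column j on the others."""
    y = X[:, j]
    Z = np.column_stack([np.ones(len(y)), np.delete(X, j, axis=1)])
    b = np.linalg.lstsq(Z, y, rcond=None)[0]
    resid = y - Z @ b
    r2 = 1.0 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))
    return 1.0 / (1.0 - r2)

print([round(vif(X, j), 2) for j in range(X.shape[1])])   # large values flag collinearity
```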
Q.3 Answer any two of the following. 08
a) Show that any solution to the normal equations minimizes the residual sum of squares.
b) In the usual notation, outline the procedure for testing an individual regression coefficient.
c) Discuss the sources of autocorrelation in regression.
Attempt any one of the following. 06
a) Explain how the forward selection method is used for variable selection in regression.
b) Discuss the Box-Cox power transformation. Explain the procedure for computing the parameter of the power transformation.
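A bare-bones sketch of the forward selection procedure in Q.3 above, on simulated data; the entry rule used here (largest drop in the residual sum of squares, accepted only if the partial F statistic exceeds an assumed cut-off F_in = 4) is one common choice, not the only one.

```python
import numpy as np

def rss(X, y):
    """Residual sum of squares of the least squares fit of y on X."""
    b = np.linalg.lstsq(X, y, rcond=None)[0]
    r = y - X @ b
    return r @ r

def forward_select(x, y, f_in=4.0):
    """Greedy forward selection with a partial-F entry criterion (F_in assumed)."""
    n, k = x.shape
    chosen, remaining = [], list(range(k))
    X_cur = np.ones((n, 1))                      # start from the intercept-only model
    while remaining:
        # candidate regressor giving the largest drop in RSS
        best_j = min(remaining, key=lambda j: rss(np.column_stack([X_cur, x[:, j]]), y))
        new_rss = rss(np.column_stack([X_cur, x[:, best_j]]), y)
        df = n - X_cur.shape[1] - 1              # residual df after adding the candidate
        f_stat = (rss(X_cur, y) - new_rss) / (new_rss / df)   # partial F for entry
        if f_stat < f_in:
            break                                # no remaining variable is worth adding
        chosen.append(best_j)
        remaining.remove(best_j)
        X_cur = np.column_stack([X_cur, x[:, best_j]])
    return chosen

rng = np.random.default_rng(2)
x = rng.normal(size=(80, 5))
y = 3 * x[:, 0] - 2 * x[:, 3] + rng.normal(size=80)
print(forward_select(x, y))                      # typically selects columns 0 and 3
```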
Q.4 Answer any two of the following. 10
a) Discuss the Durbin-Watson test for detecting autocorrelation. What are its limitations?
b) In the usual notation, outline the procedure for testing the general linear hypothesis H₀: Tβ = 0.
c) Define Mallows' Cp statistic and derive it.
Attempt any one of the following. 04
a) Describe the use of orthogonal polynomials to fit a polynomial model in one variable.
b) Describe the detection of multicollinearity using the correlation matrix.
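A numerical sketch of the Durbin-Watson statistic d = Σ_t (e_t − e_{t−1})² / Σ_t e_t² from Q.4 a), computed from OLS residuals on simulated data with AR(1) errors (ρ = 0.7 assumed); d near 2 suggests no autocorrelation, with the inconclusive range noted in Q.1 as the main caveat.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 120
x = np.linspace(0.0, 10.0, n)
u = np.zeros(n)
for t in range(1, n):                      # AR(1) errors with rho = 0.7 (assumed)
    u[t] = 0.7 * u[t - 1] + rng.normal(scale=0.5)
y = 1.0 + 0.8 * x + u

X = np.column_stack([np.ones(n), x])
beta_hat = np.linalg.lstsq(X, y, rcond=None)[0]
e = y - X @ beta_hat                       # OLS residuals

d = np.sum(np.diff(e) ** 2) / np.sum(e ** 2)   # Durbin-Watson statistic
print("Durbin-Watson d:", round(d, 3))     # well below 2 here: positive autocorrelation
```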
Q.5 Attempt any two of the following. 14
a) State the multiple linear regression model. Derive the least squares estimator of the regression parameters in the model.
b) Discuss the linearization method of parameter estimation in non-linear regression.
c) Explain residual plots and indicate their use in respect of:
   i) Fitness of the model
   ii) Normality
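A compressed sketch of the derivation asked for in Q.5 a), in the usual matrix notation y = Xβ + ε and assuming X has full column rank:

```latex
\begin{align*}
S(\beta) &= (y - X\beta)'(y - X\beta) = y'y - 2\beta' X'y + \beta' X'X \beta,\\
\frac{\partial S(\beta)}{\partial \beta} &= -2X'y + 2X'X\beta = 0
   \quad\Longrightarrow\quad X'X\hat{\beta} = X'y \quad\text{(normal equations)},\\
\hat{\beta} &= (X'X)^{-1}X'y,
\qquad \frac{\partial^{2} S}{\partial \beta\,\partial \beta'} = 2X'X \succ 0
\ \text{confirms a minimum.}
\end{align*}
```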