Exam Details
Subject: Regression Analysis
Exam / Course: M.Sc. (Statistics)
Organization: Solapur University
Exam Date: 27 April 2017
City, State: Solapur, Maharashtra
Question Paper
M.Sc. (Statistics) (Semester III) (CBCS) Examination, 2017
REGRESSION ANALYSIS
Day & Date: Thursday, 27-04-2017 Max. Marks: 70
Time: 2:30 PM to 05:00 PM
N.B.: 1) Attempt five questions.
2) Q. No. and Q. No. are compulsory.
3) Attempt any three from Q. No. to Q. No.
4) Figures to the right indicate full marks.
Q.1 A) Choose the correct alternative: 05
1) Backward elimination procedure begins with the assumption that
a) same regressors in the model
b) no regressors in the model
c) all regressors in the model
d) none of these
2) The standardized PRESS residual is
[formula options not recoverable from the source]
3) In the simple linear regression model, the coefficients β0 and β1 are respectively
a) slope and intercept
b) intercept and slope
c) error and slope
d) intercept and error
4) Autocorrelation is concerned with
a) correlation among regressor variables
b) correlation among response and regressor variables
c) correlation among disturbance terms
d) correlation between disturbance
5) To test the significance of an individual regression coefficient in a multiple linear regression model, is used.
a) F-test
b) t-test
c) χ² test
d) z-test
B) Fill in the blanks: 05
1) In multiple linear regression model of is
2) The model can be linearized using transformation.
3) In usual notations,
4) The difference between the observed value and the corresponding fitted value of the response variable is called
5) In piecewise polynomial fitting, the joint points of pieces are usually
C) State whether the following sentences are True or False: 04
1) An X-space outlier can be a residual outlier.
2) Choice of model is a source of multicollinearity in regression.
3) In regression model errors are correlated.
4) Variance inflation factors are useful in detecting autocorrelation.
Q.2 A) Define: 06
1) Coefficient of determination
2) Hat matrix
3) Ordinary residual
B) Write short notes on the following: 08
1) Linearization methods in nonlinear regression
2) Durbin-Watson test
Q.3 A) State the multiple linear regression model. Derive the LSE of the regression parameters in the model and obtain the variance of the LSE. 07
B) Define studentized residuals. Explain residual plots and indicate their use in assessing model fit and normality. 07
Q.4 A) Discuss confidence intervals for regression coefficients and the prediction interval for a future observation in the context of multiple linear regression. 07
B) For the multiple linear regression model, define the residual sum of squares and show that RSS = [formula not recoverable from the source]. 07
Q.5 A) Describe the variable selection problem. Explain how the forward selection method is used for variable selection in regression. 07
B) Discuss the Box-Cox power transformation. Explain the procedure for computing the parameter of the power transformation. 07
Q.6 A) Describe the Cochrane-Orcutt method of parameter estimation in the presence of autocorrelation. 07
B) Explain multicollinearity in the context of linear regression. Describe the method of detecting multicollinearity based on the eigenvalues of the X'X matrix. 07
Q.7 Explain the nonlinear regression model. Discuss the LSE method of parameter estimation in this model.
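The quantities that recur in the questions above (the least-squares estimate, the hat matrix, ordinary and PRESS residuals, the Durbin-Watson statistic, and the coefficient of determination) can each be computed in a line or two. The following is a minimal NumPy sketch on simulated data; all names, sizes, and coefficient values are illustrative assumptions, not part of the paper.

```python
import numpy as np

# Simulated illustrative data (values are assumptions, not from the paper):
# n observations, an intercept column plus k regressors.
rng = np.random.default_rng(0)
n, k = 30, 2
X = np.column_stack([np.ones(n), rng.normal(size=(n, k))])
beta_true = np.array([1.0, 2.0, -0.5])
y = X @ beta_true + rng.normal(scale=0.3, size=n)

# Least squares estimate: beta_hat = (X'X)^{-1} X'y
XtX_inv = np.linalg.inv(X.T @ X)
beta_hat = XtX_inv @ X.T @ y

# Hat matrix H = X (X'X)^{-1} X'; ordinary residuals e = y - X beta_hat
H = X @ XtX_inv @ X.T
e = y - X @ beta_hat

# PRESS residuals: e_i / (1 - h_ii), where h_ii are the hat-matrix diagonals
h = np.diag(H)
press = e / (1.0 - h)

# Durbin-Watson statistic: sum of squared successive residual differences over RSS
dw = np.sum(np.diff(e) ** 2) / np.sum(e ** 2)

# Coefficient of determination: R^2 = 1 - RSS / TSS
rss = e @ e
tss = np.sum((y - y.mean()) ** 2)
r2 = 1.0 - rss / tss
```

A quick self-check on such a computation: the hat matrix is idempotent (HH = H) and its trace equals the number of columns of X, while the Durbin-Watson statistic always lies between 0 and 4.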