Exam Details
Subject: Regression Analysis
Exam / Course: M.Sc. (Statistics)
Organization: Solapur University
Exam Date: November 2017
City, State: Solapur, Maharashtra
Question Paper
M.Sc. (Semester III) (CBCS) Examination Oct/Nov-2017
Statistics
REGRESSION ANALYSIS
Day & Date: Thursday, 23-11-2017
Time: 02.30 PM to 05.00 PM
Max. Marks: 70
Instructions: 1) Q.1 and Q.2 are compulsory.
2) Attempt five questions in all.
3) Attempt any three questions from Q.3 to Q.7.
4) Figures to the right indicate full marks.
Q.1 A) Select the correct alternative: 05
i) The estimate of $\beta$ in the regression model $y = X\beta + \varepsilon$ obtained by the method of least squares is
a) Biased  b) Unbiased
c) Inconsistent  d) None of these
ii) Autocorrelation is concerned with
a) Correlation among the predictors
b) Correlation among the error terms
c) Correlation between the response and the predictors
d) None of these
iii) The sum of the residuals in any regression model that contains an intercept is always
a) Zero  b) One
c) Greater than zero  d) None of these
iv) The matrix $I - H$, where $H$ is the hat matrix, is
a) Symmetric  b) Idempotent
c) Both a) and b)  d) None of these
B) Fill in the blanks: 05
i) The transformation __________ is suitable for linearizing the function __________.
ii) For the general full-rank model, the term __________ is __________.
iii) __________ is a polynomial regression model in __________ variables.
iv) The condition number of a matrix is defined as __________.
v) Quadratic forms in normal variables __________ and __________ are independently distributed if and only if __________.
C) State true or false: 04
i) If __________ are i.i.d. and __________, then __________ is distributed as chi-square with degrees of freedom equal to rank(__________), provided __________.
ii) In a full-rank Gauss-Markov model, every linear parametric function is estimable.
iii) Residuals are useful in detecting outliers.
iv) The hat matrix H is skew-symmetric.
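For reference, item i) above concerns the standard distributional result for quadratic forms in normal variables; one usual statement, in notation assumed here rather than taken from the paper, is: if $y \sim N_n(0, I)$ and $A$ is a symmetric matrix, then
\[
  y'Ay \sim \chi^2_{\operatorname{rank}(A)} \qquad \text{provided } A \text{ is idempotent, i.e. } A^2 = A.
\]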
Q.2 a) Explain error space and estimation space. 03
b) What is a cubic spline? 03
c) Explain the concept of ridge regression. 04
d) Discuss the Box-Cox power transformation. 04
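For reference, standard textbook forms of the two quantities asked about in Q.2 c) and d), in notation assumed here, are the ridge estimator
\[
  \hat{\beta}_{\text{ridge}} = (X'X + kI)^{-1}X'y, \qquad k > 0,
\]
and the Box-Cox power transformation
\[
  y^{(\lambda)} =
  \begin{cases}
    \dfrac{y^{\lambda} - 1}{\lambda}, & \lambda \neq 0, \\
    \log y, & \lambda = 0,
  \end{cases}
\]
with the shrinkage constant $k$ and the power $\lambda$ chosen from the data.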
Q.3 a) State and prove the Gauss-Markov theorem. 07
b) If __________ are independent random variables such that __________ and __________, state with justification which of the following functions are estimable: 07
i) __________  ii) __________  iii) __________
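For reference, the usual statement of the theorem asked for in Q.3 a), in notation assumed here: under the linear model $y = X\beta + \varepsilon$ with $E(\varepsilon) = 0$ and $\operatorname{Var}(\varepsilon) = \sigma^2 I$, the least-squares estimator of any estimable linear parametric function $\lambda'\beta$ is its best linear unbiased estimator (BLUE), i.e. it has the smallest variance among all linear unbiased estimators of $\lambda'\beta$.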
Q.4 a) State the multiple linear regression model with its assumptions. In the usual notation, show that __________. 07
b) Describe the test procedure for testing __________ = 0 in the context of multiple linear regression. 07
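For reference, assuming the hypothesis intended in Q.4 b) is the overall null $H_0\colon \beta_1 = \beta_2 = \cdots = \beta_k = 0$ (an assumption, since the hypothesis is left blank above), the usual statistic is
\[
  F = \frac{SS_{\text{Reg}}/k}{SS_{\text{Res}}/(n-k-1)},
\]
which follows the $F_{k,\,n-k-1}$ distribution under $H_0$; large observed values lead to rejection.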
Q.5 a) What is the variable selection problem? Describe the forward selection method in the context of variable selection. 07
b) Discuss the following multicollinearity detection methods: 07
i) Variance inflation factor (VIF).
ii) Examination of the correlation matrix.
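For illustration of the two diagnostics in Q.5 b), a minimal numpy sketch (the data and variable names are made up for the example); it uses the fact that the variance inflation factors are the diagonal elements of the inverse of the predictor correlation matrix:

import numpy as np

# Made-up data: 100 observations on three hypothetical predictors x1, x2, x3.
rng = np.random.default_rng(0)
x1 = rng.normal(size=100)
x2 = 0.9 * x1 + 0.1 * rng.normal(size=100)   # nearly collinear with x1
x3 = rng.normal(size=100)
X = np.column_stack([x1, x2, x3])

# i) Examination of the correlation matrix: large off-diagonal entries flag collinearity.
R = np.corrcoef(X, rowvar=False)
print("Correlation matrix:\n", R.round(3))

# ii) Variance inflation factors: VIF_j = 1 / (1 - R_j^2), equivalently the
#     diagonal of the inverse of the predictor correlation matrix.
vif = np.diag(np.linalg.inv(R))
print("VIFs:", vif.round(2))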
Q.6 a) Explain the non-linear regression model. Discuss the non-linear least squares method for parameter estimation. 07
b) Explain locally weighted regression. In the usual notation, show that __________. 07
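For illustration of the idea in Q.6 b), a minimal Python sketch of locally weighted (tricube-kernel) linear regression at a single point; the function name, bandwidth rule, and data below are assumptions made for the example, not taken from the paper:

import numpy as np

def loess_point(x0, x, y, frac=0.5):
    # Locally weighted straight-line fit at the single point x0 (illustrative sketch).
    n = len(x)
    k = max(2, int(np.ceil(frac * n)))
    d = np.abs(x - x0)
    h = np.sort(d)[k - 1]                          # bandwidth = distance to k-th nearest point
    w = np.clip(1 - (d / h) ** 3, 0, None) ** 3    # tricube weights, zero outside the window
    X = np.column_stack([np.ones(n), x])
    W = np.diag(w)
    beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)   # weighted least squares around x0
    return beta[0] + beta[1] * x0                      # fitted value at x0

# Usage on made-up data:
x = np.linspace(0, 10, 50)
y = np.sin(x) + 0.2 * np.random.default_rng(1).normal(size=50)
print(loess_point(5.0, x, y))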
Q.7 a) Explain the following plots: 07
i) Normal probability plot.
ii) Residuals against the fitted values.
b) Discuss the Durbin-Watson test for detecting autocorrelation. What are its limitations?
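For reference, the statistic behind the test in Q.7 b), in the usual notation with least-squares residuals $e_1, \ldots, e_n$, is
\[
  d = \frac{\sum_{t=2}^{n} (e_t - e_{t-1})^2}{\sum_{t=1}^{n} e_t^{2}}, \qquad 0 \le d \le 4,
\]
with values near 2 indicating no first-order autocorrelation; well-known limitations include the inconclusive region between the tabulated bounds and the restriction to first-order (AR(1)) error dependence.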