Exam Details
Subject | Estimation Theory
Paper |
Exam / Course | M.Sc. (Statistics)
Department |
Organization | Solapur University
Position |
Exam Date | 25 April 2017
City, State | Solapur, Maharashtra
Question Paper
M.Sc. (Statistics) (Semester) (CBCS) Examination, 2017
ESTIMATION THEORY
Day Date: Tuesday, 25-04-2017 Max. Marks: 70
Time: 10.30 AM to 01.00 PM
N.B.: Attempt five questions.
Q. No. (1) and Q. No. (2) are compulsory.
Attempt any three from Q. No. (3) to Q. No. (7).
Figures to the right indicate full marks.
Q.1 Choose the correct alternatives: 05
i) Exponential distribution with location parameter θ and scale parameter 1 is a member of
a) one parameter exponential family   b) power series family
c) Pitman family   d) none of these
ii) If X1, X2 is a random sample from a Poisson(λ) distribution, then the moment estimator of λ is
a) X1   b) X2   c) (X1 + X2)/2
iii) The posterior distribution is the
a) joint distribution of x and θ
b) conditional distribution of θ given x
c) conditional distribution of x given θ
d) none of these
iv) Let X be distributed as U(0, θ); then a sufficient statistic for θ is
v) Based on a random sample of size n from a N(μ, σ²) distribution (both parameters unknown), the MLE of σ² is
a) Σ   b) Σ   c) Σ   d) Σ
Q.1 Fill in the blanks: 05
i) Based on a random sample of size n from a N(μ, σ²) distribution, an unbiased estimator of σ² is ______.
ii) The Cramér-Rao lower bound is a special case of ______.
iii) A prior distribution which does not contain information about θ is called ______.
iv) Completeness implies ______.
v) A minimal sufficient statistic is a function of ______.
Q.1 State whether the following statements are true or false: 04
i) In Bayesian estimation, the parameter is a random variable.
ii) A family of discrete uniform distributions is complete.
iii) A statistic T which is independent of an ancillary statistic is complete.
iv) Every unbiased estimator is a sufficient statistic.
Q.2 State:
i) Basu's theorem
ii) Rao-Blackwell theorem
iii) Lehmann-Scheffé theorem
06
Write short notes on the following:
i) Pitman family of distributions
ii) Fisher information matrix
08
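A short companion to the note asked for in (ii): the Fisher information matrix for the two-parameter normal model, a standard illustration chosen here for concreteness (the paper does not fix a model at this point). For a random sample of size n from N(μ, σ²),

```latex
I(\mu,\sigma^{2}) \;=\; n
\begin{pmatrix}
\dfrac{1}{\sigma^{2}} & 0\\[6pt]
0 & \dfrac{1}{2\sigma^{4}}
\end{pmatrix}.
```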
Q.3 Define the one-parameter exponential family of distributions. Obtain a minimal sufficient statistic for this family.
07
Let X be a Poisson(λ) random variable; show that the distribution of X is complete.
07
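For reference, the family asked for in Q.3 is usually written in the canonical form below (standard notation, not taken from the paper); the factorization theorem then shows that the sum of T over the sample is the (minimal) sufficient statistic.

```latex
f(x;\theta) \;=\; h(x)\,c(\theta)\,\exp\{\,Q(\theta)\,T(x)\,\},
\qquad
\sum_{i=1}^{n} T(X_i) \;\text{ is (minimal) sufficient for } \theta .
```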
Q.4 Describe the method of scoring for obtaining the maximum likelihood estimate of a parametric function.
07
Let X1, X2, …, Xn be a random sample from a N(θ, 1) distribution, θ ≥ 0; obtain the MLE of θ.
07
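A minimal numerical sketch of the scoring iteration in Q.4, θ_{k+1} = θ_k + U(θ_k)/I(θ_k), where U is the score and I the Fisher information. The Cauchy location model and every name below are illustrative assumptions, not part of the paper.

```python
import numpy as np

def fisher_scoring_cauchy(x, theta0, tol=1e-8, max_iter=100):
    """Method of scoring for the location parameter of a Cauchy(theta, 1) sample."""
    theta = float(theta0)
    n = len(x)
    info = n / 2.0                                  # Fisher information: I(theta) = n/2
    for _ in range(max_iter):
        r = x - theta
        score = np.sum(2.0 * r / (1.0 + r**2))      # U(theta) = d/dtheta of the log-likelihood
        step = score / info                         # scoring step: I(theta)^{-1} U(theta)
        theta += step
        if abs(step) < tol:
            break
    return theta

rng = np.random.default_rng(0)
sample = rng.standard_cauchy(50) + 3.0              # simulated sample with true location 3
print(fisher_scoring_cauchy(sample, theta0=np.median(sample)))
```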
Q.5 State and prove the Cramér-Rao inequality, stating the regularity conditions.
07
Let X1, X2, …, Xn be a random sample from a N(μ, σ²) distribution; obtain the Fisher information matrix.
07
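A standard statement of the bound in Q.5, under the usual regularity conditions (open parameter space, common support, differentiation under the integral sign), for T(X) unbiased for τ(θ) based on a sample of size n:

```latex
\operatorname{Var}_\theta\bigl(T(X)\bigr)\;\ge\;\frac{\bigl[\tau'(\theta)\bigr]^{2}}{n\,I_{1}(\theta)},
\qquad
I_{1}(\theta)=\mathbb{E}_\theta\!\left[\left(\frac{\partial}{\partial\theta}\log f(X_{1};\theta)\right)^{\!2}\right].
```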
Q.6 Define conjugate prior and non-informative prior. Illustrate each with one example.
07
Let X1, X2, …, Xn be a random sample from a B(1, p) distribution and let the prior distribution of p be Beta(α, β). Assuming a squared error loss function, find the Bayes estimator of p.
07
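A worked sketch of the computation behind the second part of Q.6, assuming a B(1, p) (Bernoulli) sample with a Beta(α, β) prior, since the paper's symbols are not legible here. Under squared error loss the Bayes estimator is the posterior mean:

```latex
p \mid x_{1},\dots,x_{n} \;\sim\; \mathrm{Beta}\!\Bigl(\alpha+\textstyle\sum_{i}x_{i},\;\beta+n-\textstyle\sum_{i}x_{i}\Bigr),
\qquad
\hat{p}_{\mathrm{Bayes}}=\frac{\alpha+\sum_{i}x_{i}}{\alpha+\beta+n}.
```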
Q.7 Define the power series family of distributions. Give any two examples of distributions that are members of the power series family.
07
Let X follow a Geometric(p) distribution; obtain the UMVUE of p based on a random sample of size n.
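For the second part of Q.7, under the convention P(X = x) = p(1 − p)^x, x = 0, 1, 2, … (one of several parameterizations; the one intended by the paper is not legible), T = ΣXi is complete and sufficient, and the Lehmann-Scheffé theorem gives, for n ≥ 2,

```latex
\widehat{p}_{\mathrm{UMVUE}} \;=\; \frac{n-1}{\,n+\sum_{i=1}^{n}X_{i}-1\,}.
```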