Exam Details
Subject: Machine Learning
Exam / Course: B.Arch
Organization: Visvesvaraya Technological University
Exam Date: April, 2018
City, State: Belagavi, Karnataka
Question Paper
Question Bank
Subject Name: Machine Learning
Subject Code: 15CS73
Sem: VII
Module 1 Questions.
1. Define the following terms:
a. Learning
b. LMS weight update rule
c. Version Space
d. Consistent Hypothesis
e. General Boundary
f. Specific Boundary
g. Concept
2. What are the important objectives of machine learning?
3. Explain the Find-S algorithm using the example in Table 1 and give its application; a minimal sketch follows the table.
Table 1
Example Sky AirTemp Humidity Wind Water Forecast EnjoySport
1 Sunny Warm Normal Strong Warm Same Yes
2 Sunny Warm High Strong Warm Same Yes
3 Rainy Cold High Strong Warm Change No
4 Sunny Warm High Strong Cool Change Yes
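For question 3, a minimal Python sketch of Find-S run over Table 1 is shown below. The '?'/'0' wildcard convention and the attribute ordering follow the usual textbook presentation and are assumptions of this sketch, not part of the question paper.

```python
# Minimal Find-S sketch over the EnjoySport data in Table 1.
# A hypothesis is a list of attribute constraints: '?' = any value, '0' = no value.

examples = [
    (("Sunny", "Warm", "Normal", "Strong", "Warm", "Same"), "Yes"),
    (("Sunny", "Warm", "High",   "Strong", "Warm", "Same"), "Yes"),
    (("Rainy", "Cold", "High",   "Strong", "Warm", "Change"), "No"),
    (("Sunny", "Warm", "High",   "Strong", "Cool", "Change"), "Yes"),
]

def find_s(examples):
    h = ["0"] * len(examples[0][0])      # start with the most specific hypothesis
    for attrs, label in examples:
        if label != "Yes":               # Find-S ignores negative examples
            continue
        for i, value in enumerate(attrs):
            if h[i] == "0":              # first positive example: copy its values
                h[i] = value
            elif h[i] != value:          # disagreement: generalise to '?'
                h[i] = "?"
    return h

print(find_s(examples))  # -> ['Sunny', 'Warm', '?', 'Strong', '?', '?']
```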
4. What do you mean by a well-posed learning problem? Explain the important features that are required to well-define a learning problem.
5. Explain the inductively biased hypothesis space and the unbiased learner.
6. What are the basic design issues and approaches to machine learning?
7. How is the Candidate Elimination algorithm different from the Find-S algorithm?
8. How do you design a checkers learning problem?
9. Explain the various stages involved in designing a learning system.
10. Trace the Candidate Elimination Algorithm for the hypothesis space given the sequence of training examples from Table 1 (a rough sketch of the algorithm appears at the end of this module).
Cold, High, High,
11. Differentiate between Training Data and Testing Data.
12. Differentiate between Supervised, Unsupervised and Reinforcement Learning.
13. What are the issues in Machine Learning?
14. Explain the List-Then-Eliminate algorithm with an example.
15. What is the difference between the Find-S and Candidate Elimination algorithms?
16. Explain the concept of Inductive Bias.
17. With a neat diagram, explain how you can model inductive systems by equivalent deductive systems.
18. What do you mean by Concept Learning?
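For question 10, a rough Candidate Elimination sketch over Table 1 is shown below. It is a simplified illustration for the conjunctive hypothesis space: the attribute domains are taken only from the values that appear in Table 1, the specific boundary is kept as a single hypothesis, and some consistency checks of the full algorithm are omitted.

```python
# Simplified Candidate-Elimination sketch for the conjunctive hypothesis space
# over Table 1 (question 10).  '?' matches any value, '0' matches nothing.

examples = [
    (("Sunny", "Warm", "Normal", "Strong", "Warm", "Same"), "Yes"),
    (("Sunny", "Warm", "High",   "Strong", "Warm", "Same"), "Yes"),
    (("Rainy", "Cold", "High",   "Strong", "Warm", "Change"), "No"),
    (("Sunny", "Warm", "High",   "Strong", "Cool", "Change"), "Yes"),
]
# Attribute domains taken only from the values observed in Table 1 (an assumption).
domains = [sorted({x[i] for x, _ in examples}) for i in range(6)]

def covers(h, x):
    """True if hypothesis h covers the instance (or more specific hypothesis) x."""
    return all(hi in ("?", xi) for hi, xi in zip(h, x))

def min_generalise(s, x):
    """Minimal generalisation of the specific boundary s so that it covers x."""
    return tuple(xi if si == "0" else (si if si == xi else "?")
                 for si, xi in zip(s, x))

def min_specialise(g, x):
    """Minimal specialisations of g that exclude the negative instance x."""
    return [g[:i] + (v,) + g[i + 1:]
            for i, gi in enumerate(g) if gi == "?"
            for v in domains[i] if v != x[i]]

def candidate_elimination(examples):
    S = ("0",) * 6            # specific boundary, kept as a single hypothesis here
    G = [("?",) * 6]          # general boundary
    for x, label in examples:
        if label == "Yes":
            G = [g for g in G if covers(g, x)]   # drop general hypotheses that miss x
            S = min_generalise(S, x)
        else:
            new_G = []
            for g in G:
                if not covers(g, x):
                    new_G.append(g)              # already excludes the negative
                    continue
                for s in min_specialise(g, x):
                    if covers(s, S):             # keep only those still above S
                        new_G.append(s)
            G = new_G
    return S, G

# Expected boundaries for Table 1:
# S = ('Sunny', 'Warm', '?', 'Strong', '?', '?')
# G = [('Sunny', '?', '?', '?', '?', '?'), ('?', 'Warm', '?', '?', '?', '?')]
print(candidate_elimination(examples))
```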
Module 2 Questions.
1. Give decision trees to represent the following boolean functions:
a. A ∧ ¬B
b. A ∨ [B ∧ C]
c. A XOR B
d. [A ∧ B] ∨ [C ∧ D]
2. Consider the following set of training examples:
Instance Classification a1 a2
1 + T T
2 + T T
3 - T F
4 + F F
5 - F T
6 - F T
a. What is the entropy of this collection of training examples with respect to the target function classification?
b. What is the information gain of a2 relative to these training examples?
(A minimal entropy and information-gain sketch follows this question.)
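For question 2, a minimal sketch of how the entropy of a labelled sample and the information gain of an attribute can be computed is shown below. The +/- labels used in the example call are assumptions (the classification symbols are not fully legible in this copy) and should be checked against the printed table.

```python
# Minimal sketch of entropy and information gain for a labelled sample.
from collections import Counter
from math import log2

def entropy(labels):
    total = len(labels)
    return -sum((c / total) * log2(c / total) for c in Counter(labels).values())

def information_gain(rows, labels, attr_index):
    """Gain(S, A) = Entropy(S) - sum over values v of |S_v|/|S| * Entropy(S_v)."""
    gain = entropy(labels)
    for v in {row[attr_index] for row in rows}:
        subset = [lab for row, lab in zip(rows, labels) if row[attr_index] == v]
        gain -= len(subset) / len(labels) * entropy(subset)
    return gain

# (a1, a2) values as printed in the table; the +/- labels below are assumed and
# should be checked against the printed classification column.
rows = [("T", "T"), ("T", "T"), ("T", "F"), ("F", "F"), ("F", "T"), ("F", "T")]
labels = ["+", "+", "-", "+", "-", "-"]
print(entropy(labels))                    # entropy of the whole collection
print(information_gain(rows, labels, 1))  # information gain of a2 (index 1)
```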
3. NASA wants to be able to discriminate between Martians and Humans based on the following characteristics: Green, Legs, Height, Smelly.
Our available training data is as follows:
No. Species Green Legs Height Smelly
1 M N 3 S Y
2 M Y 2 T N
3 M Y 3 T N
4 M N 2 S Y
5 M Y 3 T N
6 H N 2 T Y
7 H N 2 S N
8 H N 2 T N
9 H Y 2 S N
10 H N 2 T Y
a. Greedily learn a decision tree using the ID3 algorithm and draw the tree (a rough ID3 sketch follows this question).
b. Write the learned concept for Martian as a set of conjunctive rules (e.g., if (green=Y and legs=2 and height=T and smelly=N), then Martian; else if ... then Martian; ...; else Human).
c. The solution of the part above uses up to 4 attributes in each conjunction. Find a set of conjunctive rules using only 2 attributes per conjunction that still results in zero error on the training set. Can this simpler hypothesis be represented by a decision tree of depth ___? Justify.
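For part (a) of question 3, a rough ID3 sketch is shown below. It assumes the table is read as four categorical attributes (Green, Legs, Height, Smelly) with Species as the target; the recursion, tie-breaking and data encoding are illustrative choices, not part of the paper.

```python
# Rough ID3 sketch for the Martian vs Human table: a greedy, recursive
# information-gain split on the four categorical attributes.
from collections import Counter
from math import log2

ATTRS = ["Green", "Legs", "Height", "Smelly"]
DATA = [  # (Green, Legs, Height, Smelly) -> Species
    (("N", "3", "S", "Y"), "M"), (("Y", "2", "T", "N"), "M"),
    (("Y", "3", "T", "N"), "M"), (("N", "2", "S", "Y"), "M"),
    (("Y", "3", "T", "N"), "M"), (("N", "2", "T", "Y"), "H"),
    (("N", "2", "S", "N"), "H"), (("N", "2", "T", "N"), "H"),
    (("Y", "2", "S", "N"), "H"), (("N", "2", "T", "Y"), "H"),
]

def entropy(labels):
    total = len(labels)
    return -sum((c / total) * log2(c / total) for c in Counter(labels).values())

def gain(rows, i):
    labels = [y for _, y in rows]
    g = entropy(labels)
    for v in {x[i] for x, _ in rows}:
        subset = [y for x, y in rows if x[i] == v]
        g -= len(subset) / len(rows) * entropy(subset)
    return g

def id3(rows, attrs):
    labels = [y for _, y in rows]
    if len(set(labels)) == 1:                 # pure node -> leaf
        return labels[0]
    if not attrs:                             # no attributes left -> majority label
        return Counter(labels).most_common(1)[0][0]
    best = max(attrs, key=lambda i: gain(rows, i))
    node = {ATTRS[best]: {}}
    for v in {x[best] for x, _ in rows}:      # one branch per observed value
        branch = [(x, y) for x, y in rows if x[best] == v]
        node[ATTRS[best]][v] = id3(branch, [a for a in attrs if a != best])
    return node

print(id3(DATA, list(range(len(ATTRS)))))
```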
4. Discuss Entropy in the ID3 algorithm with an example.
5. Compare Entropy and Information Gain in ID3 with an example.
6. Describe hypothesis space search in ID3 and contrast it with the Candidate Elimination algorithm.
7. Relate inductive bias to decision tree learning.
8. Illustrate Occam's razor and relate its importance to the ID3 algorithm.
9. List the issues in Decision Tree Learning and interpret the algorithm with respect to overfitting the data.
10. Discuss the effect of reduced-error pruning in the decision tree algorithm.
11. What types of problems are best suited for decision tree learning?
12. Write the steps of the ID3 algorithm.
13. What are the capabilities and limitations of ID3?
14. Define Preference Bias and Restriction Bias.
15. Explain the various issues in Decision Tree Learning.
16. Describe Reduced Error Pruning.
17. What are the alternative measures for selecting attributes?
18. What is Rule Post-Pruning?
Module 3 Questions.
1) What is an Artificial Neural Network?
2) What are the types of problems to which Artificial Neural Networks can be applied?
3) Explain the concept of a Perceptron with a neat diagram.
4) Discuss the Perceptron training rule (a minimal sketch appears at the end of this module).
5) Under what conditions does the perceptron rule fail, making it necessary to apply the delta rule?
6) What do you mean by Gradient Descent?
7) Derive the Gradient Descent rule.
8) What are the conditions in which Gradient Descent is applied?
9) What are the difficulties in applying Gradient Descent?
10) Differentiate between Gradient Descent and Stochastic Gradient Descent.
11) Define the Delta Rule.
12) Derive the Backpropagation rule, considering the training rule for output unit weights and the training rule for hidden unit weights.
13) Write the algorithm for Backpropagation.
14) Explain how to learn Multilayer Networks using the Gradient Descent algorithm.
15) What is a Squashing Function?
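For questions 4 and 5 of this module, a minimal sketch of the perceptron training rule is shown below; the learning rate, epoch count and toy data are illustrative assumptions. The delta rule would instead use the unthresholded output in the same update, which is what allows gradient descent to be applied when the examples are not linearly separable.

```python
# Minimal perceptron-training-rule sketch: w_i <- w_i + eta * (t - o) * x_i,
# where o is the *thresholded* output.

def perceptron_train(samples, eta=0.1, epochs=20):
    n = len(samples[0][0])
    w = [0.0] * (n + 1)                     # w[0] is the bias weight
    for _ in range(epochs):
        for x, t in samples:                # targets t are +1 / -1
            net = w[0] + sum(wi * xi for wi, xi in zip(w[1:], x))
            o = 1 if net > 0 else -1        # thresholded perceptron output
            if o != t:                      # perceptron rule updates only on mistakes
                w[0] += eta * (t - o)
                for i, xi in enumerate(x):
                    w[i + 1] += eta * (t - o) * xi
    return w

# Illustrative toy data: a linearly separable OR-like concept.
data = [((0, 0), -1), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]
print(perceptron_train(data))
```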
Module 4 Questions.
1) Explain the concept of Bayes theorem with an example.
2) Explain Bayesian belief networks and conditional independence with an example.
3) What are Bayesian Belief nets? Where are they used?
4) Explain the Brute-force MAP hypothesis learner. What is the Minimum Description Length principle?
5) Explain the k-Means algorithm with an example.
6) How do you classify text using Bayes Theorem?
7) Define Prior Probability, Conditional Probability and Posterior Probability.
8) Explain Brute-force Bayes Concept Learning.
9) Explain the concept of the EM algorithm.
10) What is Conditional Independence?
11) Explain the Naïve Bayes Classifier with an example (a minimal sketch appears at the end of this module).
12) Describe the concept of MDL.
13) What are Consistent Learners?
14) Discuss the Maximum Likelihood and Least Squared Error Hypothesis.
15) Describe the Maximum Likelihood Hypothesis for predicting probabilities.
16) Explain the Gradient Search to Maximize Likelihood in a Neural Net.
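For question 11, a small sketch of a categorical Naive Bayes classifier with add-one (Laplace) smoothing is shown below; the toy weather-style data and all identifiers are illustrative assumptions, not part of the paper.

```python
# Minimal categorical Naive Bayes sketch: pick the class c maximising
# P(c) * product over attributes of P(x_i | c), with add-one (Laplace) smoothing.
from collections import Counter, defaultdict

def train_nb(rows, labels):
    class_counts = Counter(labels)
    cond = defaultdict(Counter)        # (attribute index, class) -> value counts
    values = defaultdict(set)          # attribute index -> observed values
    for x, c in zip(rows, labels):
        for i, v in enumerate(x):
            cond[(i, c)][v] += 1
            values[i].add(v)
    return class_counts, cond, values

def predict_nb(x, class_counts, cond, values, total):
    best, best_p = None, -1.0
    for c, nc in class_counts.items():
        p = nc / total                                           # prior P(c)
        for i, v in enumerate(x):
            p *= (cond[(i, c)][v] + 1) / (nc + len(values[i]))   # smoothed P(v | c)
        if p > best_p:
            best, best_p = c, p
    return best

# Illustrative toy data: (Outlook, Windy) -> Play
rows = [("Sunny", "No"), ("Sunny", "Yes"), ("Rain", "No"), ("Rain", "Yes")]
labels = ["Yes", "Yes", "Yes", "No"]
class_counts, cond, values = train_nb(rows, labels)
print(predict_nb(("Rain", "Yes"), class_counts, cond, values, len(labels)))
```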
Module 5 Questions.
1. What is Reinforcement Learning?
2. Explain the Q function and Q Learning Algorithm.
3. Describe the k-Nearest Neighbour learning algorithm for a continuous-valued target function.
4. Discuss the major drawbacks of the k-Nearest Neighbour learning algorithm and how they can be corrected.
5. Define the following terms with respect to k-Nearest Neighbour Learning:
i) Regression ii) Residual iii) Kernel Function.
6. Explain the Q-learning algorithm assuming deterministic rewards and actions (a minimal sketch appears at the end of this module).
7. Explain the k-Nearest Neighbour algorithm for approximating a discrete-valued target function f : R^n → V, with pseudo code.
8. Explain Locally Weighted Linear Regression.
9. Explain the CADET system using Case-Based Reasoning.
10. Explain the two key difficulties that arise while estimating the accuracy of a hypothesis.
11. Define the following terms:
a. Sample Error b. True Error c. Random Variable
d. Expected Value e. Variance f. Standard Deviation
12. Explain the Binomial Distribution with an example.
13. Explain the Normal or Gaussian Distribution with an example.
14. Explain the Central Limit Theorem with an example.
15. Write the procedure for estimating the difference in error between two learning methods, and give approximate confidence intervals for this estimate.
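For questions 2 and 6 of this module, a minimal deterministic Q-learning sketch is shown below; the corridor environment, reward of 100, discount factor of 0.9 and episode count are illustrative assumptions.

```python
# Minimal deterministic Q-learning sketch: Q(s, a) <- r + gamma * max_a' Q(s', a').
# The environment is a toy 1-D corridor of 4 states with a reward of 100 for
# reaching the rightmost (goal) state; all of this is an illustrative assumption.
import random

N_STATES, GOAL, GAMMA = 4, 3, 0.9
ACTIONS = (-1, +1)                                   # move left / move right

def step(s, a):
    s2 = min(max(s + a, 0), N_STATES - 1)            # deterministic transition
    r = 100 if s2 == GOAL else 0                     # deterministic reward
    return s2, r

Q = {(s, a): 0.0 for s in range(N_STATES) for a in ACTIONS}

for _ in range(500):                                 # training episodes
    s = random.randrange(N_STATES)
    while s != GOAL:
        a = random.choice(ACTIONS)                   # random exploration policy
        s2, r = step(s, a)
        Q[(s, a)] = r + GAMMA * max(Q[(s2, b)] for b in ACTIONS)
        s = s2

# Greedy policy read off the learned Q table (the goal state's entry is arbitrary).
print({s: max(ACTIONS, key=lambda a: Q[(s, a)]) for s in range(N_STATES)})
```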