CS481 Assignment3

This document contains an assignment for a pattern recognition course consisting of multiple choice questions, essay questions, and problem solving questions. The multiple choice questions test concepts in supervised learning, unsupervised learning, probability, and optimization methods. The essay questions ask students to discuss linear classifiers and optimization of linear classifiers. The problem solving questions provide data on two classes and ask students to derive discriminant functions and decision boundaries.


Misr University for Science and Technology

College of Information Technology

CS481: Pattern Recognition, Fall 22


Assignment 3
1. Multiple Choice Questions
1. Given labeled images of faces, what type of learning is used to build a face recognition
system?
a. Semi-supervised learning
b. Supervised learning
c. Reinforcement learning
d. Unsupervised learning
2. The problem of finding hidden structure in unlabeled data is called…
a. Semi-supervised learning
b. Supervised learning
c. Reinforcement learning
d. Unsupervised learning
3. The individual outcome in ………. processes is uncertain, but there is a regular
distribution that describes the frequency of all the outcomes.
a. deterministic
b. random
c. natural
d. none of the above
4. The probability of getting two odd numbers when throwing two dice is ……………
a. 1/4
b. 1/2
c. 1/36
d. 1/12
5. The variables x and y are said to be statistically independent if and only if …………
a. p(x,y) = p(x|y)p(y)
b. p(x,y) = p(y|x)p(x)
c. p(x,y) = p(x)p(y)
d. p(x,y) = p(x|y)p(x)
6. ….… is based on quantifying the tradeoffs between various classification decisions
using probabilities and the costs that accompany such decisions.
a. Kolmogorov decision theorem
b. Gauss decision theorem
c. Bayesian decision theorem
d. none of the above
7. To solve the over-fitting problem, we should increase the ……………
a. data samples
b. order of the model
c. data classes
d. none of the above
8. …………… is the “expected loss associated with a given decision rule”.
a. The overall risk R
b. Bayes Risk
c. Both (a) and (b)
d. None of the above
9. The discriminant function of a classifier is …………… when the features are
statistically independent.
a. linear
b. non-linear
c. quadratic
d. all of the above


10. ……….. is a method for finding the parameters that optimize a function.
a. Gradient Descent
b. Bayes Descent
c. Both (a) and (b)
d. none of the above
2. Essay Questions:
A. Linear classifiers are simple yet powerful classifiers. Discuss linear classifiers and
explain when one can use them.
B. Discuss the main optimization method for finding the optimal parameters of linear
classifiers.
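For reference when answering question B, a minimal gradient-descent sketch for a linear classifier (logistic regression) is shown below. This is an illustrative sketch, not a model answer; the toy data, learning rate, and iteration count are all assumptions.

```python
import numpy as np

# Illustrative sketch: gradient descent on the log-loss of a linear
# (logistic) classifier. Toy data and hyperparameters are assumptions.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2, 1, (50, 2)),   # class 0 samples
               rng.normal(2, 1, (50, 2))])   # class 1 samples
y = np.array([0] * 50 + [1] * 50)

w = np.zeros(2)   # weight vector of the linear decision function w.x + b
b = 0.0
lr = 0.1          # learning rate (assumed)
for _ in range(500):
    p = 1 / (1 + np.exp(-(X @ w + b)))   # sigmoid of the linear score
    grad_w = X.T @ (p - y) / len(y)      # gradient of the log-loss w.r.t. w
    grad_b = np.mean(p - y)              # gradient w.r.t. b
    w -= lr * grad_w                     # step along the negative gradient
    b -= lr * grad_b

accuracy = np.mean((X @ w + b > 0) == (y == 1))
```

Because the two toy classes are linearly separable, a few hundred descent steps suffice for the learned hyperplane to classify the training set almost perfectly.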

3. Problem Solving Questions


In building a dichotomizer, we chose the discriminant functions as g_i(x) = P(w_i | x)
and the likelihood densities as multivariate normal, i.e., P(x | w_i) = N(μ_i, Σ_i),
with P(w_1) = 1/3 and P(w_2) = 2/3.
i. Find the final form of the discriminant functions if the features are
statistically independent.
ii. Find the boundary decision equation if we have the following data for
the two classes:

μ_1 = [3, 4]^T, Σ_1 = 4I, μ_2 = [2, 3]^T, and Σ_2 = 2I
iii. Prove that the boundary decision equation for the two classes is as given
below for the following data: P(w1)=P(w2)=0.5,

 3 1 / 2 0 2 0 
1    1    11   
6    0 2 0 1 / 2

3 2 0 1 / 2 0 
2    2    21   
 2   0 1 / 2
0 2
2
x - 3.514 + 1.125 x - 0.1875 x = 0
2 1 1
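A quick numeric sanity check of the part (iii) boundary: on any point satisfying the stated boundary equation, the two Gaussian discriminants g_1 and g_2 should (nearly) agree. This is a verification sketch, not the requested derivation; the discriminant form g_i(x) = −½(x−μ_i)^T Σ_i⁻¹ (x−μ_i) − ½ ln|Σ_i| + ln P(w_i) is the standard one for normal densities.

```python
import numpy as np

# Data from part (iii): equal priors, diagonal covariances.
mu1, S1 = np.array([3.0, 6.0]), np.diag([0.5, 2.0])
mu2, S2 = np.array([3.0, -2.0]), np.diag([2.0, 2.0])

def g(x, mu, S, prior=0.5):
    # Gaussian discriminant: quadratic term + log-determinant + log-prior.
    d = x - mu
    return (-0.5 * d @ np.linalg.inv(S) @ d
            - 0.5 * np.log(np.linalg.det(S))
            + np.log(prior))

# Pick x1, solve the stated boundary equation for x2, and compare g1, g2.
x1 = 1.0
x2 = 3.514 - 1.125 * x1 + 0.1875 * x1 ** 2
diff = g(np.array([x1, x2]), mu1, S1) - g(np.array([x1, x2]), mu2, S2)
# diff should be ~0 (up to the 3-decimal rounding of the constant 3.514).
```

Dividing g_1(x) − g_2(x) = 0 through by 4 reproduces the stated coefficients exactly: 4.5/4 = 1.125 and 0.75/4 = 0.1875.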
