
3/21/25, 1:38 PM - Introduction to Machine Learning (Tamil) - Unit 4 - Week 2

NPTEL (https://swayam.gov.in/explorer?ncCode=NPTEL) » Introduction to Machine Learning (Tamil) (course)

Week 2 : Assignment 2

The due date for submitting this assignment has passed.
Due on 2022-02-09, 23:59 IST.
Assignment submitted on 2022-02-08, 22:04 IST.
1) Which of the following are binary classification problems? (1 point)

Categorize whether a tweet on Twitter is violent or non-violent
Predict whether an upcoming movie will be successful or not
Recognize whether a digit in an image is zero or nine
Given a speech waveform (i.e., audio), identify whether the speaker is male or female

Yes, the answer is correct.
Score: 1
Accepted Answers:
Categorize whether a tweet on Twitter is violent or non-violent
Predict whether an upcoming movie will be successful or not
Recognize whether a digit in an image is zero or nine
Given a speech waveform (i.e., audio), identify whether the speaker is male or female

https://onlinecourses.nptel.ac.in/noc22_cs58/unit?unit=22&assessment=46
2) Consider a problem of classifying whether a given fruit is a Muskmelon (Label: 1) or a Mango (Label: 0). The following table shows the weight values collected for 5 Muskmelons and 5 Mangoes. Suppose that two K-NN classifiers, namely C1 and C2, are used for classification. Assume that k = 1 for classifier C1 and k = 3 for classifier C2. Further, both classifiers use the distance formula d = |x_t - x_i|, where i = 1, 2, 3, ..., 10. The classifiers then classify the given test point x_t = 2.5 as (O_1, O_2), where O_1 is the classification by C1 and O_2 is the classification by C2. (2 points)

[Table of weight values for the 5 Muskmelons and 5 Mangoes not reproduced in this copy.]

(1,1)
(1,0)
(0,1)
(0,0)

Yes, the answer is correct.
Score: 2
Accepted Answers:
(1,1)
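The two classifiers in question 2 can be sketched directly. Since the weight table is not reproduced in this copy, the training values below are hypothetical, chosen only to illustrate how k = 1 and k = 3 voting works with the distance d = |x_t - x_i|:

```python
# 1-NN and 3-NN with distance d = |x_t - x_i|, as in question 2.
# The (weight, label) pairs below are hypothetical; the question's
# table is not reproduced in this copy.
from collections import Counter

def knn_classify(train, x_t, k):
    """Return the majority label among the k nearest training points."""
    nearest = sorted(train, key=lambda p: abs(x_t - p[0]))[:k]
    labels = [label for _, label in nearest]
    return Counter(labels).most_common(1)[0][0]

# Label 1 = Muskmelon, Label 0 = Mango (hypothetical weights)
train = [(2.0, 1), (2.4, 1), (2.8, 1), (3.0, 1), (3.2, 1),
         (0.5, 0), (0.8, 0), (1.0, 0), (1.2, 0), (1.5, 0)]

x_t = 2.5
o1 = knn_classify(train, x_t, k=1)  # classifier C1
o2 = knn_classify(train, x_t, k=3)  # classifier C2
print((o1, o2))  # (1, 1) for these hypothetical weights
```

With these values the three nearest neighbours of 2.5 are all Muskmelons, so both classifiers agree; with the actual table the same procedure yields the accepted answer (1,1).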
3) Use the decision tree classification algorithm, setting the threshold for the feature x < 2.5 in the root node. What is the information gain after the first split (i.e., level 1 of the tree)? Let log2(0) be defined as zero. (2 points)

0.42

Yes, the answer is correct.
Score: 2
Accepted Answers:
(Type: Range) 0.36, 0.46
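The entropy and information-gain computation behind questions 3-5 can be sketched as follows. The label counts after the split are hypothetical, since the question's data table is not reproduced in this copy:

```python
# Entropy and information gain for a binary split (questions 3-5).
# The (pos, neg) label counts below are hypothetical; the question's
# data table is not reproduced in this copy.
from math import log2

def entropy(pos, neg):
    """Shannon entropy of a node holding `pos` and `neg` examples.
    log2(0) is taken as 0, as the question specifies."""
    total = pos + neg
    if total == 0:
        return 0.0
    h = 0.0
    for count in (pos, neg):
        p = count / total
        if p > 0:
            h -= p * log2(p)
    return h

def information_gain(parent, left, right):
    """parent/left/right are (pos, neg) counts; returns H(parent)
    minus the size-weighted entropy of the two children."""
    n = sum(parent)
    h_children = sum(sum(child) / n * entropy(*child)
                     for child in (left, right))
    return entropy(*parent) - h_children

# Hypothetical split x < 2.5: parent (5, 5) -> left (5, 2), right (0, 3)
gain = information_gain((5, 5), (5, 2), (0, 3))
print(round(gain, 2))  # ~0.40 for these hypothetical counts
```

Note how a pure child such as (0, 3) contributes zero entropy, which is also why question 4's further split of a pure node adds no information gain.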
4) Is splitting the D_yes node further, by setting the feature threshold as x < 1.2, necessary? (1 point)

Yes, as there is a significant amount of information gain after splitting
No, as there is no information gain after splitting
No, as there is a significant amount of information gain after splitting
Yes, as there is no information gain after splitting

Yes, the answer is correct.
Score: 1
Accepted Answers:
No, as there is no information gain after splitting

5) Suppose that we add an extra feature, color x_2, to the data set as shown in the table below. Which of the two features (x_1, x_2) will form the root node of the decision tree, and why? Assume the following thresholds for the features to split the tree: x_1 < 2.5 and x_2 > 0.5. (2 points)

[Table with the extra color feature not reproduced in this copy.]

The feature x_1 becomes the root as the information gain is comparatively higher than x_2
The feature x_2 becomes the root as the information gain is higher than x_1
The feature x_2 cannot be the root node as the entropy of the children (left and right nodes of the root) is zero
The feature x_1 can be the root node as the entropy of only one of the children (left and right nodes of the root) is zero

Yes, the answer is correct.
Score: 2
Accepted Answers:
The feature x_2 becomes the root as the information gain is higher than x_1

6) Which of the following statements about the K-NN classifier is (are) true? (1 point)

For each new test point, the classifier computes the distance between the test point and all the training points in the data set
Suppose k = 3. If the test point is exactly the same as the first training point in the set, the algorithm returns the class label of that training point without computing distances to the other training points in the data set
It can be extended to multiclass classification problems
Increasing the k value always gives better predictions at the cost of increased computation

Yes, the answer is correct.
Score: 1
Accepted Answers:
For each new test point, the classifier computes the distance between the test point and all the training points in the data set
It can be extended to multiclass classification problems

7) Which of the following statements about decision trees is (are) not true? (1 point)

A decision tree is not suitable if all the features x_i are continuous (i.e., they can take any real number).
Entropy measures the impurity of nodes in the tree
If all the labels in the training set are of the same class, then the entropy is zero and the decision tree contains only the root node
An entropy value of 1 denotes maximum impurity in a node

Yes, the answer is correct.
Score: 1
Accepted Answers:
A decision tree is not suitable if all the features x_i are continuous (i.e., they can take any real number).

8) Consider the following statements. (1 point)

A. Discriminative models learn a probability mapping from feature x to label y.
B. Generative models learn a joint probability distribution of feature x and label y.
C. Discriminative models do not learn a probability distribution from which the features are drawn.

Which of these statements are true?

A and B
Only A
All the statements: A, B and C
B and C
Only B

No, the answer is incorrect.
Score: 0
Accepted Answers:
All the statements: A, B and C
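The distinction in question 8 can be made concrete on a toy table: a generative model estimates the joint P(x, y) (from which P(x) can also be recovered), while a discriminative model only estimates P(y | x). The counts below are hypothetical, purely for illustration:

```python
# Generative vs. discriminative view on a toy data set (question 8).
# A generative model estimates the joint P(x, y); a discriminative
# model estimates only the conditional P(y | x).  Counts are hypothetical.
from collections import Counter

data = [(0, 0), (0, 0), (0, 1), (1, 1), (1, 1), (1, 0)]  # (x, y) pairs
n = len(data)

joint = {pair: c / n for pair, c in Counter(data).items()}        # P(x, y)
px = {x: c / n for x, c in Counter(x for x, _ in data).items()}   # P(x)
conditional = {(x, y): p / px[x] for (x, y), p in joint.items()}  # P(y | x)

print(round(joint[(0, 0)], 3))        # P(x=0, y=0) = 2/6
print(round(conditional[(0, 0)], 3))  # P(y=0 | x=0) = 2/3
```

The conditional table alone says nothing about how likely x = 0 is, which is exactly statement C: discriminative models do not learn the distribution the features are drawn from.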

9) [NAT] Suppose we remove the naive assumption in Naive Bayes (i.e., we assume no conditional independence) for a binary classification task: identifying whether the text content on a social media platform is violent or non-violent. If we create a dictionary with 8 binary features (corresponding to the presence or absence of each word in the tweet), then how many parameters must be learned? (1 point)

513

Yes, the answer is correct.
Score: 1
Accepted Answers:
(Type: Range) 511, 513
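The count behind question 9 follows from the model structure: without conditional independence, P(x | y) is a full joint distribution over the 8 binary features for each class. Counting free parameters (each class conditional has 2^8 outcomes minus one sum-to-one constraint, plus one parameter for the class prior) gives 511; counting without the constraints gives 513, which is why the accepted range spans both:

```python
# Parameter count for Naive Bayes WITHOUT the independence
# assumption (question 9): P(x | y) is a full joint distribution
# over n binary features, estimated separately per class.
n_features = 8
n_classes = 2

# Free parameters: (2^n - 1) per class conditional, plus 1 for the
# binary class prior P(y).
free_params = n_classes * (2**n_features - 1) + 1
print(free_params)  # 511

# Counting every table entry without the sum-to-one constraints:
raw_params = n_classes * 2**n_features + 1
print(raw_params)  # 513
```

With the naive assumption the same model would need only n_classes * n_features + 1 = 17 parameters, which is the whole point of the independence assumption.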
