Assignment 1

CSE4214 Pattern Recognition Lab

Experiment No 2
“Implementing the Perceptron algorithm for finding the weights of
a Linear Discriminant function.”
Problem Description:

1. Take input from the “train.txt” file. Plot all sample points from both classes, using the
same color and marker for samples of the same class. Observe whether the two classes
can be separated by a linear boundary.
2. Consider the case of a second-order polynomial discriminant function. Generate the
high-dimensional sample points y, as discussed in class. We shall use the following
formula:

Also, normalize any one of the two classes.


3. Use the Perceptron algorithm (both one-at-a-time and many-at-a-time updates) to find the
weight coefficients of the discriminant function (i.e., the values of w) that define the
decision boundary of your linear classifier from Task 2; a sketch of both update schemes
is given after the problem description.
Here α is the learning rate and 0 < α ≤ 1.

4. Three initial weight vectors have to be used (all ones, all zeros, and randomly initialized
with a fixed seed). For each of these three cases, vary the learning rate between 0.1 and 1
with a step size of 0.1. Create a table containing the learning rate and the number of
iterations of the one-at-a-time and batch Perceptron for all three initial weight vectors.
You also have to create a bar chart visualizing your table data; a sketch of this sweep
appears after the sample output.
Also, in your report, address the following questions:
a. In Task 2, why do we need to take the sample points to a high dimension?
b. For each of the three initial weight cases and for each learning rate, how many updates
does the algorithm take before converging?
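
The handout leaves the implementation open, so below is a minimal Python sketch covering Tasks 1-3. It rests on a few assumptions not stated above: train.txt contains whitespace-separated rows of the form "x1 x2 label" with class labels 1 and 2, the second-order polynomial mapping is y = [1, x1, x2, x1*x2, x1^2, x2^2], and "normalizing one class" means negating its mapped samples so that a correct weight vector a satisfies aᵀy > 0 for every sample. Adjust these to match what was discussed in class.

```python
# Minimal sketch for Tasks 1-3 (assumed file format and feature mapping, see above).
import numpy as np
import matplotlib.pyplot as plt

def load_and_plot(path="train.txt"):
    # Assumed columns: x1, x2, class label (1 or 2).
    data = np.loadtxt(path)
    x, labels = data[:, :2], data[:, 2].astype(int)
    for lbl, color, marker in [(1, "red", "o"), (2, "blue", "s")]:
        pts = x[labels == lbl]
        plt.scatter(pts[:, 0], pts[:, 1], c=color, marker=marker, label=f"class {lbl}")
    plt.legend()
    plt.title("Training samples")
    plt.show()
    return x, labels

def to_quadratic(x):
    # Assumed second-order mapping: y = [1, x1, x2, x1*x2, x1^2, x2^2].
    x1, x2 = x[:, 0], x[:, 1]
    return np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1 ** 2, x2 ** 2])

def normalize(y, labels):
    # Negate one class so a correct weight vector a gives a.T @ y > 0 for all samples.
    y = y.copy()
    y[labels == 2] *= -1
    return y

def perceptron_single(y, alpha, a0, max_epochs=1000):
    # One-at-a-time Perceptron: update on every misclassified sample as it is visited.
    a, updates = a0.astype(float), 0
    for _ in range(max_epochs):
        any_wrong = False
        for yk in y:
            if a @ yk <= 0:
                a = a + alpha * yk
                updates += 1
                any_wrong = True
        if not any_wrong:          # converged: every sample satisfies a.T @ y > 0
            break
    return a, updates

def perceptron_batch(y, alpha, a0, max_epochs=1000):
    # Many-at-a-time (batch) Perceptron: one update per pass, summing all misclassified samples.
    a, updates = a0.astype(float), 0
    for _ in range(max_epochs):
        wrong = y[(y @ a) <= 0]
        if len(wrong) == 0:
            break
        a = a + alpha * wrong.sum(axis=0)
        updates += 1
    return a, updates

# Example use:
x, labels = load_and_plot("train.txt")
y = normalize(to_quadratic(x), labels)
a, n_updates = perceptron_single(y, alpha=0.1, a0=np.ones(6))
print("weights:", a, "updates:", n_updates)
```

In this sketch, "number of updates" counts individual weight corrections for the one-at-a-time variant and whole-batch corrections for the many-at-a-time variant; if the course counts full epochs instead, the counters should be moved accordingly.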
Marks Distribution:
Task 1: 2 marks
Task 2: 2 marks
Task 3: 3 marks
Task 4: 3 marks

Sample Output (Initial Weight Vector All One):


Alpha (Learning Rate)   One at a Time   Many at a Time
0.1                     6               102
0.2                     92              104
0.3                     104             91
0.4                     106             116
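
One possible way to produce such a table and the required bar chart is sketched below, reusing perceptron_single, perceptron_batch, and the normalized samples y from the earlier sketch. The seed value 42 and the subplot layout are illustrative choices, not requirements of the assignment.

```python
# Sketch for Task 4: sweep learning rates for three initial weight vectors,
# print the table rows, and draw a grouped bar chart of the update counts.
import numpy as np
import matplotlib.pyplot as plt

alphas = np.round(np.arange(0.1, 1.01, 0.1), 1)
dim = 6                                  # dimension of the quadratic feature space
rng = np.random.default_rng(seed=42)     # illustrative fixed seed
inits = {
    "all one": np.ones(dim),
    "all zero": np.zeros(dim),
    "random (fixed seed)": rng.standard_normal(dim),
}

rows = []                                # (init name, alpha, one-at-a-time, many-at-a-time)
for name, a0 in inits.items():
    for alpha in alphas:
        _, n_single = perceptron_single(y, alpha, a0)
        _, n_batch = perceptron_batch(y, alpha, a0)
        rows.append((name, alpha, n_single, n_batch))
        print(f"{name:20s} alpha={alpha:.1f}  one-at-a-time={n_single:4d}  many-at-a-time={n_batch:4d}")

# Bar chart: one subplot per initialization, paired bars per learning rate.
fig, axes = plt.subplots(1, 3, figsize=(15, 4), sharey=True)
width = 0.035
for ax, name in zip(axes, inits):
    single = [r[2] for r in rows if r[0] == name]
    batch = [r[3] for r in rows if r[0] == name]
    ax.bar(alphas - width / 2, single, width, label="one at a time")
    ax.bar(alphas + width / 2, batch, width, label="many at a time")
    ax.set_title(name)
    ax.set_xlabel("learning rate")
axes[0].set_ylabel("number of updates")
axes[0].legend()
plt.tight_layout()
plt.show()
```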
