
Machine Learning

CS F464
Dr. Pranav M. Pawar
BITS Pilani, Dubai Campus
Contents
• Discriminant Functions
• Two classes
• Multiple classes
• Least squares for classification
• Fisher Linear Discriminant (Linear Discriminant Analysis)



Introduction (1)
• Goal of classification
  • To take an input vector x and assign it to one of K discrete
    classes Ck, where k = 1, ..., K.
• The input space is divided into decision regions whose boundaries
  are called decision boundaries or decision surfaces.
• Linear models for classification => decision surfaces are linear
  functions of the input vector x, and are therefore defined by
  (D-1)-dimensional hyperplanes within the D-dimensional input space.



Introduction (2)
• Linearly separable data sets => classes that can be separated
  exactly by linear decision surfaces.
• The generalised linear model for classification is

      y(x) = f(w^T x + w0)

• f(.) is a fixed non-linear function (activation function), e.g. the
  logistic sigmoid f(a) = 1 / (1 + exp(-a)).
• The decision boundary between classes will still be a linear
  function of x, since it corresponds to a constant value of
  w^T x + w0.


Discriminant Function (1)
• A function which takes an input vector x and assigns it to one of
  K classes.
• Two classes
  – For a 2-class problem, the simplest linear discriminant function is

        y(x) = w^T x + w0

  – A threshold is used for predicting the class; the threshold is the
    negative of the bias w0.
  – The input vector x belongs to class C1 if y(x) >= 0 and to class
    C2 otherwise. The corresponding decision boundary is y(x) = 0.
  – The perpendicular distance of x from the decision surface,
    measured along the direction of w, is

        r = y(x) / ||w||
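
A minimal sketch (assumed; w and w0 are illustrative values) of this
two-class rule and the perpendicular distance:

    import numpy as np

    # Illustrative parameters (assumptions, not taken from the slides).
    w = np.array([1.0, 2.0])
    w0 = -3.0

    def classify(x):
        # Assign x to C1 if y(x) = w^T x + w0 >= 0, else to C2.
        y = w @ x + w0
        label = "C1" if y >= 0 else "C2"
        # Signed perpendicular distance of x from the surface y(x) = 0.
        r = y / np.linalg.norm(w)
        return label, r

    print(classify(np.array([2.0, 2.0])))   # ('C1', ...)
    print(classify(np.array([0.0, 0.0])))   # ('C2', ...)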



Discriminant Function (2)
• Multiple classes
• A linear discriminant between two classes separates them with a
  hyperplane. Two simple ways to extend this to K classes:
• One-versus-the-rest method: build K - 1 classifiers, each separating
  Ck from all the other classes.
• One-versus-one method: build K(K - 1)/2 classifiers, one between
  each pair of classes (a voting sketch follows below).
• Both approaches leave ambiguous regions of the input space.
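
A minimal one-versus-one voting sketch (assumed, not from the slides);
here `pairwise` maps each class pair (i, j) to a stand-in binary
decision function that returns the winning class index:

    import numpy as np
    from itertools import combinations

    def predict_ovo(pairwise, K, x):
        # Vote over all K(K-1)/2 pairwise classifiers.
        votes = np.zeros(K, dtype=int)
        for i, j in combinations(range(K), 2):
            votes[pairwise[(i, j)](x)] += 1
        return int(np.argmax(votes))   # class with the most votes

    # Example with K = 3 and trivial stand-in classifiers.
    clfs = {(0, 1): lambda x: 0 if x[0] < 5 else 1,
            (0, 2): lambda x: 0 if x[1] < 5 else 2,
            (1, 2): lambda x: 1 if x[0] > x[1] else 2}
    print(predict_ovo(clfs, 3, np.array([2.0, 3.0])))   # -> 0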



Discriminant Function (3)
• A solution is to build K linear functions,

      yk(x) = wk^T x + wk0

• Assign point x to class Ck if yk(x) > yj(x) for all j not equal
  to k.
• The decision boundary between classes Ck and Cj is given by
  yk(x) = yj(x), i.e. the (D-1)-dimensional hyperplane

      (wk - wj)^T x + (wk0 - wj0) = 0

• The decision regions of such a discriminant are always singly
  connected and convex.
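
A minimal sketch (assumed; the K = 3 parameter values are illustrative)
of this argmax rule:

    import numpy as np

    # Illustrative parameters for K = 3 classes (not from the slides).
    W = np.array([[ 1.0,  0.0],     # w1
                  [ 0.0,  1.0],     # w2
                  [-1.0, -1.0]])    # w3
    w0 = np.array([0.0, 0.5, 1.0])  # biases wk0

    def classify(x):
        # Assign x to the class Ck with the largest yk(x) = wk^T x + wk0.
        scores = W @ x + w0
        return int(np.argmax(scores))

    print(classify(np.array([2.0, 0.5])))   # -> 0 (class C1 wins)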



Least squares for classification
• How do we learn the decision boundary parameters (wk, wk0)?
• One approach is to use least squares.
• Find W to minimize the squared error over all examples and all
  components of the one-of-K label vector:

      E(W) = 1/2 * sum_n sum_k ( yk(xn) - tnk )^2

  which has the closed-form solution W = pinv(X) T, the pseudo-inverse
  of the (bias-augmented) data matrix applied to the target matrix.
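
A minimal sketch (with made-up two-class data) of this pseudo-inverse
solution:

    import numpy as np

    # Made-up 2-class data (illustrative assumption).
    X = np.array([[1.0, 2.0], [2.0, 1.0], [2.0, 3.0],
                  [6.0, 5.0], [7.0, 7.0], [8.0, 6.0]])
    labels = np.array([0, 0, 0, 1, 1, 1])

    # Augment inputs with a constant 1 to absorb the bias wk0.
    X_aug = np.hstack([np.ones((len(X), 1)), X])

    # One-of-K (one-hot) target matrix T.
    T = np.eye(2)[labels]

    # Least-squares solution W = pinv(X) @ T minimizing the squared error.
    W = np.linalg.pinv(X_aug) @ T

    # Predict by taking the argmax over the K outputs.
    print(np.argmax(X_aug @ W, axis=1))   # -> [0 0 0 1 1 1]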



Problems with Least Square (1)
• Least squares is very sensitive to outliers: points far from the
  decision boundary contribute large squared errors and can pull the
  boundary away, even when they are already correctly classified.


Problems with Least Square (2)
• With K >= 3 classes, least squares can fail badly: one class can be
  "masked" by the others and never predicted. This reflects the
  implicit Gaussian assumption behind least squares, which one-of-K
  binary target vectors clearly violate.


Fisher Linear Discriminant (Linear Discriminant
Analysis (LDA))
• LDA maximizes the separation between multiple classes.
• LDA seeks a projection that best discriminates the data.
• Goal
  – Find directions along which the classes are best separated (i.e.,
    directions that increase the discriminatory information).
  – Take into consideration the scatter (i.e., variance) within
    classes and between classes.
  – LDA is also used as a dimensionality-reduction technique, as a
    pre-processing step for ML applications.
Fisher Linear Discriminant (Linear Discriminant
Analysis (LDA))
• Project each input onto a line: y = w^T x.
• With projected class means m1, m2 and within-class scatters s1^2,
  s2^2, Fisher's criterion is

      J(w) = (m2 - m1)^2 / (s1^2 + s2^2) = (w^T SB w) / (w^T SW w)

  where SB is the between-class and SW the within-class scatter
  matrix.
• Maximizing J(w) gives the projection direction w ∝ SW^{-1}(m2 - m1).


Linear Discriminant Analysis (LDA)
• For the two-class case the scatter matrices are

      SW = sum_{n in C1} (xn - m1)(xn - m1)^T
         + sum_{n in C2} (xn - m2)(xn - m2)^T

      SB = (m2 - m1)(m2 - m1)^T

  where m1 and m2 are the class mean vectors in the input space.




Example
Find the linear discriminant projection vector and classify the
following data samples:
Class 1 =>
X1 = (x1, x2) = {(4,2), (2,4), (2,3), (3,6), (4,4)}
Class 2 =>
X2 = (x1, x2) = {(9,10), (6,8), (9,5), (8,7), (10,8)}
Solution: compute the class means m1 and m2, the within-class scatter
SW = S1 + S2, and the projection direction w ∝ SW^{-1}(m1 - m2); new
points are then classified by thresholding the projection y = w^T x.
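
The worked numbers from the original slide are not reproduced here; a
sketch of the computation follows (the midpoint threshold is one
simple, assumed choice of decision rule):

    import numpy as np

    # Data samples from the example.
    X1 = np.array([[4, 2], [2, 4], [2, 3], [3, 6], [4, 4]], dtype=float)
    X2 = np.array([[9, 10], [6, 8], [9, 5], [8, 7], [10, 8]], dtype=float)

    # Class means: m1 = (3.0, 3.8), m2 = (8.4, 7.6).
    m1, m2 = X1.mean(axis=0), X2.mean(axis=0)

    # Within-class scatter SW = S1 + S2.
    S1 = (X1 - m1).T @ (X1 - m1)
    S2 = (X2 - m2).T @ (X2 - m2)
    SW = S1 + S2

    # Fisher projection direction w ∝ SW^{-1}(m1 - m2).
    w = np.linalg.solve(SW, m1 - m2)
    w /= np.linalg.norm(w)     # normalize; only the direction matters
    print(w)                   # ≈ [-0.91, -0.42]

    # Classify by projecting onto w and thresholding at the midpoint
    # of the projected class means (an assumed, simple decision rule).
    threshold = (w @ m1 + w @ m2) / 2.0
    def classify(x):
        return "Class 1" if w @ x > threshold else "Class 2"

    print(classify(np.array([5.0, 5.0])))   # -> 'Class 1'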



Sources
• Chapter 4, Christopher M. Bishop, Pattern Recognition & Machine
  Learning, Springer, 2006.
• http://vda.univie.ac.at/Teaching/ML/15s/LectureNotes/04_classification.pdf
• Chapter 9, Christopher M. Bishop, Pattern Recognition & Machine
  Learning, Springer, 2006.
• Chapters 6 and 14, Stephen Marsland, Machine Learning: An
  Algorithmic Perspective, 2e, CRC Press, 2015.
• Chapters 6 and 7, Ethem Alpaydin, Introduction to Machine Learning,
  3e, PHI, 2014.
• http://www.facweb.iitkgp.ac.in/~sudeshna/courses/ml08
• https://www.cse.unr.edu/~bebis/CS479/Lectures
• https://cse.iitkgp.ac.in/~dsamanta/courses/da/resources
• http://www.cvip.louisville.edu/wordpress/wp-content/uploads/2010/01/LDA-Tutorial-1.pdf
BITS Pilani
Dubai Campus

Thank You!
