Practical Implementation of Bayes' Decision Rule


How to Implement
• Four satellite images of Kolkata (R band, G band, B band and I band) are given to you, all of the same size (512 * 512).
• The feature vector dimension is 4: at each pixel location we have four values, one per band.
• Two classes are given (River and Non-River).
• Take 50 sample points (pixel values at chosen pixel locations) from the river class for training, for each band.
• Take 100 sample points (pixel values at chosen pixel locations) from the non-river class for training, for each band.
• Take all 512 * 512 sample points (pixel values at every pixel location) for testing, for each band.
• Apply Bayes' decision rule to classify every test sample as either river or non-river, writing 255 (river) or 0 (non-river) at the corresponding pixel location.
• Show the result as a black-and-white image (pixel values 0 and 255 only).
How to choose sample points
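The original slide presumably illustrates which pixel locations to pick for the river and non-river training sets. As a rough illustration only, the MATLAB/Octave sketch below loads the four band images and gathers the training samples at example coordinates; the file names and coordinate ranges are hypothetical placeholders, not part of the original material.

% Minimal sketch (MATLAB/Octave). File names and training coordinates are
% hypothetical -- replace them with the actual band images and with locations
% picked from the river and non-river regions of the scene.
R = double(imread('kolkata_R.png'));   % 512 x 512 R band
G = double(imread('kolkata_G.png'));   % 512 x 512 G band
B = double(imread('kolkata_B.png'));   % 512 x 512 B band
I = double(imread('kolkata_I.png'));   % 512 x 512 I band

% Example coordinates: 50 river pixels and 100 non-river pixels (hypothetical).
river_rows = randi([200 260], 50, 1);   river_cols = randi([100 400], 50, 1);
land_rows  = randi([1 100],  100, 1);   land_cols  = randi([1 512], 100, 1);

% Build 4 x N training matrices: one column per sample, one row per band.
idx_r = sub2ind(size(R), river_rows, river_cols);
idx_n = sub2ind(size(R), land_rows,  land_cols);
train_river    = [R(idx_r)'; G(idx_r)'; B(idx_r)'; I(idx_r)'];   % 4 x 50
train_nonriver = [R(idx_n)'; G(idx_n)'; B(idx_n)'; I(idx_n)'];   % 4 x 100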
Examples on Satellite Images of Kolkata: Given Input Images
[Figure: the four input images, R band, G band, B band and I band, each 512 * 512.]
Output Images (River Class and Non-River Class)
[Figure: classified output images for three prior settings: Bayes rule with P1 = 0.3 and P2 = 0.7; P1 = 0.7 and P2 = 0.3; P1 = 0.5 and P2 = 0.5.]
Implementation Process of the Density Function
(A code sketch covering Steps 1-14 is given after Step 14.)
• Step 1: Calculate the mean vector of the river class: T1 = [Mean1; Mean2; Mean3; Mean4], where
Mean1 = mean of the R band image over the 50 river training samples
Mean2 = mean of the G band image over the 50 river training samples
Mean3 = mean of the B band image over the 50 river training samples
Mean4 = mean of the I band image over the 50 river training samples

• Step 2: Calculate the mean vector of the non-river class: T2 = [Mean1; Mean2; Mean3; Mean4], where
Mean1 = mean of the R band image over the 100 non-river training samples
Mean2 = mean of the G band image over the 100 non-river training samples
Mean3 = mean of the B band image over the 100 non-river training samples
Mean4 = mean of the I band image over the 100 non-river training samples

• Step 3: Calculate the covariance matrix of the river class from the 50 samples; it is 4 * 4.
For each pair of bands (R, G, B and I), take the deviations of the training samples from the
corresponding components of the mean vector T1, multiply the two deviations, and sum (average)
over the 50 samples. With 4 bands there are 4 * 4 = 16 entries in the covariance matrix, one for
each possible pair of bands. (Apply the standard covariance matrix formula.)
• Step 4: Calculate the covariance matrix of the non-river class from the 100 samples, also 4 * 4,
by the same process as in Step 3.

• Step 5: Take the whole image as test data, where test_data = [Rband_img(i,j); Gband_img(i,j);
Bband_img(i,j); Iband_img(i,j)] for i = 1 to 512 and j = 1 to 512.

• Step 6: The dimension of the test data is therefore 4 * (512 * 512).

• Step 7: For each pixel location of the test image, run a loop from i = 1 to 512*512 and do Steps 8-12.

• Step 8: For the river class, compute the deviation (test_data - T1) and its transpose
(test_data - T1)^T, then form the quadratic (Mahalanobis) term:
River_class = (test_data - T1)^T * Inverse(Covariance_matrix_Riverclass) * (test_data - T1)

• Step 9: For the non-river class, compute the deviation (test_data - T2) and its transpose
(test_data - T2)^T, then form:
NonRiver_class = (test_data - T2)^T * Inverse(Covariance_matrix_NonRiverclass) * (test_data - T2)
• Step 10: Calculate the class-conditional density p1 for the river class (prior P1 = 0.3 in the first case):
p1 = 1 / ( (2*pi)^2 * sqrt(Determinant of Covariance_matrix_Riverclass) ) * exp( -0.5 * River_class )
(This is the multivariate normal density with d = 4, so (2*pi)^(d/2) = (2*pi)^2; this factor is the
same for both classes, cancels in the comparison of Step 12, and may therefore be dropped.)

• Step 11: Calculate the class-conditional density p2 for the non-river class (prior P2 = 0.7 in the first case):
p2 = 1 / ( (2*pi)^2 * sqrt(Determinant of Covariance_matrix_NonRiverclass) ) * exp( -0.5 * NonRiver_class )

• Step 12: For each pixel location of the test image apply Bayes' rule: if (P1 * p1) >= (P2 * p2) then
Out_image(i) = 255 (river class)
else
Out_image(i) = 0 (non-river class)

• Step 13: Go to Step 7 (repeat for the next pixel).

• Step 14: Show the three output images using the imshow function for the three cases:
Case 1: River class prior P1 = 0.3, Non-river class prior P2 = 0.7
Case 2: River class prior P1 = 0.7, Non-river class prior P2 = 0.3
Case 3: River class prior P1 = 0.5, Non-river class prior P2 = 0.5
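A minimal MATLAB/Octave sketch of Steps 1-14, assuming the band images R, G, B, I and the 4 x 50 and 4 x 100 training matrices train_river and train_nonriver from the earlier sketch are already in the workspace. Variable names (C1, C2, Out_image, priors) are illustrative, not prescribed by the slides.

% Steps 1-2: mean vectors (4 x 1) of the two classes.
T1 = mean(train_river, 2);        % river class mean
T2 = mean(train_nonriver, 2);     % non-river class mean

% Steps 3-4: 4 x 4 covariance matrices (cov expects one sample per row).
C1 = cov(train_river');           % river class covariance
C2 = cov(train_nonriver');        % non-river class covariance
C1inv = inv(C1);  C2inv = inv(C2);
d1 = sqrt(det(C1));  d2 = sqrt(det(C2));

% Steps 5-6: test data, one 4 x 1 feature vector per pixel (4 x 512*512 overall).
test_data = [R(:)'; G(:)'; B(:)'; I(:)'];

priors = [0.3 0.7; 0.7 0.3; 0.5 0.5];    % the three (P1, P2) cases of Step 14
for c = 1:3
    P1 = priors(c, 1);  P2 = priors(c, 2);
    Out_image = zeros(size(R));

    % Steps 7-13: loop over all pixel locations.
    for i = 1:numel(R)
        x = test_data(:, i);

        % Steps 8-9: quadratic (Mahalanobis) terms for each class.
        River_class    = (x - T1)' * C1inv * (x - T1);
        NonRiver_class = (x - T2)' * C2inv * (x - T2);

        % Steps 10-11: multivariate normal densities; the common (2*pi)^2
        % factor cancels in the comparison below, so it is omitted.
        p1 = exp(-0.5 * River_class)    / d1;
        p2 = exp(-0.5 * NonRiver_class) / d2;

        % Step 12: Bayes decision rule.
        if P1 * p1 >= P2 * p2
            Out_image(i) = 255;   % river
        else
            Out_image(i) = 0;     % non-river
        end
    end

    % Step 14: display the classified image for this prior setting.
    figure; imshow(uint8(Out_image));
    title(sprintf('P1 = %.1f, P2 = %.1f', P1, P2));
end

In practice the comparison is often done with log densities (compare log(P1) - 0.5*River_class - log(d1) against the non-river counterpart) to avoid exp underflowing to zero for pixels far from both class means; the sketch above keeps the form used in the slides.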
Multivariate Normal Distribution
Concept of Covariance

[Figure: scatter plots showing the relationship of x and y on the basis of sample points. R denotes the relationship between the two variables. Case 1: positive relationship; Case 2: negative relationship; Case 3: equally distributed (no) relationship.]
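As a small numerical illustration of the three cases (assumed, not taken from the slides), the sign of the sample covariance reflects the relationship between x and y:

% Illustration: sign of cov(x, y) for the three relationship cases.
n = 1000;
x = randn(n, 1);
y_pos  = 2*x + 0.5*randn(n, 1);    % Case 1: positive relationship
y_neg  = -2*x + 0.5*randn(n, 1);   % Case 2: negative relationship
y_none = randn(n, 1);              % Case 3: no (equally distributed) relationship

c_pos  = cov(x, y_pos);            % off-diagonal entry > 0
c_neg  = cov(x, y_neg);            % off-diagonal entry < 0
c_none = cov(x, y_none);           % off-diagonal entry near 0
fprintf('cov(x,y) for cases 1-3: %.3f, %.3f, %.3f\n', ...
        c_pos(1,2), c_neg(1,2), c_none(1,2));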
Probability Density Function
Normal Distribution
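The density formulas on these slides (presumably shown as images) did not survive extraction. For reference, the standard forms that Steps 10-11 rely on are, in LaTeX:

p(x) = \frac{1}{\sqrt{2\pi}\,\sigma}\exp\!\left(-\frac{(x-\mu)^2}{2\sigma^2}\right)  % univariate normal

p(\mathbf{x}) = \frac{1}{(2\pi)^{d/2}\,|\Sigma|^{1/2}}\exp\!\left(-\tfrac{1}{2}(\mathbf{x}-\mathbf{T})^{\top}\Sigma^{-1}(\mathbf{x}-\mathbf{T})\right)  % multivariate normal, here d = 4, mean vector T, covariance matrix \Sigma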

Stat Book: C. R. Rao, Linear Statistical Inference and Its Applications.
