Apple Vs Orange
a project on hand digit classification
Uploaded by diyap7640

1. APPLE VS ORANGE
LAB - 1
OVERVIEW
Objective: The goal is to classify data points of apples and
oranges using the K-NN algorithm, determining which fruit a given
sample belongs to based on its features.
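Before using scikit-learn, the idea behind K-NN can be sketched from scratch: measure the distance from a query point to every training point, take the K closest, and let them vote. This is a minimal illustration (not the lab's code) using a handful of hypothetical (Weight, Size) points in the same feature space:

```python
import math
from collections import Counter

def knn_predict(train_points, train_labels, query, k=3):
    """Classify `query` by majority vote among its k nearest training points."""
    # Euclidean distance from the query to every training point
    dists = [math.dist(p, query) for p in train_points]
    # Indices of the k smallest distances
    nearest = sorted(range(len(dists)), key=dists.__getitem__)[:k]
    votes = Counter(train_labels[i] for i in nearest)
    return votes.most_common(1)[0][0]

# Toy data in the same (Weight, Size) feature space as the lab
points = [(69, 4.39), (65, 4.09), (67, 4.70), (72, 5.85), (74, 5.48), (70, 5.59)]
labels = ["orange", "orange", "orange", "apple", "apple", "apple"]
print(knn_predict(points, labels, (70, 5.5)))  # prints: apple
```

With this toy data, the lab's new point [70, 5.5] lands among the heavier, larger samples and is voted "apple".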

IMPORTING THE LIBRARIES


import numpy as np
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import accuracy_score, classification_report, confusion_matrix

DATASET LOADING
df = pd.read_csv(r"C:\Users\diyap\OneDrive\Documents\apples_and_oranges.csv")

df.head()

   Weight  Size   Class
0      69  4.39  orange
1      69  4.21  orange
2      65  4.09  orange
3      72  5.85   apple
4      67  4.70  orange

df.tail()

    Weight  Size   Class
35      69  4.11  orange
36      69  4.76  orange
37      74  5.48   apple
38      70  5.59   apple
39      73  5.03   apple

df.shape
(40, 3)

DATAPOINTS VISUALIZATION
import matplotlib.pyplot as plt

new_point = [[70, 5.5]]
plt.figure(figsize=(9, 6))
for label, color in zip(["orange", "apple"], ["orange", "red"]):
    plt.scatter(df[df["Class"] == label]["Weight"],
                df[df["Class"] == label]["Size"],
                label=label, color=color, edgecolor="k")
plt.scatter(new_point[0][0], new_point[0][1], color="blue",
            label="New Point", edgecolor="k", s=100)
plt.text(new_point[0][0], new_point[0][1] - 0.5, '[[70, 5.5]]',
         fontsize=10, ha='center')
plt.title("KNN Classification", fontsize=16)
plt.xlabel("Weight", fontsize=12)
plt.ylabel("Size", fontsize=12)
plt.xlim(min(df["Weight"]) - 1, max(df["Weight"]) + 1)
plt.ylim(min(df["Size"]) - 1, max(df["Size"]) + 1)
plt.legend()
plt.show()
SPLITTING THE DATA INTO TRAINING AND TESTING
X = df.drop(columns=['Class'])
y = df['Class']
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)
print(f"Training Dataset: {X_train.shape}")
print(f"Testing Dataset: {X_test.shape}")

Training Dataset: (32, 2)


Testing Dataset: (8, 2)
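One refinement not used in the lab: because the classes are slightly imbalanced, `train_test_split` can be passed `stratify=y` so that both splits keep the same orange-to-apple ratio. A small sketch with hypothetical labels (25 oranges, 15 apples, similar to the lab's 40-row dataset):

```python
import numpy as np
from sklearn.model_selection import train_test_split

# Hypothetical labels: 25 oranges and 15 apples
X = np.arange(40).reshape(-1, 1)
y = np.array(["orange"] * 25 + ["apple"] * 15)

# stratify=y keeps the orange/apple ratio identical in both splits
X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.2, random_state=42, stratify=y)
print((y_te == "orange").sum(), (y_te == "apple").sum())  # prints: 5 3
```

The 8-sample test set then contains exactly 5 oranges and 3 apples, mirroring the 25:15 split of the full data.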
APPLYING THE KNN MODEL
1] THE K=3 VALUE
Knn = KNeighborsClassifier(n_neighbors=3)
Knn.fit(X_train, y_train)
y_pred_test = Knn.predict(X_test)
accuracy_test = accuracy_score(y_test, y_pred_test)
print(f"Accuracy : {accuracy_test:.2f}")

Accuracy : 1.00

print(classification_report(y_test, y_pred_test))

              precision    recall  f1-score   support

       apple       1.00      1.00      1.00         3
      orange       1.00      1.00      1.00         5

    accuracy                           1.00         8
   macro avg       1.00      1.00      1.00         8
weighted avg       1.00      1.00      1.00         8

CONFUSION MATRIX (1)


import seaborn as sns
import matplotlib.pyplot as plt
from sklearn.metrics import confusion_matrix

cm = confusion_matrix(y_test, y_pred_test)
print(cm)
sns.heatmap(cm, annot=True, fmt='d', cmap='mako',
            xticklabels=['apple', 'orange'],
            yticklabels=['apple', 'orange'])
plt.xlabel('Predicted')
plt.ylabel('Actual')
plt.title('Confusion Matrix')
plt.show()

[[3 0]
[0 5]]
2] THE K=5 VALUE
Knn = KNeighborsClassifier(n_neighbors=5)
Knn.fit(X_train, y_train)
y_pred_test2 = Knn.predict(X_test)
accuracy_test2 = accuracy_score(y_test, y_pred_test2)
print(f"Accuracy: {accuracy_test2:.2f}")

Accuracy: 1.00

print(classification_report(y_test, y_pred_test2))

              precision    recall  f1-score   support

       apple       1.00      1.00      1.00         3
      orange       1.00      1.00      1.00         5

    accuracy                           1.00         8
   macro avg       1.00      1.00      1.00         8
weighted avg       1.00      1.00      1.00         8
CONFUSION MATRIX (2)
import seaborn as sns
import matplotlib.pyplot as plt
from sklearn.metrics import confusion_matrix

cm = confusion_matrix(y_test, y_pred_test2)
print(cm)
sns.heatmap(cm, annot=True, fmt='d', cmap='magma',
            xticklabels=['apple', 'orange'],
            yticklabels=['apple', 'orange'])
plt.xlabel('Predicted')
plt.ylabel('Actual')
plt.title('Confusion Matrix')
plt.show()

[[3 0]
[0 5]]
3] THE K=7 VALUE
Knn = KNeighborsClassifier(n_neighbors=7)
Knn.fit(X_train, y_train)
y_pred_test3 = Knn.predict(X_test)
accuracy_test3 = accuracy_score(y_test, y_pred_test3)
print(f"Accuracy: {accuracy_test3:.2f}")

Accuracy: 1.00

print(classification_report(y_test, y_pred_test3))

              precision    recall  f1-score   support

       apple       1.00      1.00      1.00         3
      orange       1.00      1.00      1.00         5

    accuracy                           1.00         8
   macro avg       1.00      1.00      1.00         8
weighted avg       1.00      1.00      1.00         8

CONFUSION MATRIX (3)


import seaborn as sns
import matplotlib.pyplot as plt
from sklearn.metrics import confusion_matrix

cm = confusion_matrix(y_test, y_pred_test3)
print(cm)
sns.heatmap(cm, annot=True, fmt='d', cmap='crest',
            xticklabels=['apple', 'orange'],
            yticklabels=['apple', 'orange'])
plt.xlabel('Predicted')
plt.ylabel('Actual')
plt.title('Confusion Matrix')
plt.show()

[[3 0]
[0 5]]
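Rather than trying K = 3, 5, 7 in separate cells, the search can be written as one loop. This is a sketch, not the lab's code, using synthetic (Weight, Size) data standing in for apples_and_oranges.csv (the `rng.normal` means and spreads are assumptions chosen to resemble the head/tail rows above):

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
# Synthetic (Weight, Size) samples standing in for apples_and_oranges.csv
oranges = rng.normal([68, 4.4], [2.0, 0.3], size=(20, 2))
apples = rng.normal([72, 5.5], [2.0, 0.3], size=(20, 2))
X = np.vstack([oranges, apples])
y = np.array(["orange"] * 20 + ["apple"] * 20)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=42)

# Try several odd K values instead of fixing one in advance
accs = {}
for k in [1, 3, 5, 7, 9]:
    knn = KNeighborsClassifier(n_neighbors=k).fit(X_tr, y_tr)
    accs[k] = accuracy_score(y_te, knn.predict(X_te))
print(accs)
```

Odd K values are preferred for binary problems so that the neighbor vote cannot tie.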
Conclusion
1. Imported Necessary Libraries: Essential libraries like NumPy, pandas, and Scikit-learn were
imported.
2. Loaded the Dataset: The dataset of 40 apple and orange samples was loaded.
3. Visualized the Data: The data points and the new query point were plotted by class.
4. Data Splitting: The dataset was split into training (80%) and testing (20%) sets.
5. Implemented K-NN: K-NN models with K = 3, 5, and 7 were trained on the training set.
6. Evaluated Model Performance: Each model was evaluated on the test set, recording
accuracy, precision, recall, and F1-score.
7. Generated Confusion Matrices: Confusion matrices were created for the different K values
to visualize classification results.
***************************************************

2. HANDWRITTEN DIGIT CLASSIFICATION


LAB - 2
OVERVIEW
Objective: The goal is to classify handwritten digits from the MNIST
dataset by building and training a dense neural network, then
evaluating its classification performance.

IMPORTING THE LIBRARIES


import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models
import matplotlib.pyplot as plt
import seaborn as sns
from sklearn.metrics import confusion_matrix

DATASET LOADING
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
print("Train shape:", x_train.shape, "Test shape:", x_test.shape)

Train shape: (60000, 28, 28) Test shape: (10000, 28, 28)

x_train[0].shape

(28, 28)
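Each image is a 28 × 28 grid of pixel intensities, but the Dense layers used later in this lab expect flat vectors; the model's Flatten layer turns each image into a 784-element vector. A small NumPy sketch of that reshape (with a stand-in zero batch, not real MNIST data):

```python
import numpy as np

# Stand-in batch of 5 grayscale images, same shape as an MNIST slice
batch = np.zeros((5, 28, 28), dtype=np.uint8)

# Flatten each 28x28 image into a 784-element vector,
# mirroring what keras.layers.Flatten does inside the model
flat = batch.reshape(batch.shape[0], -1)
print(flat.shape)  # prints: (5, 784)
```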

x_train[0]

array([[  0,   0,   0, ...,   0,   0,   0],
       [  0,   0,   0, ...,   0,   0,   0],
       [  0,   0,   0, ...,   0,   0,   0],
       ...,
       [  0,   0,   0, ...,   0,   0,   0],
       [  0,   0,   0, ...,   0,   0,   0],
       [  0,   0,   0, ...,   0,   0,   0]], dtype=uint8)

(28 × 28 array of raw pixel intensities in the range 0–255; full output omitted)

plt.matshow(x_train[0])

<matplotlib.image.AxesImage at 0x1350f487590>
plt.matshow(x_train[5])

<matplotlib.image.AxesImage at 0x1350f55bd50>
plt.matshow(x_train[3])

<matplotlib.image.AxesImage at 0x13514a7db10>
plt.matshow(x_train[2])

<matplotlib.image.AxesImage at 0x13514aef490>
plt.matshow(x_train[1])

<matplotlib.image.AxesImage at 0x13514b5d090>
plt.matshow(x_train[4])

<matplotlib.image.AxesImage at 0x13514bba450>
SCALING THE VALUES
y_train[5]

np.uint8(2)

y_train[:15]

array([5, 0, 4, 1, 9, 2, 1, 3, 1, 4, 3, 5, 3, 6, 1], dtype=uint8)

x_train.shape

(60000, 28, 28)

x_train = x_train / 255
x_test = x_test / 255
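Dividing by 255 rescales every pixel from the uint8 range 0–255 into floats between 0.0 and 1.0, which helps the network train stably. A quick sanity check on a tiny stand-in array (not the MNIST data itself):

```python
import numpy as np

# Stand-in for raw MNIST pixels: uint8 values from 0 to 255
raw = np.array([[0, 128, 255]], dtype=np.uint8)

# Same normalization as the lab's x_train / 255: maps 0..255 to 0.0..1.0
scaled = raw / 255
print(scaled.min(), scaled.max())  # prints: 0.0 1.0
```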

x_train[0]

array([[0., 0., 0., ..., 0., 0., 0.],
       [0., 0., 0., ..., 0., 0., 0.],
       [0., 0., 0., ..., 0., 0., 0.],
       ...,
       [0., 0., 0., ..., 0., 0., 0.],
       [0., 0., 0., ..., 0., 0., 0.],
       [0., 0., 0., ..., 0., 0., 0.]])

(the same 28 × 28 image, now with values scaled to the range 0.0–1.0; full output omitted)

BUILDING THE NEURAL NETWORK MODEL


import keras

model = keras.Sequential([
    keras.layers.Flatten(input_shape=(28, 28)),
    keras.layers.Dense(80, activation='relu'),
    keras.layers.Dense(80, activation='relu'),
    keras.layers.Dense(10, activation='softmax'),
])
model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])
model.fit(x_train, y_train, epochs=4)

C:\Users\diyap\AppData\Local\Programs\Python\Python311\Lib\site-
packages\keras\src\layers\reshaping\flatten.py:37: UserWarning: Do not
pass an `input_shape`/`input_dim` argument to a layer. When using
Sequential models, prefer using an `Input(shape)` object as the first
layer in the model instead.
super().__init__(**kwargs)

Epoch 1/4
1875/1875 ━━━━━━━━━━━━━━━━━━━━ 7s 3ms/step - accuracy: 0.8649 - loss: 0.4666
Epoch 2/4
1875/1875 ━━━━━━━━━━━━━━━━━━━━ 10s 3ms/step - accuracy: 0.9648 - loss: 0.1201
Epoch 3/4
1875/1875 ━━━━━━━━━━━━━━━━━━━━ 6s 3ms/step - accuracy: 0.9750 - loss: 0.0820
Epoch 4/4
1875/1875 ━━━━━━━━━━━━━━━━━━━━ 10s 3ms/step - accuracy: 0.9802 - loss: 0.0636
<keras.src.callbacks.history.History at 0x13514ba6610>

EVALUATE THE MODEL


model.evaluate(x_test, y_test)

313/313 ━━━━━━━━━━━━━━━━━━━━ 1s 3ms/step - accuracy: 0.9686 - loss: 0.1069

[0.08965574949979782, 0.9736999869346619]

PREDICTION OF THE RESULT


y_pred = model.predict(x_test)
y_pred = [np.argmax(i) for i in y_pred]

313/313 ━━━━━━━━━━━━━━━━━━━━ 1s 2ms/step

y_pred[:5]

[np.int64(7), np.int64(2), np.int64(1), np.int64(0), np.int64(4)]

y_test[:5]

array([7, 2, 1, 0, 4], dtype=uint8)
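The softmax layer outputs ten probabilities per image, one per digit, and `np.argmax` converts those into a single predicted digit by picking the index of the largest probability. A sketch with a hypothetical probability vector (not real model output):

```python
import numpy as np

# Hypothetical softmax output for one test image: ten class probabilities
probs = np.array([0.01, 0.02, 0.05, 0.02, 0.03, 0.01, 0.01, 0.80, 0.03, 0.02])

# np.argmax returns the index of the largest probability -> the predicted digit
print(np.argmax(probs))  # prints: 7
```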

THE CONFUSION MATRIX


conf_mat = tf.math.confusion_matrix(labels=y_test, predictions=y_pred)

import seaborn as sns
import matplotlib.pyplot as plt

plt.figure(figsize=(9, 5))
sns.heatmap(conf_mat, annot=True, fmt='d', cmap='mako')
plt.xlabel('Predicted Labels')
plt.ylabel('True Labels')
plt.title('Confusion Matrix Heatmap')
plt.show()
conf_mat

<tf.Tensor: shape=(10, 10), dtype=int32, numpy=


array([[ 963, 1, 0, 0, 3, 0, 3, 4, 3, 3],
[ 0, 1128, 3, 0, 0, 1, 2, 0, 1, 0],
[ 4, 2, 1011, 2, 6, 0, 1, 5, 1, 0],
[ 0, 2, 10, 970, 1, 12, 0, 5, 4, 6],
[ 0, 0, 3, 0, 970, 0, 4, 2, 0, 3],
[ 2, 0, 0, 10, 2, 867, 6, 1, 1, 3],
[ 3, 2, 2, 1, 6, 6, 937, 0, 1, 0],
[ 0, 5, 12, 2, 5, 0, 0, 997, 2, 5],
[ 5, 1, 3, 5, 10, 8, 6, 3, 928, 5],
[ 2, 3, 0, 3, 23, 3, 1, 8, 0, 966]],
dtype=int32)>
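Beyond eyeballing the heatmap, a confusion matrix also yields per-class metrics directly: the diagonal divided by each row total is per-class recall, and the diagonal sum over all entries is overall accuracy. A sketch on a small hypothetical 3-class matrix (not the matrix above):

```python
import numpy as np

# Hypothetical 3-class confusion matrix (rows = true labels, cols = predictions)
cm = np.array([[50,  2,  1],
               [ 3, 45,  2],
               [ 0,  4, 43]])

# Per-class recall: correct predictions (diagonal) over each row total
recall = np.diag(cm) / cm.sum(axis=1)
# Overall accuracy: diagonal sum over all entries
accuracy = np.trace(cm) / cm.sum()
print(recall.round(3), round(accuracy, 3))
```

The same two lines applied to the 10 × 10 matrix above would show, for example, that digit 9 is the class most often confused with 4.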
