
JKAU: Comp. IT. Sci., Vol. 13 No. 1, pp: 68–85 (2024 A.D.)
DOI: 10.4197/Comp.13-1.5

Offline Signature Verification Using Deep Learning and Genetic Algorithm

Abdoulwase M. Obaid Al-Azzani 1 and Abdulbaset M. Qaid Musleh 1

1 Department of Computer Science and Information Technology, Sana'a University, Sana'a, Yemen
amalezzani71@gmail.com, aledresi200@yahoo.com

Abstract. The process of verifying signatures has wide-ranging applications in computer systems, including financial operations, electronic document signing, and user identity verification. This approach has the advantage of community acceptance and presents a less intrusive alternative to other biometric authentication methods. Deep learning (DL) and Convolutional Neural Networks (CNNs) have emerged as prominent tools in the field of signature verification, significantly enhancing the accuracy and effectiveness of these systems by extracting discriminative features from signature images. However, optimizing the hyperparameters of CNN models remains a challenging task, as it directly affects the efficiency and accuracy of the models. Currently, the design of CNN architectures relies heavily on manual adjustment, which can be time-consuming and may not yield optimal results. To address this issue, the proposed method employs a genetic algorithm to evolve a population of CNN models, thereby enabling the automatic discovery of the most suitable architecture for offline signature verification. By leveraging the optimization capabilities of the genetic algorithm, the proposed approach aims to improve the overall performance and effectiveness of the signature verification model. The effectiveness of the proposed method was evaluated using multiple datasets, including BHSig260-Bengali, BHSig260-Hindi, GPDS, and CEDAR. Through rigorous testing, the approach achieved remarkable discrimination rates, with a False Rejection Rate (FRR) of 2.5%, a False Acceptance Rate (FAR) of 3.2%, an Equal Error Rate (EER) of 2.35%, and an accuracy of 97.73%.

Keywords—Offline Signature Verification, Convolutional Neural Network, Deep Learning, Genetic Algorithm.

I. INTRODUCTION

Biometric systems have become essential for personal authentication by employing behavioral or physiological characteristics. In the realm of biometrics, handwritten signatures have emerged as widely used tools for secure verification [1, 2]. Signature verification has been extensively researched, with a distinction between two main categories: online and offline [3]. Online signature verification focuses on capturing dynamic information during the writing process, whereas offline signature verification deals with static signature images, posing greater challenges and typically yielding lower accuracy than its online counterpart [4]. However, offline signature verification offers distinct advantages despite its lower accuracy: it does not require specialized input devices, making it more accessible and applicable to a wider range of scenarios. Moreover, offline signature verification spans various domains, thereby expanding its potential applications and relevance [5]. Given the continuous authorization of financial documents and business transactions through signatures, the primary goal of handwritten signature verification systems is to differentiate between genuine signatures created by authorized writers and forged signatures produced by fraudulent individuals [6]. Forgery in the signature verification field can be categorized into three types [7]. Unskilled forgery occurs when a person forges another individual's signature without possessing knowledge of that person. Random forgery involves a person who knows only the signer's name without having previously seen their genuine signature. Skilled forgery, on the other hand, is performed by an individual who possesses knowledge of both the signer's name and the shape of their genuine signature. These distinctions highlight the complexity and importance of offline signature verification, as it plays a critical role in safeguarding against fraudulent activities. Further advancements in this field have the potential to enhance security measures and improve the accuracy of signature-verification systems [8].

Handwritten signature verification systems employ two classifications of learning: writer-independent (WI) and writer-dependent (WD) [9, 10]. In the writer-independent setting, learning is performed over all signatures in the database collectively, whereas in the writer-dependent setting, learning is conducted independently for each individual's signatures. The WI method has gained popularity because it simplifies the addition of new individuals to the system, as the classification is based on a single category for all persons [11, 12]. In recent years, numerous automated systems have been developed to verify the authenticity of handwritten signatures using various algorithms and methods. Deep learning, specifically Convolutional Neural Networks (CNNs), has emerged as a dominant approach owing to its effectiveness in image classification and processing [13, 14]. CNNs such as VGGNet, GoogLeNet [15], ResNet [16], CapsNet [17], and DenseNet [18] have demonstrated significant improvements in efficiency and performance in real-world applications [19, 20]. The performance of CNNs relies heavily on their architecture [21, 22]. Experts in this field have designed different structures and versions to address specific classification problems; however, it is challenging to find a CNN model that can effectively solve all classification problems. The manual design of CNN architectures involves iterative attempts to find suitable parameters that yield the best results, which often requires a substantial amount of time [23]. Figures 1, 2, and 3 show samples from the datasets used.

Fig. 1. Sample of Signatures in BHSig260-Bengali Dataset.

Fig. 2. Sample of Signatures in the GPDS-300 Dataset.

Fig. 3. Sample Signatures from the CEDAR Dataset.



To address this challenge, this study proposes a method that utilizes a genetic algorithm to optimize the hyperparameters of the CNN architecture for offline signature verification. The genetic algorithm assists in determining the optimal combination of hyperparameters, significantly reducing the time required for manual design. By leveraging the genetic algorithm, the proposed method aims to enhance the performance and efficiency of the CNN model for offline signature verification, providing more accurate and reliable results.

II. LITERATURE REVIEW

In the field of artificial intelligence, particularly deep learning, Convolutional Neural Networks (CNNs) have been widely used in various applications, including computer vision, pattern recognition, and natural language processing [24]. CNNs consist of several key components: a convolutional layer, an activation function, a pooling layer, and a fully connected layer. The convolutional layer applies filters (kernels) to extract features or patterns from the input image matrix, and multiple filters can be used to capture different features. The pooling layer reduces the size of the matrices by applying functions such as max or average pooling. The fully connected layer is a multilayer perceptron, in which neurons are connected to all the nodes of the previous layer and are responsible for the final classification.

Different approaches have been proposed for offline signature verification. A method known as the Siamese network was introduced in [25]. It utilizes writer-independent (WI) feature learning and measures the similarity or dissimilarity between Siamese network outputs using the Euclidean distance. Another study [26] employed a Siamese neural network for signature verification, training and evaluating two similar neural networks on the same data; the use of the Siamese network architecture helped reduce the required training data volume and resulted in a 13% increase in system efficiency. Genetic algorithms have also been applied to optimize CNN architectures. For example, in [27], two models for predicting the strength of adhesively bonded joints were designed using a CNN. The architecture of one model was developed manually, whereas the architecture of the other was optimized using a genetic algorithm; the genetically optimized model demonstrated better results. In image classification tasks, genetic algorithms have been employed to optimize CNN architectures using datasets such as CIFAR10, MNIST, and Caltech256 [23]; by automatically adjusting the model's parameters, the genetic algorithm improved accuracy compared with the other tested models. In [28], the authors presented a hybrid approach for extracting features from signature images. They utilized Convolutional Neural Network (CNN) and Histogram of Oriented Gradients (HOG) techniques, followed by a feature-selection algorithm (decision trees) to identify important features, and combined the CNN and HOG features. They evaluated the effectiveness of the hybrid approach using three classifiers: long short-term memory, support vector machine, and K-nearest neighbor. The experimental results demonstrated that the proposed model performed satisfactorily in terms of efficiency and predictive ability, achieving accuracy rates of 95.4%, 95.2%, and 92.7% on the UTSig dataset and 93.7%, 94.1%, and 91.3% on the CEDAR dataset. Another study [29] applied a genetic algorithm to select parameters such as the number of filters, the filter size, and the number of layers added to the trainable layers of a CNN transfer model; the proposed method achieved an accuracy of 97% in classifying cat and dog datasets over 15 generations. In the domain of finger-vein
recognition, a system called a Genetic Algorithm with a Convolutional Neural Network (GA-CNN) was developed [30]. The GA-CNN system utilizes a genetic algorithm to initialize the training phase of the CNN, resulting in improved accuracy, sensitivity, and precision. Genetic algorithms have also been used for feature selection in signature verification. In one study [31], a genetic algorithm was employed to select the optimal set of partial curves and features encoded into chromosomes for verification. In addition, genetic algorithms have been applied to weight individual feature components in offline signature verification systems [32]. In [33], four different pattern representation schemes using genetic algorithms were used to determine the weights of feature-based classifiers, leading to increased verification accuracy. Furthermore, a model was developed for offline signature verification using CNNs (VGG16, VGG19, and ResNet50) with additional parameters, trained and tested on the SigComp2009 dataset; the VGG16 model demonstrated a high efficiency of 97% compared with the other models. In [34], a method was proposed to investigate the feasibility of employing genetic algorithms to automatically design CNN architectures. The genetic algorithm generates CNN architectures, which are then trained from scratch using a gradient-descent algorithm. The performance of each evolved CNN architecture was evaluated at every step of the evolutionary process using a validation set, and the algorithm does not require any preprocessing or post-processing of data before or after executing the genetic algorithm. In summary, this study focuses on developing an offline signature verification system using Convolutional Neural Networks (CNNs) in combination with a genetic algorithm. A genetic algorithm is employed to search for the best model architecture hyperparameters and to optimize the performance and accuracy of the system.

III. METHODOLOGY

Offline signature verification is a complex pattern-recognition problem that involves recognizing and verifying genuine handwritten signatures while detecting forgery attempts. To address this challenge, a comprehensive model for offline signature verification needs to be developed. Convolutional Neural Networks are particularly suitable architectures for signature verification [35]. The proposed model consists of the following stages. The first stage is preprocessing, in which the signature image is prepared for further analysis; this typically involves tasks such as noise removal, image enhancement, and normalization to ensure consistent input for the subsequent stages. The second crucial stage is GA-based hyperparameter selection. Hyperparameters are essential variables that determine the architecture and behavior of the Convolutional Neural Network (CNN); however, manually finding optimal hyperparameters is a challenging and time-consuming task. By employing a genetic algorithm, the model can automatically search for and select the best combination of hyperparameters, leading to improved performance and accuracy. The third stage involves the CNN itself, which is responsible for feature extraction, training, and testing. CNNs are powerful deep-learning architectures that excel at extracting meaningful features from images. They consist of multiple convolutional and pooling layers that learn hierarchical representations of the signature data. The extracted features are then utilized for training the model on a labeled dataset and for subsequent testing to evaluate the model's performance in signature verification. Each stage within the model comprises multiple steps, such as data preprocessing techniques, genetic algorithm initialization and evolution, CNN architecture design, training data preparation, model training,
and testing. These steps work in conjunction to create an effective and robust offline signature verification system. Figure 4 visually represents the main stages and associated steps within each stage of the proposed model, providing a clear overview of the workflow involved in offline signature verification.

Fig. 4. Proposed model architecture.


A Signature Images Preprocessing
Before starting feature extraction, essential processes must be applied to the signature image. These operations include the following:
• The color images were converted into grayscale images.
• Each image was resized to 100 × 100 pixels.
• The image points were read and stored in an image matrix.
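As an illustration, the following is a minimal preprocessing sketch in Python, assuming the OpenCV and NumPy libraries are available; the scaling of pixel values to [0, 1] is an added assumption rather than a detail stated above.

import cv2
import numpy as np

def preprocess_signature(path, size=(100, 100)):
    # Convert the color image to grayscale while loading it.
    image = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    # Resize the signature to 100 x 100 pixels.
    image = cv2.resize(image, size)
    # Store the image points in a matrix (scaled to [0, 1], an illustrative choice).
    return image.astype(np.float32) / 255.0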
B The Feature Extraction with CNN
The primary obstacle in addressing handwritten signature verification lies in identifying the distinguishing features that allow the system to differentiate between authentic and forged signatures [36]. Convolutional Neural Networks (CNNs) are a type of deep learning model primarily used for image and video analysis tasks [37]. They are designed to automatically learn and extract meaningful features from input data, making them well suited for tasks such as image classification, object detection, and image segmentation. CNNs were inspired by the organization of the visual cortex in the human brain, which contains specialized neurons that respond to specific receptive fields. Similarly,
CNNs consist of interconnected layers of artificial neurons that learn to recognize patterns and spatial hierarchies in the data. The key components of a CNN are convolutional, pooling, and fully connected layers. Convolutional layers perform convolution operations on the input data: a convolution involves sliding a small window, called a filter or kernel, over the input and computing dot products between the filter and the local patches of the input. This process captures local patterns and features, and convolutional layers can have multiple filters to learn different features simultaneously. Pooling layers reduce the spatial dimensions of the data, helping to make the learned features more robust and invariant to small translations and distortions; the most common pooling operation is max-pooling, which selects the maximum value within each local region of the input. Fully connected layers follow several convolutional and pooling layers, once the output has been flattened; these layers resemble traditional neural networks, in which each neuron is connected to every neuron in the previous layer, and they learn global patterns and make predictions based on the extracted features. During training, CNNs learn to optimize their internal parameters (weights and biases) by minimizing a chosen loss function. This is usually performed using gradient-based optimization algorithms, such as stochastic gradient descent (SGD) or its variants, and the backpropagation algorithm computes the gradients of the loss with respect to the weights, allowing for efficient parameter updates.

The proposed model function creates a convolutional neural network (CNN) model based on the parameters provided by the genetic algorithm and trains it using the given training data. Here is a breakdown of the steps performed by the function:
1) An early stopping callback is defined to monitor the validation loss and stop training if the loss does not improve after two epochs.
2) A 2D convolutional layer is added to the model with a specified number of filters, kernel size, and activation function. The input shape is set to the provided input shape.
3) Another 2D convolutional layer is added to the model with a specified number of filters, kernel size, padding, and activation function.
4) A max pooling layer is added to the model with a specified pool size.
5) Steps 2 and 3 are repeated for two more convolutional layers.
6) A dropout layer is added with a specified dropout rate.
7) The output of the previous layers is flattened.
8) A dense (fully connected) layer is added to the model with a specified number of units and the 'ReLU' activation function [38].
9) Another dropout layer is added with a specified dropout rate.
10) A dense output layer is added with the 'Softmax' activation function [39] and the specified number of classes.
11) The trained model is returned after fitting it on the training dataset.
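A minimal Keras sketch of such a builder function is shown below. The layer sequence follows the steps above, but the optimizer, loss, validation split, and the dictionary keys used for the GA-supplied hyperparameters are assumptions rather than details given in the paper.

from tensorflow.keras import layers, models, callbacks

def build_and_train_cnn(params, x_train, y_train, input_shape=(100, 100, 1), num_classes=2):
    # Step 1: early stopping on the validation loss with a patience of two epochs.
    early_stop = callbacks.EarlyStopping(monitor='val_loss', patience=2)
    model = models.Sequential([
        # Steps 2-4: two convolutional layers followed by max pooling.
        layers.Conv2D(params['filter1'], params['kernel1'], activation='relu',
                      input_shape=input_shape),
        layers.Conv2D(params['filter1'], params['kernel1'], padding='same', activation='relu'),
        layers.MaxPooling2D(pool_size=(2, 2)),
        # Step 5: two more convolutional layers and pooling.
        layers.Conv2D(params['filter2'], params['kernel2'], activation='relu'),
        layers.Conv2D(params['filter2'], params['kernel2'], padding='same', activation='relu'),
        layers.MaxPooling2D(pool_size=(2, 2)),
        # Steps 6-7: dropout, then flatten the feature maps.
        layers.Dropout(params['dropout1']),
        layers.Flatten(),
        # Steps 8-10: dense layer with ReLU, dropout, and a softmax output layer.
        layers.Dense(params['units1'], activation='relu'),
        layers.Dropout(params['dropout2']),
        layers.Dense(num_classes, activation='softmax'),
    ])
    model.compile(optimizer='adam', loss='sparse_categorical_crossentropy', metrics=['accuracy'])
    # Step 11: train with the GA-chosen number of epochs and return the trained model.
    model.fit(x_train, y_train, epochs=params['epoch'], validation_split=0.1,
              callbacks=[early_stop], verbose=0)
    return model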
C The Genetic Algorithm
The genetic algorithm (GA) is used to determine the combination of hyperparameters for the convolutional neural network model that achieves the highest accuracy in classifying signatures. It begins by randomly initializing a population of network configurations, where the genes of each configuration are a set of hyperparameters for the convolutional neural network. The fitness of each network is evaluated by training and testing the corresponding convolutional neural network model on the signature data. The genetic algorithm then applies selection, crossover, and mutation operations to the population. Selection favors networks with higher fitness, allowing them to pass their genetic material (hyperparameters) to the next generation.
Crossover combines the genetic material of two parent networks to create new child networks, potentially inheriting beneficial hyperparameter combinations. Mutation introduces random changes to the hyperparameters of the networks, promoting exploration of the search space. This process of evaluating the fitness, selecting the best networks, generating new networks through crossover, and introducing mutations is repeated for multiple generations. The algorithm aims to iteratively improve the population by evolving networks with better performance.
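A high-level sketch of this generational loop is given below; it assumes helper functions (fitness, selection, crossover, mutation) of the kind outlined in the following subsections, and the number of generations is an illustrative choice.

def evolve(population, x_train, y_train, x_test, y_test, generations=10):
    # Repeat evaluation, selection, crossover, and mutation for several generations.
    for _ in range(generations):
        population = fitness(population, x_train, y_train, x_test, y_test)
        population = selection(population)
        population = crossover(population)
        population = mutation(population)
    # Evaluate the final population and return the best configuration found.
    population = fitness(population, x_train, y_train, x_test, y_test)
    return max(population, key=lambda net: net['accuracy'])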
Table I. Randomly Generated Initial Population.

Hyperparameter   Range
Epoch            Random (2, 25)
Filter Size      Choice (16, 32, 64, 96)
Kernel Size      Choice [(3x3), (5x5)]
Units            Choice (128, 256, 512)
Dropout          Choice (0, 0.25, 0.50)

a) Initialization: The constructor initializes the hyperparameters randomly, including the number of epochs, filter size, kernel size, dropout rate, activation function, loss function, optimizer, and accuracy, and returns a dictionary that contains the current values of the hyperparameters. A CNN model is then built based on the given hyperparameters, using a combination of convolutional, pooling, dropout, and dense layers, and compiled with a specified optimizer, loss function, and metrics. This method initializes the attributes of an instance with random or predefined values, as listed in Table I. The attributes used include the following:
1) epoch: an integer attribute randomly initialized between 1 and 25.
2) filter1 and filter2: integer attributes randomly chosen from the values 64, 32, and 16.
3) units1: an integer attribute randomly chosen from the values 128, 256, and 512.
4) kernel1 and kernel2: tuple attributes randomly chosen from the values (3, 3) and (5, 5).
5) dropout1 and dropout2: float attributes randomly chosen from the values 0.25 and 0.5.
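A sketch of this random initialization is shown below, using the value ranges listed in Table I; the dictionary keys and the population size are illustrative assumptions.

import random

def random_individual():
    # Draw each hyperparameter from the ranges listed in Table I.
    return {
        'epoch': random.randint(2, 25),
        'filter1': random.choice([16, 32, 64, 96]),
        'filter2': random.choice([16, 32, 64, 96]),
        'kernel1': random.choice([(3, 3), (5, 5)]),
        'kernel2': random.choice([(3, 3), (5, 5)]),
        'units1': random.choice([128, 256, 512]),
        'dropout1': random.choice([0, 0.25, 0.50]),
        'dropout2': random.choice([0, 0.25, 0.50]),
        'accuracy': 0.0,  # filled in later by the fitness function
    }

population = [random_individual() for _ in range(10)]  # population size of 10 is an assumption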
b) Fitness Function: The fitness function represents the fitness of each network in the population. It trains and tests the Convolutional Neural Network model with the given hyperparameters, calculates its accuracy, and stores it; in addition, the function prints the accuracy and a classification report for each network. The fitness function evaluates each network in the network list by training and evaluating a convolutional neural network model for each parameter configuration. The steps performed by the function are as follows:
1) Each individual's dictionary is taken from the population list.
2) For convenience, the parameter values are read from the dictionary and assigned to the corresponding variables.
3) A CNN model is created and trained using the CNN model function with the extracted parameter values and input data.
4) The performance of the trained model is evaluated using the evaluation method on the test data, and the accuracy score is stored in the accuracy attribute of the network.
5) The accuracy of the model is obtained as a percentage.
6) Predictions are generated using the trained model, and classification reports comparing predicted labels with true labels are obtained.
7) The updated list of networks is returned, including the accuracy values for each network.
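A compact sketch of this fitness evaluation is shown below; it reuses the build_and_train_cnn helper sketched earlier, and the classification report assumes scikit-learn is available.

import numpy as np
from sklearn.metrics import classification_report

def fitness(population, x_train, y_train, x_test, y_test):
    for net in population:
        # Build and train a CNN with this individual's hyperparameters.
        model = build_and_train_cnn(net, x_train, y_train)
        # Evaluate on the test data and store the accuracy as a percentage.
        _, accuracy = model.evaluate(x_test, y_test, verbose=0)
        net['accuracy'] = accuracy * 100
        # Print the accuracy and a classification report of predicted vs. true labels.
        predictions = np.argmax(model.predict(x_test, verbose=0), axis=1)
        print(net['accuracy'])
        print(classification_report(y_test, predictions))
    return population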
c) Selection: The selection function performs selection by sorting the population based on the accuracy of each network and retaining the top individuals; the number of individuals selected is equal to the population size.
1) The population list is sorted in descending order based on the individuals' accuracy attributes.
2) The top individuals are selected from the sorted population based on a specified percentage or number.
3) The selected population is then returned.
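A sketch of this selection step is given below; keeping exactly population_size individuals (so that the population returns to its original size after crossover adds offspring) is one reading of the description above, not a detail stated explicitly.

def selection(population, population_size=10):
    # Sort in descending order of accuracy and keep the top individuals.
    ranked = sorted(population, key=lambda net: net['accuracy'], reverse=True)
    return ranked[:population_size]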
d) Crossover: The crossover function performs crossover by randomly selecting two parent networks from the population and creating two child networks. The crossover process follows these steps for each pair of parents drawn from the population:
1) The total number of attributes (hyperparameters) in the parents is divided in half.
2) The first half of the attributes from the first parent is assigned to the corresponding attributes of the second child.
3) The first half of the attributes from the second parent is assigned to the corresponding attributes of the first child.
4) The offspring list, which contains the two newly created child networks, is combined with the current population, forming a new population.
5) The new population is returned.
This process enables the exchange of genetic information between parent networks, allowing the child networks to inherit certain hyperparameters from their parents. By combining attributes from different parents, crossover promotes the exploration and exploitation of potential solutions within a population.
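A sketch of this crossover step is given below; the list of hyperparameter names is assumed to match the attributes introduced in the initialization step.

import random

HYPERPARAMS = ['epoch', 'filter1', 'filter2', 'kernel1', 'kernel2',
               'units1', 'dropout1', 'dropout2']

def crossover(population):
    # Randomly pick two parents and start the children as copies of them.
    parent_a, parent_b = random.sample(population, 2)
    child_a, child_b = dict(parent_a), dict(parent_b)
    half = len(HYPERPARAMS) // 2
    # Exchange the first half of the attributes between the two children.
    for key in HYPERPARAMS[:half]:
        child_a[key] = parent_b[key]
        child_b[key] = parent_a[key]
    # The two offspring join the current population.
    return population + [child_a, child_b]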
e) Mutation: The mutation function is responsible for introducing random mutations into the hyperparameters of the networks within the population. The mutation process follows these steps for each newly generated child from the previous step:
1) A random uniform function is used to generate a random number between 0 and 1.
2) A random integer function is used to generate a random integer within the specified range.
3) The random module is imported at the beginning of the code to access these functions.
4) The "epoch" and "units" attributes of each individual are modified if the generated random number is less than or equal to 0.1.
5) Finally, the modified population is returned.
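A sketch of this mutation step, using the 0.1 threshold described above and the value ranges from Table I:

import random

def mutation(population, rate=0.1):
    for net in population:
        # With probability 0.1, re-randomize the epoch and units attributes.
        if random.uniform(0, 1) <= rate:
            net['epoch'] = random.randint(2, 25)
            net['units1'] = random.choice([128, 256, 512])
    return population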
D Training and Testing Stage
In the training and testing stages of the signature verification process, our objective was to develop an efficient model for offline signature verification using a genetic algorithm and to evaluate its performance on multiple datasets. Table II lists the datasets used in the proposed model.
Table II. Datasets Used in the Proposed System.

Type GPDS-300 CEDAR Bengali Hindi


Signers 300 100 100 160
Genuine 24 24 24 24
Forged 30 24 30 30
Training 10200 1540 3400 5440
Testing 6000 1100 2000 3200

Our study employed Convolutional Neural Networks (CNNs), a deep-learning technique known for its effectiveness in feature extraction for signature verification. However, manually designing CNN models often leads to suboptimal results because of the difficulty of determining the optimal architecture and hyperparameters. To address this challenge, we propose the use of a genetic algorithm, inspired by natural selection, to evolve a population of CNN models. The genetic algorithm explores different combinations of architectural configurations, such as the number and size of convolutional layers, pooling layers, fully connected layers, activation functions, and regularization techniques. The fitness of each CNN model is evaluated using a fitness function that considers metrics such as accuracy, loss, and convergence speed. The algorithm selects the most promising models based on their fitness scores and applies genetic operators such as crossover and mutation to create a new generation of models. This evolutionary process continues, gradually improving the fitness of the models, until the genetic algorithm identifies the CNN architecture with the best performance on the training dataset. Once the optimal architecture is determined, we proceed to the testing phase, where we evaluate the selected model on multiple datasets, including BHSig260-Bengali, BHSig260-Hindi, GPDS [40], and CEDAR, to ensure the robustness and generalization of our approach.

IV. RESULTS

During testing, we computed various performance metrics, such as the False Rejection Rate (FRR), False Acceptance Rate (FAR), Equal Error Rate (EER), and overall accuracy, to assess the effectiveness of our method. Our experimental results demonstrated impressive discrimination rates, with an FRR of 2.5%, FAR of 3.2%, EER of 2.35%, and an accuracy of 97.73%. These findings highlight the effectiveness of our GA-based approach in designing highly efficient and accurate offline signature verification models. The results are summarized in Table III, which presents the parameters and accuracy during the testing stage. Our study underscores the significance of leveraging genetic algorithms to optimize CNN architectures in signature verification systems. The superior performance of our proposed method has potential for various real-world applications that require reliable and non-intrusive signature verification.

V. DISCUSSION

Each dataset was subdivided into two parts: a training set and a testing set. The performance of the proposed system was measured using three global measures. Accuracy is the ratio of the number of correctly categorized signatures to the total number of signatures. The False Acceptance Rate (FAR) and False Rejection Rate (FRR) measure the forged and genuine signatures, respectively, that are incorrectly
classified. The Equal Error Rate (EER) is applied to evaluate the equilibrium point at which the FRR equals the FAR; a lower EER indicates better performance of the model.
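For reference, these measures can be written as follows (a standard formulation consistent with the description above), where $N_G$ and $N_F$ denote the numbers of genuine and forged test signatures:

\mathrm{FRR} = \frac{\#\{\text{genuine signatures rejected}\}}{N_G}, \qquad
\mathrm{FAR} = \frac{\#\{\text{forged signatures accepted}\}}{N_F}, \qquad
\mathrm{Accuracy} = \frac{\#\{\text{correctly classified signatures}\}}{N_G + N_F},

and the EER is the common error rate at the operating point where FAR = FRR.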
The results obtained from the proposed method of constructing the CNN model using the genetic algorithm were compared with those of other methods that use hand-built CNN models and with the results of other studies. Table IV presents a comparison of the performance of different methods with our method on the CEDAR dataset in terms of FAR, FRR, EER, and accuracy; our method outperforms the other methods on all four measures, indicating its superior performance on the CEDAR dataset. The "Surroundedness Features" method [11] achieves a False Acceptance Rate (FAR) of 8.33%, a False Rejection Rate (FRR) of 8.33%, an Equal Error Rate (EER) of 8.33%, and an accuracy of 91.67%. The "Multi-Path Siamese (MA-SCN)" method [41] yields an FRR of 18.35%, FAR of 19.21%, EER of 18.92%, and accuracy of 80.75%. The "Siamese CNN" method [42] achieves a FAR of 6.78%, FRR of 4.20%, and accuracy of 95.66%. In comparison, our proposed method outperforms these approaches with a FAR of 2.5%, FRR of 2.2%, EER of 2.35%, and accuracy of 97.73%.
Table III. The Parameter Settings of the Proposed System.

Parameter                GPDS-300            CEDAR               BHSig260-B          BHSig260-H
Max Epochs               20                  21                  12                  14
Parameters               2,402,731           2,545,911           2,426,120           1,241,250
Layer 1                  Conv2D (32, 3x3)    Conv2D (32, 3x3)    Conv2D (64, 3x3)    Conv2D (32, 3x3)
Layer 2 & MaxPool(2, 2)  Conv2D (32, 3x3)    Conv2D (64, 3x3)    Conv2D (32, 3x3)    Conv2D (32, 3x3)
Layer 3 & MaxPool(2, 2)  Conv2D (32, 3x3)    Conv2D (32, 3x3)    Conv2D (64, 3x3)    Conv2D (32, 3x3)
Layer 4 & MaxPool(2, 2)  Conv2D (32, 3x3)    Conv2D (64, 3x3)    Conv2D (32, 3x3)    Conv2D (32, 3x3)
Dropout                  0.50                0.25                0.25                0.50
Flatten                  512                 264                 512                 256
Accuracy                 0.93                0.977               0.958               0.922
Table IV. Comparison Results for the CEDAR Dataset (values in %).

Method                            FAR     FRR     EER
Surroundedness Features [11]      8.33    8.33    8.33
Multi-Path (MA-SCN) [41]          19.21   18.35   18.92
Siamese CNN [42]                  6.78    4.20    –
Our method                        2.5     2.2     2.35

Table V provides a comparison of different methods, including "CNN-GP" [43], "GoogLeNet Inception-v1 and Inception-v3" [44], and our method, on the GPDS-300 dataset in terms of FAR, FRR, EER, and accuracy. The "CNN-GP" method achieves a FAR of 9.08%, and a specific FAR value for the "GoogLeNet Inception-v1 and Inception-v3" methods is not provided; in contrast, our method achieved a FAR of 9.1%. The "CNN-GP" method has an FRR of 20.60%, the specific FRR value for the "GoogLeNet Inception-v1 and Inception-v3" methods is not provided, and our method
achieves an FRR of 20%. Furthermore, the "CNN-GP" method has an EER of 12.83% and the "GoogLeNet Inception-v1 and Inception-v3" methods have an EER of 26%, whereas our method achieves an EER of 11%. Finally, the "CNN-GP" method has an accuracy of 92%, the "GoogLeNet Inception-v1 and Inception-v3" methods have an accuracy of 72%, and our method achieves an accuracy of 93%. Figures 5, 6, 7, and 8 show the loss, validation loss, validation accuracy, and accuracy curves obtained by the proposed study for each of the datasets used.
Table V. Comparison Results for the GPDS-300 Dataset (values in %).

Method                       FAR     FRR     EER
CNN-GP [43]                  9.08    20.60   12.83
GoogLeNet V1 and V3 [44]     –       –       26
Our method                   9.1     20      11

Tables VI and VII compare the results of other studies with the performance of our method on the BHSig260-B and BHSig260-H datasets. The "Multi-Path Siamese (MA-SCN)" method [41] achieves FAR values of 5.73% and 9.96% for BHSig260-B and BHSig260-H, respectively. The "Siamese CNN" method [42] has FAR values of 14.25% and 12.29% for the respective datasets. For the "Multi-scripted with CNN" method [10], the FAR values were 1.50% and 2.31% for BHSig260-B and BHSig260-H, respectively. In contrast, our method achieved FAR values of 1.3% and 6.8% for the same datasets. Regarding the FRR, the "Multi-Path Siamese (MA-SCN)" method achieved rates of 4.86% and 5.85% for BHSig260-B and BHSig260-H, respectively, the "Siamese CNN" method has FRR values of 6.41% and 9.6% for the respective datasets, and the "Multi-scripted with CNN" method achieves FRR values of 3.14% and 6.65%. In contrast, our method achieved FRR values of 2.1% and 4.7% for BHSig260-B and BHSig260-H, respectively. For the EER metric, the "Multi-Path Siamese (MA-SCN)" method achieved rates of 8.18% and 5.32% for BHSig260-B and BHSig260-H, respectively, whereas the "Siamese CNN" and "Multi-scripted with CNN" methods do not provide EER values. In comparison, the proposed method achieved EER values of 1.7% and 5.2% for the respective datasets. In terms of accuracy, the "Multi-Path Siamese (MA-SCN)" method achieved accuracy rates of 94.99% for BHSig260-B and 92% for BHSig260-H, the "Siamese CNN" method achieves accuracy rates of 90.64% and 88.98% for the respective datasets, and the "Multi-scripted with CNN" method achieves accuracy rates of 95% and 90%. Our method outperformed the other methods, achieving the highest accuracy rates of 95.82% for BHSig260-B and 92.26% for BHSig260-H.
BHSig260-B and 92.26%- for BHSig260-H.

Table VI. Comparison Results for the BHSig260-B Dataset (values in %).

Method                              FAR     FRR     EER
Multi-Path Siamese (MA-SCN) [41]    5.73    4.86    8.18
Siamese CNN [42]                    14.25   6.41    –
Multi-scripted with CNN [10]        1.50    3.14    –
Our method                          1.3     2.1     1.7
Table VII. Comparison Results for the BHSig260-H Dataset (values in %).

Method                              FAR     FRR     EER
Multi-Path Siamese (MA-SCN) [41]    9.96    5.85    5.32
Siamese CNN [42]                    12.29   9.6     –
Multi-scripted with CNN [10]        2.31    6.65    –
Our method                          6.8     4.7     5.2

Fig. 5. The Resulting CEDAR Dataset

Fig. 6. The Resulting BHSig260-H.



Fig. 7. The Resulting BHSig260-B.


Fig. 8. The Resulting GPDS-300.

VI. CONCLUSION

This study highlights the significance of the signature verification process in various applications, such as financial operations,


electronic document signing, and identity verification in computer systems. Compared with other biometric methods, signature verification offers community acceptance and is less invasive. Deep learning (DL) and Convolutional Neural Networks (CNNs) have significantly contributed to the advancement of signature verification systems by effectively extracting features from signatures. However, the optimization of hyperparameters for CNN models remains a challenging task in the design of highly efficient models with accurate results. Currently, CNN models are predominantly designed manually, which can be time-consuming and may not yield the best possible outcomes. To address this challenge, the proposed method utilizes a genetic algorithm to evolve a population of CNN models and identify the most suitable architecture for offline signature verification. The model was evaluated using multiple datasets, including BHSig260-Bengali, BHSig260-Hindi, GPDS, and CEDAR. The results of the proposed approach demonstrated its effectiveness, with the highest discrimination rates achieved: the False Rejection Rate (FRR) was 2.5%, the False Acceptance Rate (FAR) was 3.2%, the Equal Error Rate (EER) was 2.35%, and the accuracy was 97.73%. In summary, the utilization of a genetic algorithm for optimizing the architecture of CNN models in signature verification leads to improved discrimination rates and accuracy. This study contributes to the development of highly efficient offline signature verification systems with potential applications in various domains. Future work could focus on further enhancing the proposed method by exploring additional techniques for hyperparameter optimization. In addition, investigating the
generalizability of the developed model by testing it on larger and more diverse datasets would be valuable. Moreover, considering the robustness of the model against various types of signature forgeries and exploring methods to mitigate potential vulnerabilities is an important direction for future research. Finally, incorporating real-time processing capabilities and evaluating the model's performance on streaming data can be explored to enhance its practical applicability in real-world scenarios.

REFERENCES

[1] A. K. Jain, A. Ross, and K. Nandakumar, Introduction to Biometrics. Springer, 2016.
[2] D. Impedovo and G. Pirlo, "Automatic signature verification: The state of the art," IEEE Transactions on Systems, Man, and Cybernetics, Part C (Applications and Reviews), vol. 38, no. 5, pp. 609–635, 2008.
[3] C.-L. Liu and Y. H. Yin, "Offline handwritten signature verification – literature review," Pattern Recognition, vol. 40, no. 8, pp. 2293–2307, 2007.
[4] M. A. Ferrer and C. M. Travieso, "Offline signature verification: An overview and some recent advances," Pattern Recognition Letters, vol. 34, no. 3, pp. 249–256, 2013.
[5] B. M. Al-Maqaleh and A. M. Musleh, "An efficient offline signature verification system using local features," International Journal of Computer Applications, vol. 131, no. 10, pp. 39–44, 2015.
[6] L. G. Hafemann, R. Sabourin, and L. S. Oliveira, "Offline handwritten signature verification – literature review," in Seventh International Conference on Image Processing Theory, Tools and Applications (IPTA), pp. 1–8, IEEE, Nov. 2017.
[7] R. Verma and D. Rao, "Offline signature verification and identification using angle feature and pixel density feature and both method together," International Journal of Soft Computing and Engineering, vol. 2, no. 4, pp. 740–746, 2013.
[8] L. V. Batista, D. Rivard, R. Sabourin, and P. Maupin, "State of the art in off-line signature verification," in Iberoamerican Congress on Pattern Recognition, pp. 227–234, Springer, 2009.
[9] Y. Muhtar, W. Kang, A. Rexit, and K. Ubul, "A survey of offline handwritten signature verification based on deep learning," in 2022 3rd International Conference on Pattern Recognition and Machine Learning (PRML), pp. 391–397, IEEE, July 2022.
[10] T. Longjam, D. R. Kisku, and P. Gupta, "Multi-scripted writer independent off-line signature verification using convolutional neural network," Multimedia Tools and Applications, pp. 1–18, Aug. 2022.
[11] S. Pal, A. Alaei, U. Pal, and M. Blumenstein, "Performance of an off-line signature verification method based on texture features on a large indic-script signature dataset," in 12th IAPR Workshop on Document Analysis Systems, pp. 72–77, IEEE, April 2016.
[12] A. Foroozandeh, A. Hemmat, and H. Rabbani, "Offline handwritten signature verification and recognition based on deep transfer learning," in International Conference on Machine Vision and Image Processing (MVIP), pp. 1–7, IEEE, Feb. 2020.
[13] N. Sharma, S. Gupta, P. Mehta, X. Cheng, A. Shankar, P. Singh, and S. R. Nayak, "Offline signature verification using deep neural network with application to computer vision," Journal of Electronic Imaging, vol. 31, pp. 041210-1–041210-10, Jul. 2022.
[14] Y. LeCun, Y. Bengio, and G. Hinton, "Deep learning," Nature, vol. 521, pp. 436–444, May 2015.
[15] C. Szegedy, W. Liu, Y. Jia, P. Sermanet, S. Reed, D. Anguelov, and A. Rabinovich, "Going deeper with convolutions," in Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, pp. 1–9, 2015.
[16] K. He, X. Zhang, S. Ren, and J. Sun, "Deep residual learning for image recognition," in Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, pp. 770–778, 2016.
[17] E. Parcham, M. Ilbeygi, and M. Amini, "CBCapsNet: A novel writer-independent offline signature verification model using a CNN-based architecture and capsule neural networks," Expert Systems with Applications, p. 115649, Dec. 2021.
[18] G. Huang, Z. Liu, K. Q. Weinberger, and L. van der Maaten, "Densely connected convolutional networks," in Proceedings of the 2017 IEEE Conference on Computer Vision and Pattern Recognition, Honolulu, HI, USA, pp. 2261–2269, 2017.
[19] A. Krizhevsky, I. Sutskever, and G. E. Hinton, "ImageNet classification with deep convolutional neural networks," Communications of the ACM, vol. 60, no. 6, pp. 84–90, 2017.
[20] C. Clark and A. Storkey, "Training deep convolutional neural networks to play Go," in International Conference on Machine Learning, pp. 1766–1774, PMLR, Jun. 2015.
[21] N. Purohit, S. Purohit, and C. S. Satsangi, "Offline handwritten signature verification using template matching and clustering technique," International Journal of Computer Science and Mobile Computing, vol. 2, pp. 295–301, Apr. 2014.
[22] K. Simonyan and A. Zisserman, "Very deep convolutional networks for large-scale image recognition," arXiv preprint arXiv:1409.1556, Sep. 2014.
[23] F. Johnson, A. Valderrama, C. Valle, B. Crawford, R. Soto, and R. Ñanculef, "Automating configuration of convolutional neural network hyperparameters using genetic algorithm," IEEE Access, vol. 8, pp. 156139–156152, 2020.
[24] J. Donahue, L. A. Hendricks, S. Guadarrama, M. Rohrbach, S. Venugopalan, K. Saenko, and T. Darrell, "Long-term recurrent convolutional networks for visual recognition and description," in Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, pp. 2625–2634, 2015.
[25] S. Dey, A. Dutta, J. I. Toledo, S. K. Ghosh, J. Lladós, and U. Pal, "SigNet: Convolutional Siamese network for writer independent offline signature verification," arXiv preprint arXiv:1707.02131, 2017.
[26] C. Yinka-Banjo and C. Okoli, "Signature verification using Siamese convolutional neural networks," Covenant Journal of Informatics and Communication Technology, 2019.
[27] E. G. Arhore, M. Yasaee, and I. Dayyani, "Optimisation of convolutional neural network architecture using genetic algorithm for the prediction of adhesively bonded joint strength," Structural and Multidisciplinary Optimization, vol. 65, no. 9, pp. 1–16, 2022.
[28] F. M. Alsuhimat and F. S. Mohamad, "A hybrid method of feature extraction for signatures verification using CNN and HOG: A multi-classification approach," IEEE Access, vol. 11, pp. 21873–21882, 2023.
[29] C. Li, J. Jiang, Y. Zhao, R. Li, E. Wang, X. Zhang, and K. Zhao, "Genetic algorithm based hyper-parameters optimization for transfer convolutional neural network," in International Conference on Advanced Algorithms and Neural Networks (AANN 2022), vol. 12285, pp. 232–241, SPIE, Jun. 2022.
[30] O. M. Assim and A. M. Alkababji, "CNN and genetic algorithm for finger vein recognition," in 2021 14th International Conference on Developments in Systems Engineering (DeSE), pp. 503–508, IEEE, Dec. 2021.
[31] X. Yang, X. Zeng, H. Fu, and Y. Zhang, "Selection of features for signature verification using the genetic algorithm," Computers and Industrial Engineering, vol. 30, no. 4, pp. 1037–1045, 1996.
[32] V. E. Ramesh and M. Narasimha, "Off-line signature verification using genetically optimized weighted features," Pattern Recognition, vol. 32, no. 2, pp. 217–233, 1999.
[33] D. P. Sudharshan and R. N. Vismaya, "Handwritten signature verification system using deep learning," in 2022 IEEE International Conference on Data Science and Information System (ICDSIS), pp. 1–5, IEEE, Jul. 2022.
[34] A. S. Mondal, "Evolution of convolution neural network architectures using genetic algorithm," in 2020 IEEE Congress on Evolutionary Computation (CEC), pp. 1–8, IEEE, Jul. 2020.
[35] L. G. Hafemann, R. Sabourin, and L. S. Oliveira, "Learning features for offline handwritten signature verification using deep convolutional neural networks," Pattern Recognition, vol. 70, pp. 163–176, 2017.
[36] V. Malekian, A. Aghaei, M. Rezaeian, and M. Alian, "Rapid offline signature verification based on signature envelope and adaptive density partitioning," in 2013 First Iranian Conference on Pattern Recognition and Image Analysis (PRIA), pp. 1–6, IEEE, Mar. 2013.
[37] A. M. Q. Musleh and A. M. O. Al-Azzani, "Developing a model for offline signature verification using CNN architectures and genetic algorithm," IEEE Access, vol. 1, no. 3, pp. 1–1, 2023.
[38] V. Nair and G. E. Hinton, "Rectified linear units improve restricted Boltzmann machines," in Proceedings of the 27th International Conference on Machine Learning (ICML-10), pp. 807–814, 2010.
[39] I. Goodfellow, Y. Bengio, and A. Courville, Deep Learning. MIT Press, 2016.
[40] M. A. Ferrer, J. F. Vargas, A. Morales, and A. Ordonez, "Robustness of offline signature verification based on gray level features," IEEE Transactions on Information Forensics and Security, vol. 7, no. 3, pp. 966–977, 2012.
[41] X. Zhang, Z. Wu, L. Xie, Y. Li, F. Li, and J. Zhang, "Multi-path Siamese convolution network for offline handwritten signature verification," in 2022 The 8th International Conference on Computing and Data Engineering, pp. 51–58, Jan. 2022.
[42] W. Xiao and Y. Ding, "A two-stage Siamese network model for offline handwritten signature verification," Symmetry, vol. 14, no. 6, p. 1216, 2022.
[43] L. G. Hafemann, R. Sabourin, and L. S. Oliveira, "Writer-independent feature learning for offline signature verification using deep convolutional neural networks," in 2016 International Joint Conference on Neural Networks (IJCNN), pp. 2576–2583, IEEE, 2016.
[44] S. M. Sam, K. Kamardin, N. N. A. Sjarif, and N. Mohamed, "Offline signature verification using deep learning convolutional neural network (CNN) architectures GoogLeNet Inception-v1 and Inception-v3," Procedia Computer Science, vol. 161, pp. 475–483, 2019.
