A Fine-Tuned Transfer Learning Model Based on CNN Deep Learning for Handwritten Character and Numeric Recognition with Performance Evaluation

Anuj Bhardwaj

Abstract

Deep learning plays a vital role in rapidly building automatic and precise systems for image recognition and classification. Handwritten character classification is also gaining attention for its major contribution to automation, especially in developing applications that help visually impaired people. This paper adapts existing pre-trained deep convolutional neural network (DCNN) models to identify Devanagari handwritten characters, aided by improved augmentation techniques. We used a publicly available Devanagari handwritten character dataset consisting of 5,800 isolated images across 58 unique character classes: 12 vowels, 36 consonants, and 10 numerals. In this paper, ten distinct pre-trained CNN models (ResNet50, AlexNet, DenseNet-121, DenseNet-169, DenseNet-201, VGG-11, VGG-16, VGG-19, InceptionResNetV2, and Inception V3) are trained on the Devanagari handwritten character dataset. Model performance is assessed using precision, recall, F1 score, accuracy, and AUROC score. Results are computed and analyzed based on validation accuracy after the 1st and the 10th epoch. After the 1st epoch, accuracy across models varies from 73.10% to 99.07%, while after the 10th epoch it varies from 89.02% to 99.11%. The Inception V3 model performed best on the considered dataset, with 99.09% accuracy.
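The per-class evaluation metrics named above (precision, recall, F1 score, and accuracy) can be sketched for a multi-class setting such as the 58 Devanagari classes. This is a minimal illustration using macro averaging; the abstract does not state which averaging scheme the authors used, so that choice is an assumption here.

```python
from collections import Counter

def classification_metrics(y_true, y_pred, classes):
    """Macro-averaged precision, recall, F1, and overall accuracy.

    A sketch of the metrics listed in the abstract; macro averaging
    over classes is assumed, not confirmed by the paper.
    """
    tp, fp, fn = Counter(), Counter(), Counter()
    for t, p in zip(y_true, y_pred):
        if t == p:
            tp[t] += 1          # correct prediction for class t
        else:
            fp[p] += 1          # predicted p, but true label differs
            fn[t] += 1          # missed an instance of class t
    precisions, recalls, f1s = [], [], []
    for c in classes:
        prec = tp[c] / (tp[c] + fp[c]) if (tp[c] + fp[c]) else 0.0
        rec = tp[c] / (tp[c] + fn[c]) if (tp[c] + fn[c]) else 0.0
        f1 = 2 * prec * rec / (prec + rec) if (prec + rec) else 0.0
        precisions.append(prec)
        recalls.append(rec)
        f1s.append(f1)
    n = len(classes)
    accuracy = sum(tp.values()) / len(y_true)
    return sum(precisions) / n, sum(recalls) / n, sum(f1s) / n, accuracy
```

In practice one would feed the model's argmax predictions over the held-out split into such a function after each epoch to obtain the reported figures.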
