ENSEMBLING DEEP CONVOLUTIONAL NEURAL NETWORKS FOR BALINESE HANDWRITTEN CHARACTER RECOGNITION

Authors

  • Desak Ayu Sista Dewi Department of Industrial Engineering, Faculty of Engineering, Universitas Udayana, Bali, Indonesia
  • Dewa Made Sri Arsa Department of Information Technology, Faculty of Engineering, Universitas Udayana, Bali, Indonesia
  • Gusti Agung Ayu Putri Department of Information Technology, Faculty of Engineering, Universitas Udayana, Bali, Indonesia
  • Ni Luh Putu Lilis Sinta Setiawati Department of Industrial Engineering, Faculty of Engineering, Universitas Udayana, Bali, Indonesia

DOI:

https://doi.org/10.11113/aej.v13.19582

Keywords:

Balinese handwritten character, convolutional neural network, ensemble deep learning, recognition, softmax

Abstract

While deep learning has proven its performance across many problems and applications, it also opens new opportunities to promote the heterogeneity of cultures and heritages. Balinese script is a cultural heritage of Bali, used to write palm-leaf manuscripts that contain essential information. Many of these manuscripts have been damaged by age and a lack of maintenance, so digitalization techniques need to be developed. In this study, we propose an ensemble of deep convolutional neural networks to recognize handwritten characters in the Balinese script. We extensively compared various deep convolutional neural network architectures, and the results show that our ensemble methods achieve state-of-the-art performance.
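The abstract does not detail the fusion rule, but a common way to ensemble deep CNN classifiers, consistent with the "softmax" keyword above, is to average the softmax probabilities produced by several independently trained backbones and take the argmax. The PyTorch sketch below illustrates that idea only; the backbone choices (ResNet-18, MobileNetV2), the placeholder NUM_CLASSES, and the function names are illustrative assumptions, not the authors' exact configuration.

    import torch
    import torch.nn.functional as F
    from torchvision import models

    NUM_CLASSES = 133  # hypothetical class count; replace with the dataset's actual number of Balinese characters

    def build_members(num_classes):
        """Build a few standard CNN backbones with classifier heads resized for the
        character classes (an illustrative ensemble, not the paper's exact members)."""
        resnet = models.resnet18(weights=None)
        resnet.fc = torch.nn.Linear(resnet.fc.in_features, num_classes)

        mobilenet = models.mobilenet_v2(weights=None)
        mobilenet.classifier[1] = torch.nn.Linear(mobilenet.classifier[1].in_features, num_classes)

        return [resnet, mobilenet]

    @torch.no_grad()
    def ensemble_predict(members, images):
        """Average the members' softmax probabilities and return the most likely class per image."""
        for m in members:
            m.eval()
        probs = torch.stack([F.softmax(m(images), dim=1) for m in members]).mean(dim=0)
        return probs.argmax(dim=1)

Each member would be trained separately on the character images; at inference time, ensemble_predict(members, batch) fuses their predictions, which is typically more robust than relying on any single backbone.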

Published

2023-08-30

Issue

Vol. 13 No. 3 (2023)

Section

Articles

How to Cite

ENSEMBLING DEEP CONVOLUTIONAL NEURAL NETWORKS FOR BALINESE HANDWRITTEN CHARACTER RECOGNITION. (2023). ASEAN Engineering Journal, 13(3), 133-139. https://doi.org/10.11113/aej.v13.19582