
Article Information

  • Title: Transformer-based Models for Arabic Online Handwriting Recognition
  • Authors: Fakhraddin Alwajih; Eman Badr; Sherif Abdou
  • Journal: International Journal of Advanced Computer Science and Applications (IJACSA)
  • Print ISSN: 2158-107X
  • Online ISSN: 2156-5570
  • Year: 2022
  • Volume: 13
  • Issue: 5
  • DOI: 10.14569/IJACSA.2022.01305102
  • Language: English
  • Publisher: Science and Information Society (SAI)
  • Abstract: Transformer neural networks have increasingly become the neural network design of choice, having recently been shown to outperform state-of-the-art end-to-end (E2E) recurrent neural networks (RNNs). Transformers utilize a self-attention mechanism to relate input frames and extract more expressive sequence representations. Transformers also offer parallel computation and a better ability than RNNs to capture long-range dependencies in context. This work introduces a transformer-based model for the online handwriting recognition (OnHWR) task. As the transformer follows an encoder-decoder architecture, we investigated the self-attention encoder (SAE) with two different decoders: a self-attention decoder (SAD) and a connectionist temporal classification (CTC) decoder. The proposed models can recognize complete sentences without the need to integrate external language modules. We tested our proposed models against two Arabic online handwriting datasets: Online-KHATT and CHAW. On evaluation, the SAE-SAD architecture performed better than the SAE-CTC architecture. The SAE-SAD model achieved a 5% character error rate (CER) and an 18% word error rate (WER) on the CHAW dataset, and a 22% CER and a 56% WER on the Online-KHATT dataset. The SAE-SAD model showed significant improvements over existing models for Arabic OnHWR.
  • Keywords: Self-attention; Transformer; deep learning; connectionist temporal classification; convolutional neural networks; Arabic online handwriting recognition
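The CER and WER figures reported in the abstract are standard Levenshtein-distance-based metrics: the number of character (or word) insertions, deletions, and substitutions needed to turn the hypothesis into the reference, divided by the reference length. A minimal sketch of the character-level metric in plain Python (the function names `edit_distance` and `cer` are illustrative, not taken from the paper):

```python
def edit_distance(ref, hyp):
    """Classic dynamic-programming Levenshtein distance between two sequences."""
    dp = list(range(len(hyp) + 1))  # distances for an empty reference prefix
    for i, r in enumerate(ref, 1):
        prev, dp[0] = dp[0], i  # prev holds the diagonal (previous row, j-1)
        for j, h in enumerate(hyp, 1):
            # min of: deletion (above), insertion (left), substitution/match (diagonal)
            prev, dp[j] = dp[j], min(dp[j] + 1, dp[j - 1] + 1, prev + (r != h))
    return dp[-1]

def cer(ref, hyp):
    """Character error rate: total edits divided by reference length."""
    return edit_distance(ref, hyp) / len(ref)

# WER is the same computation applied to word sequences, e.g.
# edit_distance(ref.split(), hyp.split()) / len(ref.split())
```

A 5% CER, as reported for SAE-SAD on CHAW, corresponds to roughly one character edit per twenty reference characters.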