
Multi-Head Self-Attention and BGRU for Online Arabic Grapheme Text Segmentation

EasyChair Preprint no. 9695

14 pages · Date: February 14, 2023


Segmenting online handwritten Arabic text into graphemes/characters is a challenging yet crucial task for recognition systems, owing to the nature of this script. The process benefits from exploiting the contextual dependencies between segments written before and after the current handwriting. In this paper, we introduce a Multi-Head Self-Attention (MHSA) and Bidirectional Gated Recurrent Unit (BGRU) model for online handwritten Arabic text segmentation that emulates our previous grapheme segmentation model (GSM). The proposed framework consists of a word embedding of the input sequence followed by the complementary combination of multi-head self-attention and a BGRU, which together detect the control points (CPs) for handwritten text segmentation. Each grapheme is delimited by three main CPs: the starting point (SP), the ligature valley point (LVP), and the ending point (EP). To demonstrate the effectiveness of the proposed MHSA-BGRU model for online handwritten segmentation and to compare it with the GSM, the mean absolute error (MAE) and word error rate (WER) evaluation metrics are used. Experimental results on the benchmark ADAB and Online-KHATT datasets show the efficiency of the proposed model, which achieves an MAE of 2.45% and WERs of 90.05% and 81.90%, respectively.
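The multi-head self-attention component named in the abstract can be sketched in NumPy. This is a generic illustration of the MHSA mechanism over a sequence of embedded trajectory points, not the authors' implementation; the sequence length, model width, head count, and weight names are all assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_self_attention(X, Wq, Wk, Wv, Wo, n_heads):
    """Multi-head self-attention over a sequence of point embeddings.

    X: (T, d_model) embedded trajectory points.
    Wq, Wk, Wv, Wo: (d_model, d_model) projection matrices (hypothetical names).
    """
    T, d_model = X.shape
    d_head = d_model // n_heads
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    # Split each projection into heads: (n_heads, T, d_head).
    split = lambda M: M.reshape(T, n_heads, d_head).transpose(1, 0, 2)
    Qh, Kh, Vh = split(Q), split(K), split(V)
    # Scaled dot-product attention per head: (n_heads, T, T).
    scores = Qh @ Kh.transpose(0, 2, 1) / np.sqrt(d_head)
    ctx = softmax(scores) @ Vh                 # (n_heads, T, d_head)
    # Concatenate heads and apply the output projection.
    concat = ctx.transpose(1, 0, 2).reshape(T, d_model)
    return concat @ Wo

rng = np.random.default_rng(0)
T, d_model, n_heads = 16, 32, 4
X = rng.standard_normal((T, d_model))
Wq, Wk, Wv, Wo = (rng.standard_normal((d_model, d_model)) * 0.1 for _ in range(4))
out = multi_head_self_attention(X, Wq, Wk, Wv, Wo, n_heads)
print(out.shape)
```

In the paper's pipeline, an output like this would be combined with a BGRU's hidden states before classifying each point as a control point (SP, LVP, or EP) or not; that fusion step is beyond this sketch.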

Keyphrases: BGRU, grapheme segmentation, multi-head self-attention, online handwriting trajectory, transformer

BibTeX entry
BibTeX does not have the right entry type for preprints. This is a hack for producing the correct reference:

@booklet{EasyChair:9695,
  author = {Yahia Hamdi and Besma Rabhi and Thameur Dhieb and Adel M. Alimi},
  title = {Multi-Head Self-Attention and BGRU for Online Arabic Grapheme Text Segmentation},
  howpublished = {EasyChair Preprint no. 9695},
  year = {EasyChair, 2023}}