
A Self-Attention Based Model for Offline Handwritten Text Recognition

EasyChair Preprint no. 7037

14 pages · Date: November 11, 2021

Abstract

Offline handwritten text recognition is an important part of handwritten document analysis systems and has received considerable attention from researchers for decades. In this paper, we present a self-attention-based model for offline handwritten text line recognition. The proposed model consists of three main components: a CNN feature extractor, an encoder combining a BLSTM with a self-attention module, and a CTC decoder. The self-attention module is complementary to the RNN in the encoder and helps it capture long-range, multi-level dependencies across the input sequence. In extensive experiments on two datasets, IAM Handwriting and Kuzushiji, the proposed model achieves accuracy similar to or better than state-of-the-art models. Visualization of the self-attention maps shows that the self-attention mechanism helps the model capture dependencies between positions at arbitrary distances in the input sequence.
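The abstract describes a CNN → BLSTM-with-self-attention → CTC pipeline. The PyTorch sketch below illustrates one plausible way to wire such a model together; the layer sizes, head count, residual combination of BLSTM and attention outputs, and all names (e.g. SelfAttentionHTR) are illustrative assumptions, not the authors' implementation.

# Sketch: CNN feature extractor -> BLSTM encoder with multi-head
# self-attention -> linear projection trained with CTC loss.
# All hyperparameters below are assumptions for illustration.
import torch
import torch.nn as nn

class SelfAttentionHTR(nn.Module):
    def __init__(self, num_classes, hidden=256, heads=4):
        super().__init__()
        # CNN feature extractor: collapses image height, keeps width as time.
        self.cnn = nn.Sequential(
            nn.Conv2d(1, 64, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2, 2),
            nn.Conv2d(64, 128, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2, 2),
            nn.Conv2d(128, hidden, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d((1, None)),   # -> (B, hidden, 1, W')
        )
        # BLSTM encoder over the width (time) axis.
        self.blstm = nn.LSTM(hidden, hidden // 2, num_layers=2,
                             bidirectional=True, batch_first=True)
        # Multi-head self-attention complements the recurrent encoder,
        # letting each time step attend to arbitrarily distant positions.
        self.attn = nn.MultiheadAttention(hidden, heads, batch_first=True)
        # Per-frame class scores for the CTC decoder.
        self.fc = nn.Linear(hidden, num_classes)

    def forward(self, images):                 # images: (B, 1, H, W)
        f = self.cnn(images).squeeze(2)        # (B, hidden, W')
        f = f.transpose(1, 2)                  # (B, W', hidden)
        h, _ = self.blstm(f)                   # (B, W', hidden)
        a, _ = self.attn(h, h, h)              # self-attention over time
        h = h + a                              # residual combination (assumed)
        return self.fc(h).log_softmax(-1)      # (B, W', num_classes)

# CTC training step on dummy data (class 0 reserved as the CTC blank).
model = SelfAttentionHTR(num_classes=80)
logp = model(torch.randn(2, 1, 64, 256)).transpose(0, 1)  # (T, B, C) for CTC
targets = torch.randint(1, 80, (2, 10))
loss = nn.CTCLoss(blank=0)(logp, targets,
                           torch.full((2,), logp.size(0), dtype=torch.long),
                           torch.full((2,), 10, dtype=torch.long))

At inference time, the CTC output would typically be decoded with best-path (per-frame argmax plus blank/duplicate collapsing) or beam search.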

Keyphrases: BLSTM, CNN, CTC, handwritten text recognition, multi-head, self-attention

BibTeX entry
BibTeX does not have the right entry for preprints. This is a hack for producing the correct reference:
@Booklet{EasyChair:7037,
  author = {Nam Tuan Ly and Trung Tan Ngo and Masaki Nakagawa},
  title = {A Self-Attention Based Model for Offline Handwritten Text Recognition},
  howpublished = {EasyChair Preprint no. 7037},
  year = {EasyChair, 2021}}