Learning Spatial-Semantic Context with Fully Convolutional Recurrent Network for Online Handwritten Chinese Text Recognition
| Published in: | IEEE Transactions on Pattern Analysis and Machine Intelligence. - 1979. - 40(2018), issue 8 (3 Aug.), pp. 1903-1917 |
|---|---|
| First author: | |
| Additional authors: | |
| Format: | Online article |
| Language: | English |
| Published: | 2018 |
| Parent work: | IEEE Transactions on Pattern Analysis and Machine Intelligence |
| Subjects: | Journal Article; Research Support, Non-U.S. Gov't |
| Abstract: | Online handwritten Chinese text recognition (OHCTR) is a challenging problem: it involves a large-scale character set, ambiguous segmentation, and variable-length input sequences. In this paper, we exploit the capability of the path signature to translate online pen-tip trajectories into informative signature feature maps, capturing the analytic and geometric properties of pen strokes with strong local invariance and robustness (a rough sketch of this computation follows the record below). A multi-spatial-context fully convolutional recurrent network (MC-FCRN) is proposed to exploit multiple spatial contexts from the signature feature maps and generate a prediction sequence while entirely avoiding the difficult segmentation problem. Furthermore, an implicit language model is developed to make predictions based on the semantic context within a predicted feature sequence, providing a new perspective for incorporating lexicon constraints and prior knowledge about a given language into the recognition procedure. Experiments on two standard benchmarks, Dataset-CASIA and Dataset-ICDAR, yielded correct rates of 97.50 and 96.58 percent, respectively, significantly better than the best results previously reported in the literature. |
| Description: | Date Revised: 20.11.2019; Published: Print-Electronic; Citation Status: PubMed-not-MEDLINE |
| ISSN: | 1939-3539 |
| DOI: | 10.1109/TPAMI.2017.2732978 |
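
The path-signature features named in the abstract have a compact form for piecewise-linear pen trajectories. The Python sketch below computes the level-2 truncated signature of a single stroke via Chen's identity; the function name `path_signature_level2`, the choice of truncation level, and the use of one global signature per stroke (the paper instead renders locally windowed signatures into multi-channel feature maps) are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def path_signature_level2(points):
    """Level-2 truncated path signature of a piecewise-linear 2-D path.

    Illustrative sketch only. points: iterable of (x, y) pen-tip samples.
    Returns 7 features: 1 (level 0); S^1, S^2 (level 1, the total
    displacement); S^11, S^12, S^21, S^22 (level 2, iterated integrals).
    """
    pts = np.asarray(points, dtype=float)
    s1 = np.zeros(2)        # level-1 signature: running total displacement
    s2 = np.zeros((2, 2))   # level-2 signature: running iterated integrals
    for delta in np.diff(pts, axis=0):
        # Chen's identity for appending one linear segment with
        # displacement `delta`: S2 gains the cross term S1 (x) delta
        # plus the segment's own level-2 term delta (x) delta / 2.
        s2 += np.outer(s1, delta) + np.outer(delta, delta) / 2.0
        s1 += delta
    return np.concatenate(([1.0], s1, s2.ravel()))

# A short three-point stroke yields a 7-dimensional feature vector.
stroke = [(0.0, 0.0), (1.0, 0.5), (2.0, 0.0)]
print(path_signature_level2(stroke))
```

Truncating at level n for a 2-D path gives 2^(n+1) - 1 features (7 here); stacking such values computed over sliding local windows along the trajectory is, in rough outline, how signature feature maps are formed, though the windowing and rasterization details of the paper are not reproduced in this sketch.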