Recently, neural methods have achieved state-of-the-art (SOTA) results in Named Entity Recognition (NER) for many languages without the need for manually crafted features. However, these models still require manually annotated training data, which is unavailable for many languages. In this paper, we propose an unsupervised cross-lingual NER model that transfers knowledge from one language to another in a completely unsupervised way, without relying on any bilingual dictionary or parallel data. Our model achieves this through end-to-end parameter sharing and adaptation to the target domain through fine-tuning. Experiments on four languages demonstrate the effectiveness of our approach, which outperforms existing models by a wide margin and sets a new SOTA for each language pair.
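The transfer idea in the abstract can be illustrated with a toy sketch (this is an assumed illustration, not the paper's implementation): a single tagger is trained on labeled source-language examples embedded in a hypothetical shared multilingual space, then applied unchanged, with no annotated target data, to target-language inputs.

```python
# Toy sketch of zero-resource cross-lingual transfer via parameter sharing.
# All data here is synthetic; the shared embedding space is an assumption.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical shared multilingual space: entity words cluster near one
# prototype and non-entity words near another, regardless of language.
ENT, OTH = np.array([1.0, 1.0]), np.array([-1.0, -1.0])

def embed(label, n):
    proto = ENT if label == 1 else OTH
    return proto + 0.3 * rng.standard_normal((n, 2))

# Source-language training data (labels available).
X_src = np.vstack([embed(1, 50), embed(0, 50)])
y_src = np.array([1] * 50 + [0] * 50)

# Train one shared linear tagger by gradient descent on the logistic loss.
w, b = np.zeros(2), 0.0
for _ in range(200):
    p = 1 / (1 + np.exp(-(X_src @ w + b)))
    g = p - y_src
    w -= 0.1 * (X_src.T @ g) / len(y_src)
    b -= 0.1 * g.mean()

# Target-language data: same shared space, no target labels used in training.
X_tgt = np.vstack([embed(1, 20), embed(0, 20)])
y_tgt = np.array([1] * 20 + [0] * 20)
pred = ((X_tgt @ w + b) > 0).astype(int)
accuracy = (pred == y_tgt).mean()
print(f"zero-resource target accuracy: {accuracy:.2f}")
```

Because the tagger's parameters are shared across languages through the common representation, no bilingual dictionary or parallel data enters the pipeline; the paper's actual model additionally fine-tunes to adapt to the target domain.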
Zero-Resource Cross-Lingual Named Entity Recognition
Saiful Bari, Shafiq Joty, and Prathyusha Jwalapuram. In Thirty-Fourth AAAI Conference on Artificial Intelligence (AAAI'20), pages 7415-7423, 2020.