American Journal of Artificial Intelligence
Machine Translation for Low-Resource Languages
Submission Deadline: May 31, 2020

Submission Guidelines: http://www.sciencepublishinggroup.com/home/submission

Lead Guest Editor
Benyamin Ahmadnia
Department of Computer Science, Tulane University, New Orleans, USA
Guest Editors
  • Bonnie J. Dorr
    Institute for Human and Machine Cognition (IHMC), Ocala, USA
  • Hossein Sarrafzadeh
    Department of Cybersecurity, St. Bonaventure University, St. Bonaventure, USA
  • Javier Serrano
    Department of Telecommunications and Systems Engineering, Universitat Autònoma de Barcelona, Cerdanyola del Vallès, Spain
  • Mahsa Mohaghegh
    School of Engineering, Computer, and Mathematical Sciences, Auckland University of Technology, Auckland, New Zealand
  • Mojtaba Sabbagh-Jafari
    Department of Computer Engineering, Vali-e-Asr University of Rafsanjan, Rafsanjan, Iran
  • Pariya Razmdideh
    Department of Linguistics and Translation Studies, Vali-e-Asr University of Rafsanjan, Rafsanjan, Iran
Machine Translation (MT) has proven successful for several language pairs, but each language brings its own challenges, and low-resource languages have largely been left out of the MT revolution. The central difficulty is data scarcity: such languages often have very few written texts, and those that do exist rarely have parallel counterparts in another language. MT has made significant progress in recent years with the shift to statistical and neural models and the rapid development of new architectures such as the Transformer. However, models trained on little parallel data tend to produce poor-quality translations, and without parallel texts, statistical and neural MT give subpar results. This challenge is exacerbated in the context of social media, where we need to enable communication in languages, including unofficial ones, for which no parallel corpora exist. We are pleased to invite the academic community to respond to this issue on low-resource MT.
Research topics should be relevant to low-resource MT, including, but not limited to:
  • Unsupervised statistical or neural MT for low-resource language pairs
  • Semi-supervised statistical or neural MT for low-resource language pairs
  • Pretraining methods leveraging monolingual data
  • Multilingual statistical or neural MT for low-resource languages
Aims and Scope:
  1. Low-resource Languages
  2. Statistical Machine Translation
  3. Neural Machine Translation
  4. Active Learning
  5. Unsupervised Learning
  6. Semi-supervised Learning
  7. Dual Learning
  8. Round-tripping
  9. Bridge Language
  10. Bootstrapping
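Several of the topics above, notably round-tripping, build on back-translation: monolingual target-side text is translated back into the source language by a reverse model to create synthetic parallel data. The following is a minimal sketch of that idea only; the `translate_tgt_to_src` function is a hypothetical toy stand-in for what would, in practice, be a trained target-to-source MT model.

```python
def translate_tgt_to_src(sentence):
    # Hypothetical reverse model: a toy word-for-word lookup standing in
    # for a real target->source translation system.
    toy_dict = {"hello": "hola", "world": "mundo"}
    return " ".join(toy_dict.get(w, w) for w in sentence.split())

def back_translate(monolingual_tgt):
    """Turn monolingual target-side sentences into synthetic parallel data:
    each target sentence is paired with its machine-generated source."""
    return [(translate_tgt_to_src(t), t) for t in monolingual_tgt]

# Synthetic pairs produced this way are typically mixed with whatever real
# parallel data exists before training the forward (source->target) model.
synthetic = back_translate(["hello world"])
```

The value of the approach for low-resource pairs is that monolingual text is far easier to obtain than parallel text, so even a weak reverse model can enlarge the training set.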
Guidelines for Submission
Manuscripts should be formatted according to the guidelines for authors
(see: http://www.sciencepublishinggroup.com/journal/guideforauthors?journalid=542).

Please download the template to format your manuscript.

Published Papers
Authors: Tianyi Xu, Ozge Ilkim Ozbek, Shannon Marks, Sri Korrapati, Benyamin Ahmadnia
Pages: 42-49 Published Online: Jul. 23, 2020
DOI: 10.11648/j.ajai.20200402.11
Science Publishing Group
1 Rockefeller Plaza,
10th and 11th Floors,
New York, NY 10020
Tel: (001)347-983-5186