Wednesday, July 14, 2021

LOCAL TRANSLATION SERVICES FOR NEGLECTED LANGUAGES

Author: David Noever

Affiliation: Auburn University

Country: USA

Category: Computer Science & Information Technology

Volume, Issue, Month, Year: 11, 01, January, 2021

Abstract:

Taking advantage of computationally lightweight but high-quality translators prompts consideration of new applications that address neglected languages. For projects with protected or personal data, translators for less popular or low-resource languages require specific compliance checks before posting to a public translation API. In these cases, locally run translators can render reasonable, cost-effective solutions if done with an army of offline, small-scale pair translators. Like handling a specialist's dialect, this research illustrates translating two historically interesting but obfuscated languages: 1) hacker-speak ("l33t") and 2) reverse (or "mirror") writing as practiced by Leonardo da Vinci. The work generalizes a deep learning architecture to translatable variants of hacker-speak with lite, medium, and hard vocabularies. The original contribution highlights a fluent translator of hacker-speak in under 50 megabytes and demonstrates a companion text generator for augmenting future datasets with greater than a million bilingual sentence pairs. A primary motivation stems from the need to understand and archive the evolution of the international computer community, one that continuously enhances its talent for speaking openly but in hidden contexts. This training of bilingual sentences supports deep learning models using a long short-term memory, recurrent neural network (LSTM-RNN). It extends previous work demonstrating an English-to-foreign translation service built from as few as 10,000 bilingual sentence pairs. This work further solves the equivalent translation problem in twenty-six additional (non-obfuscated) languages and rank-orders those models' proficiency quantitatively, with Italian as the most successful and Mandarin Chinese as the most challenging.
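The obfuscation schemes named in the abstract can be sketched in a few lines of Python. This is a minimal illustration, not the paper's actual generator: the substitution table below is a hypothetical "lite" vocabulary, and the paper's lite/medium/hard tables are not reproduced here. It shows how character-level substitution (l33t) and reversal (mirror writing) can turn plain English sentences into the bilingual pairs needed to train a translator.

```python
import random

# Hypothetical "lite" l33t substitution table (illustrative only; the
# paper's actual lite/medium/hard vocabularies are not reproduced here).
L33T_LITE = {"a": "4", "e": "3", "i": "1", "o": "0", "t": "7", "s": "5"}

def to_l33t(text, table=L33T_LITE, prob=1.0):
    """Obfuscate text by substituting characters with l33t equivalents.

    `prob` sets the chance that each substitutable character is replaced,
    so one English source sentence can yield many distinct obfuscations."""
    return "".join(
        table[c.lower()] if c.lower() in table and random.random() <= prob else c
        for c in text
    )

def to_mirror(text):
    """Reverse writing in the style of Leonardo da Vinci's notebooks."""
    return text[::-1]

def make_pairs(sentences, prob=1.0):
    """Yield (English, obfuscated) sentence pairs for translator training."""
    for s in sentences:
        yield s, to_l33t(s, prob=prob)
```

With `prob=1.0` the mapping is deterministic (`to_l33t("the quick brown fox")` gives `"7h3 qu1ck br0wn f0x"`); lowering `prob` adds the variation a generator would need to scale a corpus toward a million pairs.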

Keywords: Recurrent Neural Network, Long Short-Term Memory (LSTM) Network, Machine Translation, Encoder-Decoder Architecture, Obfuscation.

For More Details: https://aircconline.com/csit/papers/vol11/csit110110.pdf
