Journal of West African Languages

Human Evaluation of Yorùbá-English Google Translation

Machine translation involves not only translating text from one language to another but also evaluating the output so that improvements, particularly in fluency, accuracy, and efficiency, can be monitored. However, the only freely available Yorùbá-English machine translation system is Google Translate, which has been observed to be grossly inadequate. This paper therefore examines translations produced by Google Translate against human translations in order to investigate why machine translation applications make errors when translating natural language. Although many evaluation metrics exist for this purpose, this paper adopts human evaluation, also known as manual evaluation, which is considered more effective, though costly. The paper uses the Ibadan and Akungba Structured Sentence Paradigm to evaluate the two translators (Google Translate and a human translator). The translations were sent to twenty human evaluators, of whom only eleven responded, and the responses were subjected to statistical analysis. Findings show that human translation fares better in terms of accuracy and fluency, both of which are informed by the quality and quantity of training data. The paper suggests that more data, especially literary texts, should be acquired to train the machine translator for greater efficiency and fluency.




Data

Volume Number 43.1
Topic #1 Translation
Language English
Language Family Other Benue-Congo


Size 510.73 KB
Downloads 852
