Design of English Translation Model Based on Recurrent Neural Network

Xiaohui Wang and Hengchang Jing

Mathematical Problems in Engineering, 2022, vol. 2022, 1-7

Abstract: To improve the accuracy and stability of English translation, this paper proposes an English translation model based on a recurrent neural network (RNN). Built on an end-to-end encoder-decoder architecture, the RNN machine translation model learns features autonomously: it transforms distributed corpus data into word vectors and maps the source language directly to the target language through the recurrent network. During training, the objective function is constructed from semantic errors, which balances the influence of each semantic component, fully accounts for alignment information, and provides strong guidance for training deep recurrent neural networks. Experimental results show that the model is effective and stable, improving on the baseline system by about 1.51–1.86 BLEU points. The model thus improves the performance and quality of English machine translation.
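The encoder-decoder flow the abstract describes can be sketched in miniature: an encoder RNN folds the source word vectors into one fixed-size hidden state, and a decoder RNN unrolls target-language tokens from that state. All vocabularies, weights, and dimensions below are made-up toy values for illustration, not the paper's actual model.

```python
import math

# Hypothetical toy vocabularies and weights (assumed values, not from the paper).
SRC_EMBED = {"hello": [1.0, 0.0], "world": [0.0, 1.0]}  # source word vectors
TGT_VOCAB = ["bonjour", "monde"]
W_X = [[0.5, -0.3], [0.1, 0.4]]     # input-to-hidden weights
W_H = [[0.2, 0.0], [0.0, 0.2]]      # hidden-to-hidden weights
W_OUT = [[1.0, -1.0], [-1.0, 1.0]]  # hidden-to-target-vocab logits

def rnn_step(x, h):
    """One recurrent step: h' = tanh(W_X x + W_H h)."""
    return [math.tanh(sum(W_X[i][j] * x[j] for j in range(len(x))) +
                      sum(W_H[i][j] * h[j] for j in range(len(h))))
            for i in range(len(h))]

def encode(src_tokens):
    """Fold the whole source sentence into one fixed-size hidden state."""
    h = [0.0, 0.0]
    for tok in src_tokens:
        h = rnn_step(SRC_EMBED[tok], h)
    return h

def decode(h, max_len=2):
    """Greedily emit target tokens, feeding each prediction back as input."""
    out = []
    for _ in range(max_len):
        logits = [sum(W_OUT[i][j] * h[j] for j in range(len(h)))
                  for i in range(len(TGT_VOCAB))]
        idx = max(range(len(logits)), key=lambda i: logits[i])
        out.append(TGT_VOCAB[idx])
        one_hot = [1.0 if i == idx else 0.0 for i in range(len(h))]
        h = rnn_step(one_hot, h)  # predicted token becomes the next input
    return out

translation = decode(encode(["hello", "world"]))
```

In a real system the weights would be learned end to end, e.g. by backpropagation through time against the semantic-error objective the abstract mentions; here they are fixed so the forward pass is easy to trace.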

Date: 2022

Downloads: (external link)
http://downloads.hindawi.com/journals/mpe/2022/5177069.pdf (application/pdf)
http://downloads.hindawi.com/journals/mpe/2022/5177069.xml (application/xml)



Persistent link: https://EconPapers.repec.org/RePEc:hin:jnlmpe:5177069

DOI: 10.1155/2022/5177069


More articles in Mathematical Problems in Engineering from Hindawi
Bibliographic data for series maintained by Mohamed Abdelhakeem.

 
Page updated 2025-03-19
Handle: RePEc:hin:jnlmpe:5177069