Deep Neural Transformer Model for Mono and Multi Lingual Machine Translation

dc.contributor.author: Khaber, Mohamed Islam
dc.contributor.author: Frahta, Nabila
dc.contributor.author: Moussaoui, Abdelouahab
dc.contributor.author: Saidi, Mohamed
dc.date.accessioned: 2024-03-12T17:34:45Z
dc.date.available: 2024-03-12T17:34:45Z
dc.date.issued: 2021-05-25
dc.description.abstract: In recent years, the Transformer has emerged as the most relevant deep architecture, especially for machine translation. These models, which are based on attention mechanisms, have outperformed previous neural machine translation architectures on several tasks. This paper proposes a new architecture based on the Transformer model for monolingual and multilingual translation systems. Tests were carried out on the IWSLT 2015 and 2016 datasets. The Transformer's attention mechanism increases accuracy to more than 92%, a gain of more than 4 BLEU points (BLEU is a standard performance metric for machine translation systems).
dc.identifier.isbn: 978-9931-9788-0-0
dc.identifier.uri: http://dspace.univ-oeb.dz:4000/handle/123456789/18721
dc.language.iso: en
dc.publisher: University of Oum El Bouaghi
dc.title: Deep Neural Transformer Model for Mono and Multi Lingual Machine Translation
dc.type: Article
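The abstract above refers to the attention mechanism at the core of the Transformer. For readers unfamiliar with it, below is a minimal NumPy sketch of standard scaled dot-product attention (Vaswani et al., 2017). It illustrates the general mechanism only; the function name, toy dimensions, and random inputs are illustrative assumptions, not the specific mono/multilingual architecture proposed in the paper.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Standard scaled dot-product attention (Vaswani et al., 2017).

    Illustrative sketch only, not the paper's proposed model.
    Q, K: (seq_len, d_k) arrays; V: (seq_len, d_v) array.
    Returns the attended values and the attention weight matrix.
    """
    d_k = Q.shape[-1]
    # Similarity scores between queries and keys, scaled by sqrt(d_k)
    # to keep the softmax in a well-conditioned regime.
    scores = Q @ K.T / np.sqrt(d_k)
    # Numerically stable row-wise softmax over the key dimension.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Each output token is a weighted average of the value vectors.
    return weights @ V, weights

# Toy example: 3 tokens, model dimension 4.
rng = np.random.default_rng(0)
Q = rng.standard_normal((3, 4))
K = rng.standard_normal((3, 4))
V = rng.standard_normal((3, 4))
out, attn = scaled_dot_product_attention(Q, K, V)
print(out.shape, attn.shape)  # (3, 4) (3, 3)
```

In the full Transformer this operation is applied in parallel across several heads, each with its own learned projections of Q, K, and V, and the head outputs are concatenated and projected back to the model dimension.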
Files
Original bundle
Name: Deep Neural Transformer Model for Mono and Multi Lingual Machine Translation.pdf
Size: 441.28 KB
Format: Adobe Portable Document Format
License bundle
Name: license.txt
Size: 1.71 KB
Format: Item-specific license agreed upon to submission