Deep Neural Transformer Model for Mono and Multi Lingual Machine Translation

Authors: Khaber, Mohamed Islam; Frahta, Nabila; Moussaoui, Abdelouahab; Saidi, Mohamed
Date issued: 2021-05-25
Date accessioned/available: 2024-03-12
ISBN: 978-9931-9788-0-0
URI: http://dspace.univ-oeb.dz:4000/handle/123456789/18721
Type: Article
Language: en

Abstract: In recent years, Transformers have emerged as the most relevant deep architecture, especially in machine translation. These attention-based models have outperformed previous neural machine translation architectures on several tasks. This paper proposes a new Transformer-based architecture for monolingual and multilingual translation. Experiments were carried out on the IWSLT 2015 and 2016 datasets. The Transformer attention mechanism raises accuracy to more than 92%, a gain of more than 4 BLEU points (BLEU is a standard evaluation metric for machine translation systems).
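For illustration only (not the authors' code), a minimal NumPy sketch of the scaled dot-product attention at the core of the Transformer architecture the abstract refers to:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    # Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)
    # numerically stable softmax over the key dimension
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V

# toy example: 3 query positions, 4 key/value positions, dimension d_k = 8
rng = np.random.default_rng(0)
Q = rng.normal(size=(3, 8))
K = rng.normal(size=(4, 8))
V = rng.normal(size=(4, 8))
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (3, 8): one weighted combination of values per query
```

Each output row is a convex combination of the value vectors, with weights determined by query-key similarity; multi-head attention in the full Transformer runs several such maps in parallel.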