Neural machine translation (NMT) is an approach to machine translation in which a large neural network is trained by deep learning techniques. It is a radical departure from phrase-based statistical translation approaches, in which a translation system consists of subcomponents that are separately engineered. Google has announced that its translation services are now using Google Neural Machine Translation (GNMT) in preference to its previous statistical methods.
NMT models apply deep representation learning. They require only a fraction of the memory needed by traditional statistical machine translation (SMT) models. Furthermore, unlike conventional translation systems, all parts of the neural translation model are trained jointly (end-to-end) to maximize translation performance.
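As a rough illustration of what "end-to-end" training means, the model's parameters are typically fitted by maximizing the conditional log-likelihood of reference translations over a parallel corpus; the notation below (x for the source sentence, y for the target sentence, θ for all model parameters, D for the training corpus) is generic rather than taken from any particular system:

```latex
\theta^{*} = \arg\max_{\theta} \sum_{(x,\,y)\,\in\,\mathcal{D}} \; \sum_{t=1}^{|y|} \log p_{\theta}\!\left(y_{t} \mid y_{<t},\, x\right)
```

Because a single objective is optimized, gradients flow through every component of the model at once, rather than each subcomponent being engineered and tuned separately as in phrase-based SMT.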
A bidirectional recurrent neural network (RNN), known as the encoder, encodes the source sentence into a representation that a second RNN, known as the decoder, uses to predict words in the target language.
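The sketch below illustrates this encoder–decoder idea in a minimal form. It assumes PyTorch, GRU cells, and toy hyperparameters (vocabulary sizes, hidden dimensions, dummy batches), none of which come from the article or from GNMT; it is a simplified illustration of the architecture, not a production system.

```python
# Minimal encoder–decoder NMT sketch (assumed: PyTorch, GRU cells, toy sizes).
import torch
import torch.nn as nn

class Encoder(nn.Module):
    """Bidirectional GRU that reads the source sentence."""
    def __init__(self, src_vocab, emb_dim=256, hid_dim=512):
        super().__init__()
        self.embed = nn.Embedding(src_vocab, emb_dim)
        self.rnn = nn.GRU(emb_dim, hid_dim, bidirectional=True, batch_first=True)
        self.bridge = nn.Linear(2 * hid_dim, hid_dim)  # merge both directions

    def forward(self, src):                        # src: (batch, src_len)
        _, hidden = self.rnn(self.embed(src))      # hidden: (2, batch, hid_dim)
        summary = torch.cat([hidden[0], hidden[1]], dim=-1)
        return torch.tanh(self.bridge(summary))    # (batch, hid_dim)

class Decoder(nn.Module):
    """GRU that predicts target words, conditioned on the source encoding."""
    def __init__(self, tgt_vocab, emb_dim=256, hid_dim=512):
        super().__init__()
        self.embed = nn.Embedding(tgt_vocab, emb_dim)
        self.rnn = nn.GRU(emb_dim, hid_dim, batch_first=True)
        self.out = nn.Linear(hid_dim, tgt_vocab)

    def forward(self, tgt_in, init_state):         # tgt_in: (batch, tgt_len)
        outputs, _ = self.rnn(self.embed(tgt_in), init_state.unsqueeze(0))
        return self.out(outputs)                   # (batch, tgt_len, tgt_vocab)

# Joint (end-to-end) training step: one loss, gradients reach both RNNs.
src_vocab, tgt_vocab = 8000, 8000                  # assumed toy vocabulary sizes
encoder, decoder = Encoder(src_vocab), Decoder(tgt_vocab)
optimizer = torch.optim.Adam(list(encoder.parameters()) + list(decoder.parameters()))
criterion = nn.CrossEntropyLoss()

src = torch.randint(0, src_vocab, (32, 20))        # dummy batch of source sentences
tgt = torch.randint(0, tgt_vocab, (32, 22))        # dummy batch of target sentences

logits = decoder(tgt[:, :-1], encoder(src))        # predict each next target word
loss = criterion(logits.reshape(-1, tgt_vocab), tgt[:, 1:].reshape(-1))
loss.backward()
optimizer.step()
```

Modern systems such as GNMT extend this basic scheme with attention mechanisms, deeper stacks of layers, and subword vocabularies, but the division of labor between an encoder that reads the source and a decoder that generates the target is the same.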