Google Neural Machine Translation Explained
Google Neural Machine Translation (GNMT) was a neural machine translation (NMT) system developed by Google and introduced in November 2016 that used an artificial neural network to increase fluency and accuracy in Google Translate.[1] The network consisted of two main blocks, an encoder and a decoder, both of LSTM architecture with eight 1024-wide layers each and a simple 1-layer 1024-wide feedforward attention mechanism connecting them.[2] The total number of parameters has been variously described as over 160 million,[3] approximately 210 million,[4] 278 million[5] or 380 million.[6] It used a WordPiece tokenizer and a beam search decoding strategy, and it ran on Tensor Processing Units.
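The stack described above can be sketched compactly. The following is a minimal, illustrative PyTorch sketch of an encoder and decoder made of stacked LSTMs joined by a small feedforward attention layer, using the layer count and width quoted above; the class and variable names, the vocabulary size, and the exact attention formulation are assumptions for illustration, not the production GNMT code (which additionally used a bidirectional bottom encoder layer, residual connections, and other refinements).

```python
# Minimal sketch only: stacked-LSTM encoder/decoder joined by a 1-layer
# feedforward attention, with the widths quoted above. Names, vocabulary size
# and attention details are illustrative assumptions, not Google's code.
import torch
import torch.nn as nn

VOCAB, WIDTH, LAYERS = 32000, 1024, 8   # vocabulary size is assumed

class Seq2SeqWithAttention(nn.Module):
    def __init__(self):
        super().__init__()
        self.src_embed = nn.Embedding(VOCAB, WIDTH)
        self.tgt_embed = nn.Embedding(VOCAB, WIDTH)
        self.encoder = nn.LSTM(WIDTH, WIDTH, num_layers=LAYERS, batch_first=True)
        self.decoder = nn.LSTM(WIDTH, WIDTH, num_layers=LAYERS, batch_first=True)
        # One hidden layer scoring each (decoder state, encoder state) pair.
        self.attn = nn.Sequential(nn.Linear(2 * WIDTH, WIDTH), nn.Tanh(),
                                  nn.Linear(WIDTH, 1))
        self.out = nn.Linear(2 * WIDTH, VOCAB)

    def forward(self, src_ids, tgt_ids):
        enc, _ = self.encoder(self.src_embed(src_ids))   # (B, S, W)
        dec, _ = self.decoder(self.tgt_embed(tgt_ids))   # (B, T, W)
        S, T = enc.size(1), dec.size(1)
        pairs = torch.cat([dec.unsqueeze(2).expand(-1, -1, S, -1),
                           enc.unsqueeze(1).expand(-1, T, -1, -1)], dim=-1)
        weights = torch.softmax(self.attn(pairs).squeeze(-1), dim=-1)  # (B, T, S)
        context = torch.einsum("bts,bsw->btw", weights, enc)           # (B, T, W)
        return self.out(torch.cat([dec, context], dim=-1))             # (B, T, VOCAB)

# Tiny smoke test: one 5-token source sentence and a 4-token target prefix.
model = Seq2SeqWithAttention()
logits = model(torch.randint(0, VOCAB, (1, 5)), torch.randint(0, VOCAB, (1, 4)))
print(logits.shape)  # torch.Size([1, 4, 32000])
```

Instantiated at these widths, even this sketch has a few hundred million parameters, consistent with the range of totals quoted above; shrink WIDTH, LAYERS and VOCAB for a quick test.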
By 2020, the system had been replaced by another deep learning system based on a Transformer encoder and an RNN decoder.[7]
GNMT improved the quality of translation by applying an example-based machine translation (EBMT) method in which the system learns from millions of examples of translated sentences. GNMT's proposed system-learning architecture was first tested on over a hundred languages supported by Google Translate. With the large end-to-end framework, the system learned over time to create better, more natural translations. GNMT attempted to translate whole sentences at a time, rather than just piece by piece, and the network could undertake interlingual machine translation by encoding the semantics of the sentence rather than by memorizing phrase-to-phrase translations.[8]
History
The Google Brain project was established in 2011 in the "secretive Google X research lab" by Google Fellow Jeff Dean, Google researcher Greg Corrado, and Stanford University computer science professor Andrew Ng.[9][10][11] Ng's work has led to some of the biggest breakthroughs at Google and Stanford.[12]
The Google Neural Machine Translation system (GNMT) was introduced in November 2016. Google Translate then began using NMT in preference to the statistical machine translation (SMT) methods[13][14] it had used since October 2007, when it switched to its own proprietary, in-house SMT technology.[15][16]
Training GNMT was a major computational effort at the time, taking, by a 2021 OpenAI estimate, on the order of 100 PFLOP/s-days (roughly 10^22 FLOPs) of compute, which was 1.5 orders of magnitude larger than the 2014 seq2seq model[17] (but about 2x smaller than GPT-J-6B in 2021[18]).
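For scale, the conversion behind that figure is a simple unit change (a sanity check on the estimate quoted above, not an independent measurement):

```python
# Convert the quoted training-compute estimate from PFLOP/s-days to raw FLOPs.
pflop_s_days = 100                     # figure quoted above
flops = pflop_s_days * 1e15 * 86_400   # 10^15 FLOP/s sustained for one day
print(f"{flops:.2e} FLOPs")            # ~8.64e+21, i.e. on the order of 10^22
```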
Google Translate's NMT system used a large artificial neural network capable of deep learning. By using millions of examples, GNMT improved the quality of translation, using broader context to deduce the most relevant translation; the result was then rearranged and adapted to approach grammatically natural human language. The architecture was first tested on over a hundred of the languages supported by Google Translate. GNMT did not create its own universal interlingua, but rather aimed at finding the commonality between many languages, using insights from psychology and linguistics. The new translation engine was first enabled for eight languages in November 2016: to and from English and French, German, Spanish, Portuguese, Chinese, Japanese, Korean and Turkish.[19] In March 2017, three additional languages were enabled: Russian, Hindi and Vietnamese; support for Thai was added later.[20][21] Support for Hebrew and Arabic was also added that month, with help from the Google Translate Community.[22] In mid-April 2017, Google Netherlands announced support for Dutch and for other European languages related to English.[23] Support for nine Indian languages followed at the end of April 2017: Hindi, Bengali, Marathi, Gujarati, Punjabi, Tamil, Telugu, Malayalam and Kannada.[24]
By 2020, Google had changed its methodology, replacing GNMT with a different deep learning system based on a Transformer encoder and an RNN decoder.[25]
Evaluation
The GNMT system was said to represent an improvement over the former Google Translate in that it could handle "zero-shot translation", that is, it could translate one language directly into another (for example, Japanese to Korean). Google Translate previously first translated the source language into English and then translated the English into the target language, rather than translating directly from one language to another.[8]
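In a multilingual system of this kind, a zero-shot request is typically expressed by telling a single shared model which language to produce, for example by prepending a target-language token to the source sentence. The sketch below shows only the idea; the "<2xx>" token convention and the helper function are illustrative assumptions, not Google's production interface.

```python
# Illustrative only: expressing a zero-shot direction for a shared multilingual
# encoder-decoder model by naming the target language in the input itself.
def make_model_input(source_sentence: str, target_lang: str) -> str:
    """Prepend a token telling the shared model which language to produce."""
    return f"<2{target_lang}> {source_sentence}"

# Japanese -> Korean without any direct ja-ko training pairs: a model trained on
# ja<->en and ko<->en data can bridge the unseen pair through its shared,
# interlingua-like sentence encoding.
print(make_model_input("今日は天気がいいですね。", "ko"))
# <2ko> 今日は天気がいいですね。
```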
A July 2019 study in Annals of Internal Medicine found that "Google Translate is a viable, accurate tool for translating non–English-language trials". Only one disagreement between reviewers reading machine-translated trials was due to a translation error. Since many medical studies are excluded from systematic reviews because the reviewers do not understand the language, GNMT has the potential to reduce bias and improve accuracy in such reviews.[26]
Languages supported by GNMT
As of December 2021, GNMT supported all of the languages available in Google Translate, with Latin being the most recent addition.
- Afrikaans
- Albanian
- Amharic
- Arabic
- Armenian
- Azerbaijani
- Basque
- Belarusian
- Bengali
- Bosnian
- Bulgarian
- Burmese
- Catalan
- Cebuano
- Chewa
- Chinese (Simplified)
- Chinese (Traditional)
- Corsican
- Croatian
- Czech
- Danish
- Dutch
- English
- Esperanto
- Estonian
- Filipino (Tagalog)
- Finnish
- French
- Galician
- Georgian
- German
- Greek
- Gujarati
- Haitian Creole
- Hausa
- Hawaiian
- Hebrew
- Hindi
- Hmong
- Hungarian
- Icelandic
- Igbo
- Indonesian
- Irish
- Italian
- Japanese
- Javanese
- Kannada
- Kazakh
- Khmer
- Kinyarwanda
- Korean
- Kurdish (Kurmanji)
- Kyrgyz
- Lao
- Latin
- Latvian
- Lithuanian
- Luxembourgish
- Macedonian
- Malagasy
- Malay
- Malayalam
- Maltese
- Maori
- Marathi
- Mongolian
- Nepali
- Norwegian (Bokmål)
- Odia
- Pashto
- Persian
- Polish
- Portuguese
- Punjabi (Gurmukhi)
- Romanian
- Russian
- Samoan
- Scottish Gaelic
- Serbian
- Shona
- Sindhi
- Sinhala
- Slovak
- Slovenian
- Somali
- Sotho
- Spanish
- Sundanese
- Swahili
- Swedish
- Tajik
- Tamil
- Tatar
- Telugu
- Thai
- Turkish
- Turkmen
- Ukrainian
- Urdu
- Uyghur
- Uzbek
- Vietnamese
- Welsh
- West Frisian
- Xhosa
- Yiddish
- Yoruba
- Zulu
Notes and References
- Wu, Yonghui; Schuster, Mike; Chen, Zhifeng; Le, Quoc V.; Norouzi, Mohammad; et al. (2016). "Google's Neural Machine Translation System: Bridging the Gap between Human and Machine Translation". arXiv:1609.08144.
- "Peeking into the neural network architecture used for Google's Neural Machine Translation".
- Qin, Minghai; Zhang, Tianyun; Sun, Fei; Chen, Yen-Kuang; Fardad, Makan; Wang, Yanzhi; Xie, Yuan (2021). "Compact Multi-level Sparse Neural Networks with Input Independent Dynamic Rerouting". arXiv:2112.10930 [cs.NE].
- "Compression of Google Neural Machine Translation Model". NLP Architect by Intel AI Lab 0.5.5 documentation.
- Langroudi, Hamed F.; Karia, Vedant; Pandit, Tej; Kudithipudi, Dhireesha (2021). "TENT: Efficient Quantization of Neural Networks on the tiny Edge with Tapered FixEd PoiNT". arXiv:2104.02233 [cs.LG].
- "Data Augmentation | How to use Deep Learning when you have Limited Data". Retrieved May 19, 2021.
- "Recent Advances in Google Translate". research.google. Retrieved May 8, 2024.
- Boitet, Christian; Blanchon, Hervé; Seligman, Mark; Bellynck, Valérie (2010). "MT on and for the Web". Retrieved December 1, 2016. Archived from the original (dead link) on March 29, 2017: https://web.archive.org/web/20170329125916/http://www-clips.imag.fr/geta/herve.blanchon/Pdfs/NLP-KE-10.pdf
- Dean, Jeff; Ng, Andrew (June 26, 2012). "Using large-scale brain simulations for machine learning and A.I.". Official Google Blog. Retrieved January 26, 2015.
- "Google's Large Scale Deep Neural Networks Project". Retrieved October 25, 2015.
- Markoff, John (June 25, 2012). "How Many Computers to Identify a Cat? 16,000". The New York Times. Retrieved February 11, 2014.
- Hof, Robert D. (August 14, 2014). "A Chinese Internet Giant Starts to Dream: Baidu is a fixture of online life in China, but it wants to become a global power. Can one of the world's leading artificial intelligence researchers help it challenge Silicon Valley's biggest companies?". Technology Review. Retrieved January 11, 2017.
- Lewis-Kraus, Gideon (December 14, 2016). "The Great A.I. Awakening". The New York Times. Retrieved January 11, 2017.
- Le, Quoc; Schuster, Mike (September 27, 2016). "A Neural Network for Machine Translation, at Production Scale". Google Research Blog. Retrieved December 1, 2016.
- "Google Switches to its Own Translation System". http://googlesystem.blogspot.com/2007/10/google-translate-switches-to-googles.html
- Schwartz, Barry (October 23, 2007). "Google Translate Drops SYSTRAN for Home-Brewed Translation". Search Engine Land.
- "AI and compute".
- "Table of contents".
- Turovsky, Barak (November 15, 2016). "Found in translation: More accurate, fluent sentences in Google Translate". The Keyword Google Blog. Retrieved December 1, 2016.
- Perez, Sarah (March 6, 2017). "Google's smarter, A.I.-powered translation system expands to more languages". TechCrunch. Oath Inc.
- Turovsky, Barak (March 6, 2017). "Higher quality neural translations for a bunch more languages". The Keyword Google Blog. Retrieved March 6, 2017.
- Novet, Jordan (March 30, 2017). "Google now provides AI-powered translations for Arabic and Hebrew". VentureBeat.
- Finge, Rachid (April 19, 2017). "Grote verbetering voor het Nederlands in Google Translate" [Big improvement for Dutch in Google Translate]. Google Netherlands Blog (in Dutch).
- Turovsky, Barak (April 25, 2017). "Making the internet more inclusive in India". The Keyword.
- "Recent Advances in Google Translate". research.google. Retrieved May 8, 2024.
- Jackson, Jeffrey L.; Kuriyama, Akira; Anton, Andreea; Choi, April; Fournier, Jean-Pascal; Geier, Anne-Kathrin; Jacquerioz, Frederique; Kogan, Dmitry; Scholcoff, Cecilia; Sun, Rao (July 30, 2019). "The Accuracy of Google Translate for Abstracting Data From Non–English-Language Trials for Systematic Reviews". Annals of Internal Medicine. 171 (9): 678. doi:10.7326/M19-0891. PMID 31357212.