Google’s AI research continues to push machine learning in new directions, and this week the Google Brain team’s neural machine translation system appears to have done something that would have been largely impossible not long ago. According to recent reports, the Google AI has learned enough about language to create its own “universal language” — an internal linguistic representation of similar phrases and sentences called an “interlingua.”
It is important to note that this is not something the researchers explicitly programmed or taught. It emerged from the way the network learned to represent the sentences it was trained on. The researchers comment, “This means the network must be encoding something about the semantics of the sentence rather than simply memorizing phrase-to-phrase translations. We interpret this as a sign of existence of an interlingua in the network.”
In one of the experiments, the researchers combined 12 language pairs into a single model the same size as a model trained on just one pair. Even with that sharply reduced per-language capacity, they found the model achieved “only slightly lower translation quality” than a dedicated two-language model. Thus, they explain, “Our approach has been shown to work reliably in a Google-scale production setting and enables us to scale to a large number of languages quickly.”
It is also worth noting that this result came only after years of work on neural networks for language. Still, the progress is rapid, implying more growth to come. The researchers comment, “To our knowledge, this is the first demonstration of true multilingual zero-shot translation.”
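The mechanism that makes one shared model serve many language pairs — and makes zero-shot translation expressible at all — is an artificial token prepended to the source sentence telling the model which language to produce. The sketch below illustrates the idea; the `<2xx>` token format and the specific language pairs are illustrative assumptions, not the exact production setup:

```python
def tag_source(sentence: str, target_lang: str) -> str:
    """Prepend an artificial target-language token so a single shared
    model can serve many language pairs. The "<2xx>" token format is
    an illustrative assumption, not the exact production token."""
    return f"<2{target_lang}> {sentence}"

# Pairs a hypothetical model was trained on:
trained_pairs = {("ja", "en"), ("en", "ja"), ("ko", "en"), ("en", "ko")}

# Every request, trained or not, is expressed the same way, so a
# Japanese -> Korean request is well-formed even though that pair
# never appeared in training. This is what makes attempting
# zero-shot translation possible in the first place.
zero_shot_request = tag_source("こんにちは", "ko")
print(zero_shot_request)                # <2ko> こんにちは
print(("ja", "ko") in trained_pairs)    # False: an unseen, zero-shot pair
```

Whether the model then produces a good Korean translation is exactly what the researchers’ zero-shot experiments measured.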
Basically, the Google Brain interlingua draws on what the network has learned across all of its training languages to spot commonalities between phrases, letting it infer translations it was never explicitly taught. In essence, it encodes meaning rather than memorizing verbatim phrase-to-phrase mappings.
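One way to picture “encoding semantics”: sentences that mean the same thing, even in different languages, should land near each other in the network’s internal representation space. The sketch below uses invented three-dimensional vectors purely for illustration — real encoder states are high-dimensional and come from the trained network:

```python
import math

# Invented toy "encoder states" standing in for the network's internal
# representations; the numbers are made up for illustration only.
reps = {
    "en: the cat sleeps":    [0.90, 0.10, 0.20],
    "ja: 猫が眠る":           [0.88, 0.12, 0.18],  # same meaning, different language
    "en: stock prices fell": [0.10, 0.90, 0.40],  # different meaning
}

def cosine(a, b):
    """Cosine similarity: 1.0 means the vectors point the same way."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return dot / norm

same_meaning = cosine(reps["en: the cat sleeps"], reps["ja: 猫が眠る"])
diff_meaning = cosine(reps["en: the cat sleeps"], reps["en: stock prices fell"])
print(same_meaning > diff_meaning)  # True: equivalent sentences cluster together
```

Evidence of exactly this kind of clustering across languages is what the researchers interpret as the interlingua.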
Theoretically, then, Google Brain’s system can translate between pairs of its languages it was never explicitly trained on, with “reasonable” results. For now that capability is modest, but in time it could prove quite a formidable tool.
The researchers explain: “The described Multilingual Google Neural Machine Translation system is running in production today for all Google Translate users. Multilingual systems are currently used to serve 10 of the recently launched 16 language pairs, resulting in improved quality and a simplified production architecture.”