How AI is changing the way we talk to each other

Machine translation may be the key to preserving human language and culture.

Neural networks have a secret power that is advancing the way machines translate and process languages: They understand metaphors.

"All human languages are metaphor-based," says Senthil Gandhi, a principal scientist in the AI and data practice at HPE Pointnext Services. "Humans can never truly visualize time. It's not possible. But we talk about time all the time." We can do that, he says, because we use metaphors, comparing time to money with phrases such as "Don't waste my time" or "I'm strapped for time."

Gandhi calls neural networks metaphor engines. And while they can't yet match the linguistic flexibility of the human brain, they may soon be able to.

Machine translation offers the tantalizing prospect of fostering communication across language and cultural barriers. And being able to access the hive mind of the Internet by asking a question of a voice assistant is helping to open the world to preliterate children and people with disabilities that prevent them from reading.


Yet, there's much more to the task of machine translation than building a smarter AI. An immense amount of human labor and capital will be needed to create comprehensive libraries to train models that capture the diversity and nuance of human languages. Plus, use cases that aren't easily monetized, such as training AI to understand people with disabilities that alter their speech patterns, may not get the attention they need.

In an example of machine translation's potential, Gandhi and his gardener don't share a common language but can communicate in English and Spanish thanks to Google Translate. "For me, it's a nice thing to have Google Translate," he says. "But it could completely change my gardener's life."

A compelling case for machine translation

Perhaps the most urgent use case for machine translation is to give non-English speakers a way to transact in an English-dominant world. A recent study of a vocational English language program in Boston found that people who completed the course earned an average of $2,621 more in the first year, and their incomes continued to grow after that.

However, it's not always easy to learn a new language. Research that compared the grammar-learning skills of 669,000 native and non-native English speakers determined that our ability to learn a new language drops continuously after age 17. Someone who tries to pick up English later in life may never be comfortable in their second language. For that person, machine translation could become an economic lifeline.

Translation software that allows people with little or no English to navigate jobs and business in English could significantly improve their earning potential. "There may be millions of people whose lives can be drastically changed by machine translation," Gandhi says.

Language barriers

Unfortunately, translation libraries are limited for languages with fewer speakers.

"The fundamental way these algorithms work is the more data, the better," Gandhi says. "And there is more data where there is more money." So the resources for machine translation focus on the more lucrative languages, starting with English, then Chinese and Spanish. Because of the size of the dataset needed to train the models, he worries that it could be decades before a language like his native Tamil is well-translated.

Nuance is where algorithms fall short, and you need human input, which is costly in both time and money. In Tamil, the first translators were missionaries. "They missed a lot of nuances," Gandhi says. "If you don't even have the word for it in your language, where do you go from there?"

As an example, he suggests the word snow. Tamil developed near the equator, "so there is no word for snow in my language," he says. Tamil speakers use the word for dew to mean snow, but something gets lost in translation. Contrast that with languages such as Inuit and Icelandic, which have hundreds of words for different kinds of snow. AI can get stuck when it tries to translate these concepts across cultures.


Gandhi points out that the Tamil language began to evolve as a separate language as much as 5,000 years ago. "A language carries the experience of a people for 5,000 years," he says. "It's embedded in the words they use. That is getting lost at a rapid pace."

Another major roadblock, at least with current technology, is energy. About a year ago, Gandhi ran a rough calculation of the energy it would take to build a neural network that could match the functioning of the human brain. Our brains contain about 100 billion neurons; the neural network he used as a reference point had about 200 million parameters. The energy required to make the leap to approximate the power of a human brain would, with today's hardware, be unfathomable. That said, while machine translation will never be perfect, it will continue to improve as algorithms mature and nuance becomes embedded in language datasets.
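To put the two figures above side by side, here is a back-of-the-envelope sketch using only the numbers quoted in this article. Note that comparing neurons to parameters is a loose analogy, not a rigorous equivalence:

```python
# Rough scale comparison using the figures cited above:
# ~100 billion neurons in a human brain vs. ~200 million
# parameters in the network used as a reference point.
brain_neurons = 100_000_000_000   # ~1e11
network_params = 200_000_000      # ~2e8

gap = brain_neurons / network_params
print(f"Unit-count gap: {gap:,.0f}x")  # prints "Unit-count gap: 500x"
```

Even this 500x gap in raw unit count understates the challenge, since training cost tends to grow faster than parameter count alone.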

Using NLP to preserve language and culture

"We are getting to a point where the Internet is becoming very monolingual," says Vukosi Marivate, a senior lecturer in the computer science department at the University of Pretoria. Marivate is one of the founders of Masakhane, described as "a grassroots NLP community for Africa, by Africans." Masakhane creates natural language processing tools that are open source, reducing barriers to entry and allowing more people to participate. Since its founding in 2019, the project has grown to more than 1,000 contributors in 35 African countries.

Masakhane contributors are working to put source and target languages into the pipeline and train the models to process various African languages. This is an enormous task: Africa has 2,000 languages, many with limited translation libraries to train the models.

"If we don't build these tools and build these models, it's going to get to the point where it's like these languages never existed," Marivate says. He points out that we lose much more than words when a language is lost because languages express the history and culture of the people who developed them.

Sensitivity to the task of translating culture as well as language is central to the Masakhane project. Marivate shares an African saying—"This cannot be translated"—to explain that sometimes you need to be in the culture before you can understand the language. So Masakhane's contributors include linguists and sociologists as well as data scientists to help capture the cultural significance embedded in the language.

There will probably never be a business case for a company like Google to throw its immense resources into creating translation libraries for smaller, less lucrative languages, so Gandhi suggests that funding to preserve and uplift this linguistic heritage should come from grants. Masakhane has leveraged grants to fund several projects, including building a translation tool for six African languages, he says, but much more is needed.

Will machine translation ever match human language capabilities?

For now, NLP is an assistant to, not a replacement for, human translators. A recent example of machine translation's shortcomings is the uproar over inaccurate subtitles on Netflix's most popular title ever, the South Korean series "Squid Game." Netflix had teamed up with Virginia Tech to develop automatic preprocessing that improves black-box machine translation. However, the language simplification that makes it easier for AI to generate subtitles can also remove the context you need to understand a story from a different culture.

Gandhi points to another limitation in the AI assistants that share his home: They have a hard time understanding his Tamil-accented English. He must speak sternly to get Google Home or Alexa to understand his commands.

"If you talk to a human like that, it's very rude," he says. Worse, his toddler learned by observation that yelling at the AI assistant was the best way to get an answer. Fortunately, Gandhi isn't worried that giving orders to Google will harm his son's communication skills, because the child quickly learned that he couldn't speak that way to humans.

That's because the human brain, even in children, can handle the complexity of speaking differently to humans versus AI. Researchers in Seattle confirmed this with an experiment that taught children that saying the made-up word bungo would speed up the speech of a simulated AI. In follow-up interactions, the children demonstrated that they understood that humans wouldn't have the same reaction to bungo as the machine did.

"It's very interesting to see the evolution of the relationship between a child and an AI," Gandhi says. "At first, he treated it just as a machine, but then he quickly got that it's a source of truth."

Humans are already equipped to get the most out of voice AI without being harmed by it. And Gandhi thinks that, despite huge advances in the past decade, machine translation and NLP are still in the equivalent of the steam age. That means we have the jet age to look forward to, he says, with the promise of faster and more adept machine translation. It is already changing how we speak to one another, and it stands to be a critical piece in the preservation of the thousands of ways of communicating that humans have developed over millennia.

This article/content was written by the individual writer identified and does not necessarily reflect the view of Hewlett Packard Enterprise Company.