
Google translation tools are creating a legal mess

21 April 2021


We did warn you

Translation tools from Google and other companies could be contributing to significant misunderstandings of legal terms, according to research due to be presented at an academic workshop.

For a while now we have been pointing out that firms are relying too much on AI-based translation tools in the mistaken belief that they are fit for purpose. We have said that it will take a few of them being hauled before the courts before they realise that the few cents they save are not worth the piss-poor and inaccurate translations they are getting.

However, now the academics are warning that Google Translate is producing howlers that could cause some real problems, mostly because it does not understand legal terms.

For example, Google Translate turned an English sentence about a court enjoining violence, or banning it, into one in the Indian language Kannada that implied the court ordered violence. "Enjoin" can refer to either promoting or restraining an action. Mistranslations also arise with other contronyms, or words with contradictory meanings depending on context, including "all over", "eventual" and "garnish", the paper said.
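One way to guard against this sort of howler is to screen source text for known contronyms before trusting the machine output. The sketch below is our own illustration, not anything Google ships: the word list and the flag_contronyms helper are assumptions, and the senses are paraphrased.

```python
# A minimal sketch of pre-flight screening for contronyms before machine
# translation. The word list and helper are illustrative assumptions,
# not part of any real translation API.
import re

# Contronyms named in the paper, each with two opposing senses (paraphrased).
CONTRONYMS = {
    "enjoin": ("to direct or order an action", "to prohibit an action"),
    "all over": ("everywhere", "finished"),
    "eventual": ("final, in the end", "possible or occasional (a false friend in some languages)"),
    "garnish": ("to decorate", "to seize wages by court order"),
}

def flag_contronyms(sentence: str) -> list[str]:
    """Return the contronyms found in a sentence so a human can check
    that the machine translation picked the intended sense."""
    lowered = sentence.lower()
    return [word for word in CONTRONYMS
            if re.search(rf"\b{re.escape(word)}\b", lowered)]

if __name__ == "__main__":
    sentence = "The court enjoined the protesters from violence."
    for word in flag_contronyms(sentence):
        senses = " / ".join(CONTRONYMS[word])
        print(f"Review '{word}' ({senses}) before trusting the output.")
```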

To be fair, Google markets its machine translation as "a complement to specialised professional translation" and says it is "continually researching improvements, from better handling ambiguous language, to mitigating bias, to making large quality gains for under-resourced languages".

However, that has not stopped companies and some translation agencies from creating a cut-price product called a "machine translation", which is essentially someone taking Google Translate or DeepL output and making it readable. The price is about half that of a proper translation, so companies are falling for it.

The study's findings add to the scrutiny of automated translations generated by artificial intelligence software.

Researchers have previously found that programs which learn translations by studying non-diverse text perpetuate historical gender biases, such as associating "doctor" with "he".

The new paper raises concerns about a popular method companies use to broaden the vocabulary of their translation software: they translate foreign text into English and then back into the foreign language, aiming to teach the software to associate different ways of saying the same phrase.
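For illustration, here is a minimal sketch of that round-trip idea. The translate stub is a hypothetical stand-in for whatever machine-translation call is used, not a real Google API, and the failure mode flagged in the comments is our reading of the paper's concern.

```python
# A minimal sketch of round-trip ("back-translation") data augmentation as
# described above: foreign text is translated to English and back, and the
# resulting pair is treated as two ways of saying the same thing.

def translate(text: str, source: str, target: str) -> str:
    """Hypothetical stand-in for a real machine-translation call.
    Here it simply echoes its input so the sketch runs end to end."""
    return text

def back_translate(sentence: str, language: str) -> tuple[str, str]:
    """Round-trip one sentence through English and pair it with the result."""
    english = translate(sentence, source=language, target="en")
    round_trip = translate(english, source="en", target=language)
    return sentence, round_trip

corpus = ["...foreign-language sentences..."]  # placeholder training text
# Each (original, round_trip) pair becomes extra training data. If the
# English leg picks the wrong sense of a contronym such as "enjoin",
# the round trip bakes that wrong sense into the training pair.
augmented = [back_translate(s, language="kn") for s in corpus]
```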

 
